==> Building on centiskorch ==> Checking for remote environment... ==> Syncing package to remote host... sending incremental file list created directory packages/solr ./ .SRCINFO 4,276 100% 0.00kB/s 0:00:00 4,276 100% 0.00kB/s 0:00:00 (xfr#1, to-chk=28/30) PKGBUILD 7,488 100% 7.14MB/s 0:00:00 7,488 100% 7.14MB/s 0:00:00 (xfr#2, to-chk=27/30) allow-using-system-gradle.patch 1,429 100% 1.36MB/s 0:00:00 1,429 100% 1.36MB/s 0:00:00 (xfr#3, to-chk=26/30) skip-checks-for-git-repo.patch 1,539 100% 1.47MB/s 0:00:00 1,539 100% 1.47MB/s 0:00:00 (xfr#4, to-chk=25/30) skip-failing-test.patch 1,047 100% 1022.46kB/s 0:00:00 1,047 100% 1022.46kB/s 0:00:00 (xfr#5, to-chk=24/30) solr-9.4.1-1.log 766 100% 748.05kB/s 0:00:00 766 100% 748.05kB/s 0:00:00 (xfr#6, to-chk=23/30) solr.service 962 100% 939.45kB/s 0:00:00 962 100% 939.45kB/s 0:00:00 (xfr#7, to-chk=22/30) solr.sysusers 32 100% 31.25kB/s 0:00:00 32 100% 31.25kB/s 0:00:00 (xfr#8, to-chk=21/30) solr.tmpfiles 161 100% 157.23kB/s 0:00:00 161 100% 157.23kB/s 0:00:00 (xfr#9, to-chk=20/30) support-reproducible-builds.patch 1,668 100% 1.59MB/s 0:00:00 1,668 100% 1.59MB/s 0:00:00 (xfr#10, to-chk=19/30) keys/ keys/pgp/ keys/pgp/2085660D9C1FCCACC4A479A3BF160FF14992A24C.asc 3,078 100% 2.94MB/s 0:00:00 3,078 100% 2.94MB/s 0:00:00 (xfr#11, to-chk=16/30) keys/pgp/2289AC4180E130507D7A82F479C211E0AEFCA72E.asc 5,592 100% 5.33MB/s 0:00:00 5,592 100% 5.33MB/s 0:00:00 (xfr#12, to-chk=15/30) keys/pgp/3558857D1F5754B78C7F8B5A71A45A3D0D8D0B93.asc 1,672 100% 1.59MB/s 0:00:00 1,672 100% 1.59MB/s 0:00:00 (xfr#13, to-chk=14/30) keys/pgp/38DA0C3CE8181703A08E4D57377C3BA26AD29C0A.asc 3,139 100% 2.99MB/s 0:00:00 3,139 100% 2.99MB/s 0:00:00 (xfr#14, to-chk=13/30) keys/pgp/50E3EE1C91C7E0CB4DFB007B369424FC98F3F6EC.asc 3,171 100% 3.02MB/s 0:00:00 3,171 100% 3.02MB/s 0:00:00 (xfr#15, to-chk=12/30) keys/pgp/515EA995ED1DD799FA1422E5CA7514D8385D790B.asc 3,171 100% 3.02MB/s 0:00:00 3,171 100% 3.02MB/s 0:00:00 (xfr#16, to-chk=11/30) keys/pgp/7D8D90F8F64F23077AC87CF7CB77CB79928BB0EC.asc 1,302 100% 635.74kB/s 0:00:00 1,302 100% 635.74kB/s 0:00:00 (xfr#17, to-chk=10/30) keys/pgp/81D3EB0408B4E1EB10AF443BA4F4C886B29BC2F4.asc 3,163 100% 1.51MB/s 0:00:00 3,163 100% 1.51MB/s 0:00:00 (xfr#18, to-chk=9/30) keys/pgp/86EDB9C33B8517228E88A8F93E48C0C6EF362B9E.asc 3,090 100% 1.47MB/s 0:00:00 3,090 100% 1.47MB/s 0:00:00 (xfr#19, to-chk=8/30) keys/pgp/902CC51935C140BF820230961FD5295281436075.asc 3,167 100% 1.51MB/s 0:00:00 3,167 100% 1.51MB/s 0:00:00 (xfr#20, to-chk=7/30) keys/pgp/9722F25F650057E26C803B60A6D064D833B3A969.asc 3,159 100% 1.51MB/s 0:00:00 3,159 100% 1.51MB/s 0:00:00 (xfr#21, to-chk=6/30) keys/pgp/C3E7CBD9B9FE2B419EB14B89612B39A5BC981763.asc 2,521 100% 1.20MB/s 0:00:00 2,521 100% 1.20MB/s 0:00:00 (xfr#22, to-chk=5/30) keys/pgp/CFCE5FBB920C3C745CEEE084C38FF5EC3FCFDB3E.asc 3,090 100% 1.47MB/s 0:00:00 3,090 100% 1.47MB/s 0:00:00 (xfr#23, to-chk=4/30) keys/pgp/E58A6F4D5B2B48AC66D5E53BD4F181881A42F9E6.asc 3,155 100% 1.50MB/s 0:00:00 3,155 100% 1.50MB/s 0:00:00 (xfr#24, to-chk=3/30) keys/pgp/E6E21FFCDCEA14C95910EA65051A0FAF76BC6507.asc 3,094 100% 1007.16kB/s 0:00:00 3,094 100% 1007.16kB/s 0:00:00 (xfr#25, to-chk=2/30) keys/pgp/FBC25D7E1712025294FE66590A6AA179B9BBF45E.asc 3,163 100% 1.01MB/s 0:00:00 3,163 100% 1.01MB/s 0:00:00 (xfr#26, to-chk=1/30) keys/pgp/FDB3D11D716BB32ACF8C93AB919B21537AA80271.asc 3,167 100% 1.01MB/s 0:00:00 3,167 100% 1.01MB/s 0:00:00 (xfr#27, to-chk=0/30) sent 50,688 bytes received 584 bytes 102,544.00 bytes/sec total size is 70,694 speedup is 1.38 ==> Ensuring required PGP keys 
are present...  -> Checking for 2085660D9C1FCCACC4A479A3BF160FF14992A24C...  -> Checking for E58A6F4D5B2B48AC66D5E53BD4F181881A42F9E6...  -> Checking for 81D3EB0408B4E1EB10AF443BA4F4C886B29BC2F4...  -> Checking for 86EDB9C33B8517228E88A8F93E48C0C6EF362B9E...  -> Checking for 38DA0C3CE8181703A08E4D57377C3BA26AD29C0A...  -> Checking for 50E3EE1C91C7E0CB4DFB007B369424FC98F3F6EC...  -> Checking for 902CC51935C140BF820230961FD5295281436075...  -> Checking for 7D8D90F8F64F23077AC87CF7CB77CB79928BB0EC...  -> Checking for CFCE5FBB920C3C745CEEE084C38FF5EC3FCFDB3E...  -> Checking for FBC25D7E1712025294FE66590A6AA179B9BBF45E...  -> Checking for 9722F25F650057E26C803B60A6D064D833B3A969...  -> Checking for E6E21FFCDCEA14C95910EA65051A0FAF76BC6507...  -> Checking for 3558857D1F5754B78C7F8B5A71A45A3D0D8D0B93...  -> Checking for C3E7CBD9B9FE2B419EB14B89612B39A5BC981763...  -> Checking for FDB3D11D716BB32ACF8C93AB919B21537AA80271...  -> Checking for 2289AC4180E130507D7A82F479C211E0AEFCA72E...  -> Checking for 515EA995ED1DD799FA1422E5CA7514D8385D790B...  -> Importing key from local... ==> Running extra-riscv64-build -- -d /home/felix/packages/riscv64-pkg-cache:/var/cache/pacman/pkg -l root7 on remote host... [?25l:: Synchronizing package databases... core downloading... extra downloading... :: Starting full system upgrade... there is nothing to do [?25h==> Building in chroot for [extra] (riscv64)... ==> Synchronizing chroot copy [/var/lib/archbuild/extra-riscv64/root] -> [root7]...done ==> Making package: solr 9.4.1-1 (Sun Jan 21 15:16:52 2024) ==> Retrieving sources...  -> Downloading solr-9.4.1-src.tgz... % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 85.9M 0 39941 0 0 48539 0 0:30:57 --:--:-- 0:30:57 48530 9 85.9M 9 7999k 0 0 4377k 0 0:00:20 0:00:01 0:00:19 4378k 29 85.9M 29 25.0M 0 0 9072k 0 0:00:09 0:00:02 0:00:07 9072k 49 85.9M 49 42.2M 0 0 10.9M 0 0:00:07 0:00:03 0:00:04 10.9M 69 85.9M 69 59.7M 0 0 12.3M 0 0:00:06 0:00:04 0:00:02 12.3M 88 85.9M 88 76.4M 0 0 13.1M 0 0:00:06 0:00:05 0:00:01 15.2M 100 85.9M 100 85.9M 0 0 13.4M 0 0:00:06 0:00:06 --:--:-- 17.1M  -> Downloading solr-9.4.1-src.tgz.asc... % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 853 100 853 0 0 2020 0 --:--:-- --:--:-- --:--:-- 2021  -> Found solr.service  -> Found solr.sysusers  -> Found solr.tmpfiles  -> Found support-reproducible-builds.patch  -> Found skip-failing-test.patch  -> Found skip-checks-for-git-repo.patch  -> Found allow-using-system-gradle.patch ==> Validating source files with sha512sums... solr-9.4.1-src.tgz ... Passed solr-9.4.1-src.tgz.asc ... Skipped solr.service ... Passed solr.sysusers ... Passed solr.tmpfiles ... Passed support-reproducible-builds.patch ... Passed skip-failing-test.patch ... Passed skip-checks-for-git-repo.patch ... Passed allow-using-system-gradle.patch ... Passed ==> Validating source files with b2sums... solr-9.4.1-src.tgz ... Passed solr-9.4.1-src.tgz.asc ... Skipped solr.service ... Passed solr.sysusers ... Passed solr.tmpfiles ... Passed support-reproducible-builds.patch ... Passed skip-failing-test.patch ... Passed skip-checks-for-git-repo.patch ... Passed allow-using-system-gradle.patch ... Passed ==> Verifying source file signatures with gpg... solr-9.4.1-src.tgz ... 
Passed ==> Making package: solr 9.4.1-1 (Sun Jan 21 15:17:18 2024) ==> Checking runtime dependencies... ==> Installing missing dependencies... [?25lresolving dependencies... looking for conflicting packages... warning: dependency cycle detected: warning: harfbuzz will be installed before its freetype2 dependency Package (15) New Version Net Change Download Size extra/freetype2 2.13.2-1 1.51 MiB extra/giflib 5.2.1-2 0.22 MiB extra/graphite 1:1.3.14-3 0.17 MiB extra/harfbuzz 8.3.0-2 3.68 MiB extra/java-runtime-common 3-5 0.01 MiB extra/jbigkit 2.1-7 0.13 MiB extra/jre11-openjdk-headless 11.0.22.u7-1 138.07 MiB 32.50 MiB extra/lcms2 2.16-1 0.61 MiB extra/libjpeg-turbo 3.0.1-1 1.81 MiB extra/libnet 2:1.3-1 1.46 MiB extra/libpng 1.6.40-2 0.51 MiB extra/libtiff 4.6.0-2 4.95 MiB core/nspr 4.35-2 0.68 MiB core/nss 3.96.1-1 4.27 MiB extra/jre11-openjdk 11.0.22.u7-1 0.49 MiB 0.20 MiB Total Download Size: 32.70 MiB Total Installed Size: 158.57 MiB :: Proceed with installation? [Y/n] :: Retrieving packages... jre11-openjdk-headless-11.0.22.u7-1-riscv64 downloading... jre11-openjdk-11.0.22.u7-1-riscv64 downloading... checking keyring... checking package integrity... loading package files... checking for file conflicts... :: Processing package changes... installing java-runtime-common... For the complete set of Java binaries to be available in your PATH, you need to re-login or source /etc/profile.d/jre.sh Please note that this package does not support forcing JAVA_HOME as former package java-common did installing nspr... installing nss... installing libjpeg-turbo... Optional dependencies for libjpeg-turbo java-runtime>11: for TurboJPEG Java wrapper installing jbigkit... installing libtiff... Optional dependencies for libtiff freeglut: for using tiffgt installing lcms2... installing libnet... installing libpng... installing graphite... Optional dependencies for graphite graphite-docs: Documentation installing harfbuzz... Optional dependencies for harfbuzz harfbuzz-utils: utilities installing freetype2... installing jre11-openjdk-headless... Optional dependencies for jre11-openjdk-headless java-rhino: for some JavaScript support installing giflib... installing jre11-openjdk... when you use a non-reparenting window manager, set _JAVA_AWT_WM_NONREPARENTING=1 in /etc/profile.d/jre.sh Optional dependencies for jre11-openjdk alsa-lib: for basic sound support gtk2: for the Gtk+ 2 look and feel - desktop usage gtk3: for the Gtk+ 3 look and feel - desktop usage [?25h==> Checking buildtime dependencies... ==> Installing missing dependencies... [?25lresolving dependencies... looking for conflicting packages... Package (9) New Version Net Change Download Size extra/hicolor-icon-theme 0.17-3 0.05 MiB extra/java-environment-common 3-5 0.00 MiB extra/perl-error 0.17029-5 0.04 MiB extra/perl-mailtools 2.21-7 0.10 MiB extra/perl-timedate 2.33-5 0.08 MiB extra/git 2.43.0-1 24.90 MiB extra/gradle 8.5-1 140.04 MiB extra/groovy 4.0.15-3 23.94 MiB extra/jdk11-openjdk 11.0.22.u7-1 246.65 MiB 234.28 MiB Total Download Size: 234.28 MiB Total Installed Size: 435.81 MiB :: Proceed with installation? [Y/n] :: Retrieving packages... jdk11-openjdk-11.0.22.u7-1-riscv64 downloading... checking keyring... checking package integrity... loading package files... checking for file conflicts... :: Processing package changes... installing java-environment-common... installing hicolor-icon-theme... installing jdk11-openjdk... installing gradle... 
Optional dependencies for gradle gradle-doc: gradle documentation gradle-src: gradle sources installing perl-error... installing perl-timedate... installing perl-mailtools... installing git... Optional dependencies for git tk: gitk and git gui openssh: ssh transport and crypto perl-libwww: git svn perl-term-readkey: git svn and interactive.singlekey setting perl-io-socket-ssl: git send-email TLS support perl-authen-sasl: git send-email TLS support perl-mediawiki-api: git mediawiki support perl-datetime-format-iso8601: git mediawiki support perl-lwp-protocol-https: git mediawiki https support perl-cgi: gitweb (web interface) support python: git svn & git p4 subversion: git svn org.freedesktop.secrets: keyring credential helper libsecret: libsecret credential helper [installed] installing groovy... Optional dependencies for groovy groovy-docs: html and pdf documentation for Groovy :: Running post-transaction hooks... (1/1) Warn about old perl modules [?25h==> Retrieving sources...  -> Found solr-9.4.1-src.tgz  -> Found solr-9.4.1-src.tgz.asc  -> Found solr.service  -> Found solr.sysusers  -> Found solr.tmpfiles  -> Found support-reproducible-builds.patch  -> Found skip-failing-test.patch  -> Found skip-checks-for-git-repo.patch  -> Found allow-using-system-gradle.patch ==> WARNING: Skipping all source file integrity checks. ==> Extracting sources...  -> Extracting solr-9.4.1-src.tgz with bsdtar ==> Starting prepare()... patching file gradle/validation/precommit.gradle patching file solr/distribution/build.gradle patching file solr/distribution/source-release.gradle patching file gradle/globals.gradle patching file gradle/java/jar-manifest.gradle patching file solr/modules/extraction/src/test/org/apache/solr/handler/extraction/TestXLSXResponseWriter.java ==> Starting build()... Downloading gradle-wrapper.jar from https://raw.githubusercontent.com/gradle/gradle/v7.6.0/gradle/wrapper/gradle-wrapper.jar Generating gradle.properties Downloading https://services.gradle.org/distributions/gradle-7.6-bin.zip ...........10%............20%...........30%............40%............50%...........60%............70%............80%...........90%............100% Welcome to Gradle 7.6! Here are the highlights of this release: - Added support for Java 19. - Introduced `--rerun` flag for individual task rerun. - Improved dependency block for test suites to be strongly typed. - Added a pluggable system for Java toolchains provisioning. 
For more details see https://docs.gradle.org/7.6/release-notes.html Starting a Gradle Daemon (subsequent builds will be faster) > Task :buildSrc:compileJava > Task :buildSrc:compileGroovy NO-SOURCE > Task :buildSrc:processResources NO-SOURCE > Task :buildSrc:classes > Task :buildSrc:jar > Task :buildSrc:assemble > Task :buildSrc:compileTestJava NO-SOURCE > Task :buildSrc:compileTestGroovy NO-SOURCE > Task :buildSrc:processTestResources NO-SOURCE > Task :buildSrc:testClasses UP-TO-DATE > Task :buildSrc:test NO-SOURCE > Task :buildSrc:check UP-TO-DATE > Task :buildSrc:build > Task :solr:solrj-zookeeper:processResources NO-SOURCE > Task :solr:modules:analytics:processResources NO-SOURCE > Task :solr:modules:analysis-extras:processResources NO-SOURCE > Task :solr:prometheus-exporter:processResources NO-SOURCE > Task :solr:modules:clustering:processResources NO-SOURCE > Task :solr:api:processResources NO-SOURCE > Task :mavenLocalClean UP-TO-DATE > Task :solr:modules:extraction:processResources > Task :solr:solrj-streaming:processResources > Task :solr:solrj:processResources > Task :solr:test-framework:processResources > Task :solr:core:processResources > Task :solr:api:generatePomFileForJarsPublication > Task :solr:modules:gcs-repository:processResources NO-SOURCE > Task :gitStatus > Task :errorProneSkipped WARNING: errorprone disabled (skipped on builds not running inside CI environments, pass -Pvalidation.errorprone=true to enable) > Task :solr:solrj-streaming:generatePomFileForJarsPublication > Task :solr:modules:hadoop-auth:processResources NO-SOURCE > Task :solr:solrj:generatePomFileForJarsPublication > Task :solr:modules:hdfs:processResources NO-SOURCE > Task :solr:solrj-zookeeper:generatePomFileForJarsPublication > Task :solr:modules:jaegertracer-configurator:processResources NO-SOURCE > Task :solr:modules:clustering:generatePomFileForJarsPublication > Task :solr:modules:jwt-auth:processResources NO-SOURCE > Task :solr:prometheus-exporter:generatePomFileForJarsPublication > Task :solr:modules:analytics:generatePomFileForJarsPublication > Task :solr:modules:analysis-extras:generatePomFileForJarsPublication > Task :solr:modules:ltr:processResources NO-SOURCE > Task :solr:modules:opentelemetry:processResources NO-SOURCE > Task :solr:modules:langid:processResources > Task :solr:modules:jaegertracer-configurator:generatePomFileForJarsPublication > Task :solr:modules:s3-repository:processResources NO-SOURCE > Task :solr:modules:ltr:generatePomFileForJarsPublication > Task :solr:modules:scripting:processResources NO-SOURCE > Task :solr:modules:hdfs:generatePomFileForJarsPublication > Task :solr:test-framework:generatePomFileForJarsPublication > Task :solr:modules:sql:processResources > Task :solr:modules:jwt-auth:generatePomFileForJarsPublication > Task :solr:docker:createBodySnippetDockerfile > Task :solr:docker:createDockerfileLocal > Task :solr:modules:langid:generatePomFileForJarsPublication > Task :solr:webapp:processResources NO-SOURCE > Task :solr:distribution:prepareGitRev > Task :solr:docker:assemblePackaging > Task :solr:modules:opentelemetry:generatePomFileForJarsPublication > Task :solr:modules:hadoop-auth:generatePomFileForJarsPublication > Task :solr-missing-doclet:compileJava > Task :solr-missing-doclet:processResources NO-SOURCE > Task :solr-missing-doclet:classes > Task :solr-missing-doclet:jar > Task :solr:modules:scripting:generatePomFileForJarsPublication > Task :solr:core:generatePomFileForJarsPublication > Task :solr:modules:extraction:generatePomFileForJarsPublication > 
Task :solr:modules:gcs-repository:generatePomFileForJarsPublication > Task :solr:modules:sql:generatePomFileForJarsPublication > Task :solr:modules:s3-repository:generatePomFileForJarsPublication > Task :solr:api:compileJava > Task :solr:api:classes > Task :solr:api:jar > Task :solr:api:generateMetadataFileForJarsPublication SKIPPED > Task :solr:api:renderJavadoc > Task :solr:api:javadoc SKIPPED > Task :solr:api:javadocJar > Task :solr:api:sourcesJar > Task :solr:api:publishJarsPublicationToBuildRepository > Task :solr:documentation:changesToHtml > Task :solr:api:resolve SLF4J(W): No SLF4J providers were found. SLF4J(W): Defaulting to no-operation (NOP) logger implementation SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details. > Task :solr:documentation:copyDocumentationAssets > Task :solr:documentation:markdownToHtml > Task :solr:documentation:copyChangesToHtmlForMiniSite > Task :solr:documentation:copyMiniDocumentationAssets > Task :solr:solrj:openApiGenerate ################################################################################ # Thanks for using OpenAPI Generator. # # Please consider donation to help us maintain this project 🙏 # # https://opencollective.com/openapi_generator/donate # ################################################################################ Successfully generated code to /build/solr/src/solr-9.4.1/solr/solrj/build/generated/ > Task :spotlessInternalRegisterDependencies > Task :solr:documentation:createMiniDocumentationIndex > Task :solr:documentation:documentationMinimal > Task :solr:solrj:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:prometheus-exporter:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:solrj:generatedSpotlessHelper > Task :solr:solrj:generatedSpotless > Task :solr:solrj-zookeeper:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:solrj-zookeeper:classes > Task :solr:prometheus-exporter:classes > Task :solr:solrj-zookeeper:jar > Task :solr:solrj-zookeeper:generateMetadataFileForJarsPublication SKIPPED > Task :solr:prometheus-exporter:jar > Task :solr:prometheus-exporter:generateMetadataFileForJarsPublication SKIPPED > Task :solr:solrj-zookeeper:sourcesJar > Task :solr:prometheus-exporter:sourcesJar > Task :solr:solrj-streaming:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. 
> Task :solr:solrj-streaming:classes > Task :solr:solrj-streaming:jar > Task :solr:solrj-streaming:generateMetadataFileForJarsPublication SKIPPED > Task :solr:solrj-streaming:sourcesJar > Task :solr:solrj:renderJavadoc > Task :solr:prometheus-exporter:assemblePackaging > Task :solr:core:compileJava > Task :solr:solrj:classes > Task :solr:solrj:jar > Task :solr:solrj:generateMetadataFileForJarsPublication SKIPPED > Task :solr:solrj:javadoc SKIPPED > Task :solr:solrj:javadocJar > Task :solr:solrj:sourcesJar > Task :solr:solrj:publishJarsPublicationToBuildRepository > Task :solr:solrj-streaming:renderJavadoc > Task :solr:solrj-zookeeper:renderJavadoc > Task :solr:prometheus-exporter:renderJavadoc > Task :solr:prometheus-exporter:javadoc SKIPPED > Task :solr:prometheus-exporter:javadocJar > Task :solr:prometheus-exporter:publishJarsPublicationToBuildRepository > Task :solr:solrj-zookeeper:javadoc SKIPPED > Task :solr:solrj-zookeeper:javadocJar > Task :solr:solrj-zookeeper:publishJarsPublicationToBuildRepository > Task :solr:solrj-streaming:javadoc SKIPPED > Task :solr:solrj-streaming:javadocJar > Task :solr:solrj-streaming:publishJarsPublicationToBuildRepository > Task :solr:core:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:core:classes > Task :solr:core:jar > Task :solr:core:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:langid:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:jwt-auth:compileJava > Task :solr:modules:jwt-auth:classes > Task :solr:modules:analysis-extras:compileJava Note: /build/solr/src/solr-9.4.1/solr/modules/analysis-extras/src/java/org/apache/solr/schema/ICUCollationField.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:jwt-auth:jar > Task :solr:modules:jwt-auth:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:jwt-auth:sourcesJar > Task :solr:modules:clustering:compileJava > Task :solr:modules:clustering:classes > Task :solr:modules:jaegertracer-configurator:compileJava > Task :solr:modules:jaegertracer-configurator:classes > Task :solr:modules:jaegertracer-configurator:jar > Task :solr:modules:jaegertracer-configurator:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:clustering:jar > Task :solr:modules:clustering:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:jaegertracer-configurator:sourcesJar > Task :solr:modules:clustering:sourcesJar > Task :solr:modules:langid:classes > Task :solr:test-framework:compileJava > Task :solr:modules:extraction:compileJava > Task :solr:modules:gcs-repository:compileJava > Task :solr:modules:analytics:compileJava > Task :solr:modules:hdfs:compileJava > Task :solr:modules:hadoop-auth:compileJava > Task :solr:core:renderJavadoc > Task :solr:modules:langid:jar > Task :solr:modules:langid:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:langid:sourcesJar > Task :solr:modules:analysis-extras:classes > Task :solr:modules:analysis-extras:jar > Task :solr:modules:analysis-extras:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:analysis-extras:sourcesJar > Task :solr:modules:hadoop-auth:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. 
> Task :solr:modules:ltr:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:ltr:classes > Task :solr:modules:gcs-repository:compileJava Note: /build/solr/src/solr-9.4.1/solr/modules/gcs-repository/src/java/org/apache/solr/gcs/GCSConfigParser.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:ltr:jar > Task :solr:modules:ltr:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:scripting:compileJava Note: /build/solr/src/solr-9.4.1/solr/modules/scripting/src/java/org/apache/solr/scripting/xslt/XSLTUpdateRequestHandler.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:ltr:sourcesJar > Task :solr:webapp:compileJava NO-SOURCE > Task :solr:webapp:classes UP-TO-DATE > Task :solr:modules:hdfs:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:opentelemetry:compileJava > Task :solr:modules:s3-repository:compileJava > Task :solr:modules:opentelemetry:classes > Task :solr:test-framework:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:analytics:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:analytics:classes > Task :solr:modules:opentelemetry:jar > Task :solr:modules:opentelemetry:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:sql:compileJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:opentelemetry:sourcesJar > Task :solr:modules:analytics:jar > Task :solr:modules:analytics:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:analytics:sourcesJar > Task :solr:modules:analytics:assemblePackaging > Task :solr:modules:clustering:assemblePackaging > Task :solr:modules:extraction:compileJava Note: /build/solr/src/solr-9.4.1/solr/modules/extraction/src/java/org/apache/solr/handler/extraction/ExtractingDocumentLoader.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. 
> Task :solr:modules:scripting:classes > Task :solr:modules:scripting:jar > Task :solr:modules:scripting:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:scripting:sourcesJar > Task :solr:modules:jwt-auth:assemblePackaging > Task :solr:modules:langid:assemblePackaging > Task :solr:modules:ltr:assemblePackaging > Task :solr:modules:analysis-extras:assemblePackaging > Task :solr:modules:scripting:assemblePackaging > Task :solr:modules:s3-repository:classes > Task :solr:modules:s3-repository:jar > Task :solr:modules:s3-repository:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:s3-repository:sourcesJar > Task :solr:modules:s3-repository:assemblePackaging > Task :solr:modules:gcs-repository:classes > Task :solr:modules:gcs-repository:jar > Task :solr:modules:gcs-repository:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:gcs-repository:sourcesJar > Task :solr:webapp:war > Task :solr:modules:jaegertracer-configurator:assemblePackaging > Task :solr:server:assemblePackaging > Task :solr:modules:opentelemetry:assemblePackaging > Task :solr:test-framework:classes > Task :solr:modules:hadoop-auth:classes > Task :solr:test-framework:jar > Task :solr:test-framework:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:hadoop-auth:jar > Task :solr:modules:hadoop-auth:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:hadoop-auth:sourcesJar > Task :solr:test-framework:sourcesJar > Task :solr:modules:gcs-repository:assemblePackaging > Task :solr:modules:sql:classes > Task :solr:modules:sql:jar > Task :solr:modules:sql:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:sql:sourcesJar > Task :solr:modules:hadoop-auth:assemblePackaging > Task :solr:modules:sql:assemblePackaging > Task :solr:modules:extraction:classes > Task :solr:modules:extraction:jar > Task :solr:modules:extraction:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:extraction:sourcesJar > Task :solr:modules:hdfs:classes > Task :solr:modules:hdfs:jar > Task :solr:modules:hdfs:generateMetadataFileForJarsPublication SKIPPED > Task :solr:modules:extraction:assemblePackaging > Task :solr:modules:hdfs:sourcesJar > Task :solr:modules:hdfs:assemblePackaging > Task :solr:core:javadoc SKIPPED > Task :solr:core:javadocJar > Task :solr:core:sourcesJar > Task :solr:core:publishJarsPublicationToBuildRepository > Task :solr:modules:jaegertracer-configurator:renderJavadoc > Task :solr:modules:jaegertracer-configurator:javadoc SKIPPED > Task :solr:modules:jaegertracer-configurator:javadocJar > Task :solr:modules:jaegertracer-configurator:publishJarsPublicationToBuildRepository > Task :solr:modules:clustering:renderJavadoc > Task :solr:modules:clustering:javadoc SKIPPED > Task :solr:modules:gcs-repository:renderJavadoc > Task :solr:modules:gcs-repository:javadoc SKIPPED > Task :solr:modules:clustering:javadocJar > Task :solr:modules:gcs-repository:javadocJar > Task :solr:modules:clustering:publishJarsPublicationToBuildRepository > Task :solr:modules:gcs-repository:publishJarsPublicationToBuildRepository > Task :solr:modules:analysis-extras:renderJavadoc > Task :solr:modules:analysis-extras:javadoc SKIPPED > Task :solr:modules:analysis-extras:javadocJar > Task :solr:modules:analysis-extras:publishJarsPublicationToBuildRepository > Task :solr:modules:hadoop-auth:renderJavadoc > Task :solr:modules:hadoop-auth:javadoc SKIPPED > Task :solr:modules:hadoop-auth:javadocJar > Task :solr:modules:hadoop-auth:publishJarsPublicationToBuildRepository 
> Task :solr:core:assemblePostJar > Task :solr:example:assemblePackaging > Task :solr:modules:jwt-auth:renderJavadoc > Task :solr:modules:jwt-auth:javadoc SKIPPED > Task :solr:modules:langid:renderJavadoc > Task :solr:modules:langid:javadoc SKIPPED > Task :solr:modules:jwt-auth:javadocJar > Task :solr:modules:jwt-auth:publishJarsPublicationToBuildRepository > Task :solr:modules:langid:javadocJar > Task :solr:modules:langid:publishJarsPublicationToBuildRepository > Task :solr:modules:hdfs:renderJavadoc > Task :solr:modules:hdfs:javadoc SKIPPED > Task :solr:modules:hdfs:javadocJar > Task :solr:modules:hdfs:publishJarsPublicationToBuildRepository > Task :solr:modules:extraction:renderJavadoc > Task :solr:modules:extraction:javadoc SKIPPED > Task :solr:modules:extraction:javadocJar > Task :solr:modules:extraction:publishJarsPublicationToBuildRepository > Task :solr:test-framework:renderJavadoc > Task :solr:modules:analytics:renderJavadoc > Task :solr:modules:opentelemetry:renderJavadoc > Task :solr:modules:opentelemetry:javadoc SKIPPED > Task :solr:modules:opentelemetry:javadocJar > Task :solr:modules:opentelemetry:publishJarsPublicationToBuildRepository > Task :solr:modules:scripting:renderJavadoc > Task :solr:modules:scripting:javadoc SKIPPED > Task :solr:modules:scripting:javadocJar > Task :solr:modules:scripting:publishJarsPublicationToBuildRepository > Task :solr:modules:s3-repository:renderJavadoc > Task :solr:modules:s3-repository:javadoc SKIPPED > Task :solr:modules:s3-repository:javadocJar > Task :solr:modules:s3-repository:publishJarsPublicationToBuildRepository > Task :solr:modules:ltr:renderJavadoc > Task :solr:modules:ltr:javadoc SKIPPED > Task :solr:modules:ltr:javadocJar > Task :solr:modules:ltr:publishJarsPublicationToBuildRepository > Task :solr:modules:sql:renderJavadoc > Task :solr:modules:sql:javadoc SKIPPED > Task :solr:modules:sql:javadocJar > Task :solr:modules:sql:publishJarsPublicationToBuildRepository > Task :solr:test-framework:javadoc SKIPPED > Task :solr:modules:analytics:javadoc SKIPPED > Task :solr:test-framework:javadocJar > Task :solr:test-framework:publishJarsPublicationToBuildRepository > Task :solr:modules:analytics:javadocJar > Task :solr:modules:analytics:publishJarsPublicationToBuildRepository > Task :mavenToLocalFolder Local maven artifacts (poms, jars) created at: /build/solr/src/solr-9.4.1/build/maven-local > Task :solr:packaging:fullDistTar > Task :solr:packaging:slimDistTar > Task :solr:docker:createDockerfileOfficialFull > Task :solr:docker:createDockerfileOfficialSlim > Task :solr:distribution:computeChecksums > Task :solr:distribution:assembleRelease BUILD SUCCESSFUL in 22m 4s 210 actionable tasks: 209 executed, 1 up-to-date install: creating directory '/build/solr/src/solr-9.4.1/dist' ==> Starting check()... 
> Task :buildSrc:compileJava UP-TO-DATE > Task :buildSrc:compileGroovy NO-SOURCE > Task :buildSrc:processResources NO-SOURCE > Task :buildSrc:classes UP-TO-DATE > Task :buildSrc:jar UP-TO-DATE > Task :buildSrc:assemble UP-TO-DATE > Task :buildSrc:compileTestJava NO-SOURCE > Task :buildSrc:compileTestGroovy NO-SOURCE > Task :buildSrc:processTestResources NO-SOURCE > Task :buildSrc:testClasses UP-TO-DATE > Task :buildSrc:test NO-SOURCE > Task :buildSrc:check UP-TO-DATE > Task :buildSrc:build UP-TO-DATE > Task :solr:prometheus-exporter:processResources NO-SOURCE > Task :solr:solrj-zookeeper:processResources NO-SOURCE > Task :solr:server:processResources NO-SOURCE > Task :solr:solr-ref-guide:processResources NO-SOURCE > Task :solr:api:processResources NO-SOURCE > Task :solr:modules:analysis-extras:processResources NO-SOURCE > Task :gitStatus > Task :solr:server:processTestResources NO-SOURCE > Task :solr:core:processResources UP-TO-DATE > Task :solr:test-framework:processResources UP-TO-DATE > Task :errorProneSkipped WARNING: errorprone disabled (skipped on builds not running inside CI environments, pass -Pvalidation.errorprone=true to enable) > Task :solr:server:compileJava SKIPPED > Task :solr:server:classes UP-TO-DATE > Task :ensureJdkSupported > Task :randomizationInfo Running tests with randomization seed: tests.seed=246C98A4C257C021 > Task :solr:server:compileTestJava NO-SOURCE > Task :solr:server:testClasses UP-TO-DATE > Task :solr:solrj:processResources UP-TO-DATE > Task :solr:solrj-streaming:processResources UP-TO-DATE > Task :solr:webapp:processResources NO-SOURCE > Task :solr:server:test NO-SOURCE > Task :solr:webapp:processTestResources NO-SOURCE > Task :solr:modules:sql:processResources UP-TO-DATE > Task :solr:api:processTestResources > Task :solr:modules:analytics:processResources NO-SOURCE > Task :solr:server:wipeTaskTemp > Task :solr:modules:clustering:processResources NO-SOURCE > Task :solr:solrj-zookeeper:processTestResources > Task :solr:prometheus-exporter:processTestResources > Task :solr:modules:gcs-repository:processResources NO-SOURCE > Task :solr:modules:clustering:processTestResources > Task :solr:benchmark:processResources > Task :solr:modules:analysis-extras:processTestResources > Task :solr:modules:hdfs:processResources NO-SOURCE > Task :solr:modules:gcs-repository:processTestResources > Task :solr:modules:jaegertracer-configurator:processResources NO-SOURCE > Task :solr:solrj:processTestResources > Task :solr:modules:jaegertracer-configurator:processTestResources > Task :solr:modules:hadoop-auth:processResources NO-SOURCE > Task :solr:modules:extraction:processResources UP-TO-DATE > Task :solr:modules:jwt-auth:processResources NO-SOURCE > Task :solr:modules:langid:processResources UP-TO-DATE > Task :solr:solrj-streaming:processTestResources > Task :solr:solr-ref-guide:copySolrjTestResources > Task :solr:benchmark:processTestResources > Task :solr:modules:ltr:processResources NO-SOURCE > Task :solr:modules:opentelemetry:processResources NO-SOURCE > Task :solr:solr-ref-guide:compileJava NO-SOURCE > Task :solr:solr-ref-guide:classes UP-TO-DATE > Task :solr:modules:langid:processTestResources > Task :solr:test-framework:processTestResources > Task :solr:modules:s3-repository:processResources NO-SOURCE > Task :solr:modules:jwt-auth:processTestResources > Task :solr:solr-ref-guide:processTestResources NO-SOURCE > Task :solr:modules:analytics:processTestResources > Task :solr:api:compileJava UP-TO-DATE > Task :solr:api:classes UP-TO-DATE > Task 
:solr:modules:extraction:processTestResources > Task :solr:modules:scripting:processResources NO-SOURCE > Task :solr:modules:opentelemetry:processTestResources > Task :solr:modules:s3-repository:processTestResources > Task :solr:api:jar UP-TO-DATE > Task :solr:api:resolve UP-TO-DATE > Task :solr:modules:scripting:processTestResources > Task :solr:modules:ltr:processTestResources > Task :solr:modules:hadoop-auth:copySolrCoreTestResources > Task :solr:modules:hdfs:copySolrCoreTestResources > Task :solr:modules:hdfs:processTestResources > Task :solr:core:processTestResources > Task :solr:modules:sql:copySolrCoreTestResources > Task :solr:modules:hadoop-auth:processTestResources > Task :solr:modules:sql:processTestResources > Task :solr:solrj:openApiGenerate ################################################################################ # Thanks for using OpenAPI Generator. # # Please consider donation to help us maintain this project 🙏 # # https://opencollective.com/openapi_generator/donate # ################################################################################ Successfully generated code to /build/solr/src/solr-9.4.1/solr/solrj/build/generated/ > Task :spotlessInternalRegisterDependencies UP-TO-DATE > Task :solr:solrj:compileJava UP-TO-DATE > Task :solr:solrj-streaming:compileJava UP-TO-DATE > Task :solr:solrj-streaming:classes UP-TO-DATE > Task :solr:prometheus-exporter:compileJava UP-TO-DATE > Task :solr:solrj-zookeeper:compileJava UP-TO-DATE > Task :solr:prometheus-exporter:classes UP-TO-DATE > Task :solr:solrj-zookeeper:classes UP-TO-DATE > Task :solr:solrj-streaming:jar UP-TO-DATE > Task :solr:solrj-zookeeper:jar UP-TO-DATE > Task :solr:core:compileJava UP-TO-DATE > Task :solr:core:classes UP-TO-DATE > Task :solr:core:jar UP-TO-DATE > Task :solr:modules:analysis-extras:compileJava UP-TO-DATE > Task :solr:modules:sql:compileJava UP-TO-DATE > Task :solr:modules:analytics:compileJava UP-TO-DATE > Task :solr:modules:analytics:classes UP-TO-DATE > Task :solr:modules:clustering:compileJava UP-TO-DATE > Task :solr:modules:jaegertracer-configurator:compileJava UP-TO-DATE > Task :solr:modules:sql:classes UP-TO-DATE > Task :solr:modules:analysis-extras:classes UP-TO-DATE > Task :solr:modules:hdfs:compileJava UP-TO-DATE > Task :solr:modules:jwt-auth:compileJava UP-TO-DATE > Task :solr:modules:jwt-auth:classes UP-TO-DATE > Task :solr:modules:hdfs:classes UP-TO-DATE > Task :solr:modules:jaegertracer-configurator:classes UP-TO-DATE > Task :solr:modules:clustering:classes UP-TO-DATE > Task :solr:modules:analysis-extras:jar UP-TO-DATE > Task :solr:modules:sql:jar UP-TO-DATE > Task :solr:modules:langid:compileJava UP-TO-DATE > Task :solr:modules:ltr:compileJava UP-TO-DATE > Task :solr:modules:hadoop-auth:compileJava UP-TO-DATE > Task :solr:modules:scripting:compileJava UP-TO-DATE > Task :solr:modules:hadoop-auth:classes UP-TO-DATE > Task :solr:modules:langid:classes UP-TO-DATE > Task :solr:modules:ltr:classes UP-TO-DATE > Task :solr:modules:scripting:classes UP-TO-DATE > Task :solr:test-framework:compileJava UP-TO-DATE > Task :solr:test-framework:classes UP-TO-DATE > Task :solr:modules:gcs-repository:compileJava UP-TO-DATE > Task :solr:modules:opentelemetry:compileJava UP-TO-DATE > Task :solr:modules:extraction:compileJava UP-TO-DATE > Task :solr:modules:s3-repository:compileJava UP-TO-DATE > Task :solr:test-framework:jar UP-TO-DATE > Task :solr:api:compileTestJava > Task :solr:api:testClasses > Task :solr:modules:extraction:classes UP-TO-DATE > Task :solr:solrj:generatedSpotlessHelper > 
Task :solr:solrj:generatedSpotless > Task :solr:solrj:classes UP-TO-DATE > Task :solr:solrj:jar UP-TO-DATE > Task :solr:solr-ref-guide:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:solr-ref-guide:testClasses > Task :solr:solrj-zookeeper:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:test-framework:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:solrj-zookeeper:testClasses > Task :solr:modules:clustering:compileTestJava > Task :solr:test-framework:testClasses > Task :solr:prometheus-exporter:compileTestJava Note: /build/solr/src/solr-9.4.1/solr/prometheus-exporter/src/test/org/apache/solr/prometheus/exporter/MetricsQueryTemplateTest.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:prometheus-exporter:testClasses > Task :solr:modules:analysis-extras:compileTestJava Note: /build/solr/src/solr-9.4.1/solr/modules/analysis-extras/src/test/org/apache/solr/schema/TestICUCollationFieldUDVAS.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:benchmark:compileJava > Task :solr:core:compileTestJava > Task :solr:solrj-streaming:compileTestJava > Task :solr:modules:analytics:compileTestJava > Task :solr:modules:extraction:compileTestJava Note: /build/solr/src/solr-9.4.1/solr/modules/extraction/src/test/org/apache/solr/handler/extraction/ExtractingRequestHandlerTest.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:api:test > Task :solr:solr-ref-guide:test > Task :solr:solrj:compileTestJava > Task :solr:solrj-zookeeper:test > Task :solr:test-framework:test > Task :solr:webapp:compileJava NO-SOURCE > Task :solr:webapp:classes UP-TO-DATE > Task :solr:webapp:compileTestJava NO-SOURCE > Task :solr:webapp:testClasses UP-TO-DATE > Task :solr:webapp:test NO-SOURCE > Task :solr:webapp:wipeTaskTemp > Task :solr:modules:clustering:testClasses > Task :solr:prometheus-exporter:test > Task :solr:modules:analysis-extras:testClasses > Task :solr:modules:analytics:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:analytics:testClasses > Task :solr:benchmark:compileJava Note: /build/solr/src/solr-9.4.1/solr/benchmark/src/java/org/apache/solr/bench/search/StreamingSearch.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:solrj-streaming:testClasses > Task :solr:benchmark:classes > Task :solr:modules:clustering:test > Task :solr:modules:analysis-extras:test > Task :solr:modules:analytics:test > Task :solr:benchmark:compileTestJava > Task :solr:benchmark:testClasses > Task :solr:solrj:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:solrj-streaming:test > Task :solr:solrj:testClasses > Task :solr:benchmark:test > Task :solr:solrj:test > Task :solr:core:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. 
> Task :solr:core:testClasses > Task :solr:core:test > Task :solr:prometheus-exporter:test :solr:prometheus-exporter:test (SUCCESS): 23 test(s) > Task :solr:prometheus-exporter:wipeTaskTemp > Task :solr:modules:extraction:testClasses > Task :solr:benchmark:test :solr:benchmark:test (SUCCESS): 11 test(s) > Task :solr:test-framework:test :solr:test-framework:test (SUCCESS): 18 test(s) > Task :solr:modules:extraction:test > Task :solr:test-framework:wipeTaskTemp > Task :solr:modules:gcs-repository:classes UP-TO-DATE > Task :solr:modules:gcs-repository:compileTestJava Note: /build/solr/src/solr-9.4.1/solr/modules/gcs-repository/src/test/org/apache/solr/gcs/ConcurrentDelegatingStorage.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:analytics:test :solr:modules:analytics:test (SUCCESS): 472 test(s) > Task :solr:benchmark:wipeTaskTemp > Task :solr:api:test :solr:api:test (SUCCESS): 6 test(s) > Task :solr:api:wipeTaskTemp > Task :solr:modules:hadoop-auth:compileTestJava > Task :solr:modules:hdfs:compileTestJava > Task :solr:modules:analysis-extras:test :solr:modules:analysis-extras:test (SUCCESS): 31 test(s) > Task :solr:solrj-zookeeper:test :solr:solrj-zookeeper:test (SUCCESS): 16 test(s) > Task :solr:modules:analysis-extras:wipeTaskTemp > Task :solr:modules:hdfs:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:jaegertracer-configurator:compileTestJava > Task :solr:solr-ref-guide:test :solr:solr-ref-guide:test (SUCCESS): 32 test(s) > Task :solr:solr-ref-guide:wipeTaskTemp > Task :solr:modules:jwt-auth:compileTestJava > Task :solr:modules:hadoop-auth:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. 
> Task :solr:solrj-streaming:test :solr:solrj-streaming:test (SUCCESS): 608 test(s), 11 skipped > Task :solr:modules:analytics:wipeTaskTemp > Task :solr:modules:hdfs:testClasses > Task :solr:modules:gcs-repository:testClasses > Task :solr:modules:langid:compileTestJava > Task :solr:modules:hdfs:test > Task :solr:modules:gcs-repository:test > Task :solr:modules:jaegertracer-configurator:testClasses > Task :solr:core:test org.apache.solr.cloud.BasicDistributedZkTest > test FAILED org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:43017: ADDREPLICA failed to create replica at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:AC38A77E6CABADD9]:0) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) at app//org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:386) at app//org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:352) at app//org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1199) at app//org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:898) at app//org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:826) at app//org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:234) at app//org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:249) at app//org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1340) at app//org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) at app//org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566) at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at 
app//org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at app//com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at 
java.base@11.0.22/java.lang.Thread.run(Thread.java:829) org.apache.solr.cloud.BasicDistributedZkTest > classMethod FAILED java.lang.AssertionError: ObjectTracker found 2 object(s) that were not released!!! [InternalHttpClient, InternalHttpClient] org.apache.http.impl.client.InternalHttpClient:org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.http.impl.client.InternalHttpClient at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:52) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:378) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:387) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:313) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:298) at org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:180) at org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base/java.lang.Thread.run(Thread.java:829) org.apache.http.impl.client.InternalHttpClient:org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.http.impl.client.InternalHttpClient at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:52) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:378) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:387) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:313) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:298) at org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:180) at 
org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at 
org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base/java.lang.Thread.run(Thread.java:829) expected null, but was:(HttpSolrClient.java:180) at org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base/java.lang.Thread.run(Thread.java:829) org.apache.http.impl.client.InternalHttpClient:org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.http.impl.client.InternalHttpClient at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:52) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:378) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:387) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:313) at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:298) at org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:180) at org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base/java.lang.Thread.run(Thread.java:829) > at __randomizedtesting.SeedInfo.seed([246C98A4C257C021]:0) at org.junit.Assert.fail(Assert.java:89) at org.junit.Assert.failNotNull(Assert.java:756) at org.junit.Assert.assertNull(Assert.java:738) at org.apache.solr.SolrTestCase$1.afterIfSuccessful(SolrTestCase.java:100) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:37) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at 
org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base/java.lang.Thread.run(Thread.java:829) org.apache.solr.cloud.BasicDistributedZkTest > test suite's output saved to /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.BasicDistributedZkTest.txt, copied below: 2> 144389 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/build/solr/src/solr-9.4.1/solr/server/solr/configsets/_default/conf' 2> 144390 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom 2> 144394 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-8 after mutting 0 log messages 2> 144395 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-9 for ERROR logs matching regex: ignore_exception 2> 144401 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Created dataDir: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/data-dir-5-001 2> 144402 WARN (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=4 numCloses=4 2> 144404 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: 
c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true 2> 144410 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.SolrTestCaseJ4$SuppressSSL(bugUrl="https://issues.apache.org/jira/browse/SOLR-5776") 2> 144412 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: / 2> 144507 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-9 after mutting 0 log messages 2> 144507 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-10 for ERROR logs matching regex: ignore_exception 2> 144513 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER 2> 144515 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer client port: 0.0.0.0/0.0.0.0:0 2> 144515 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Starting server 2> 144526 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.z.s.ServerCnxnFactory maxCnxns is not configured, using default value 0. 2> 144614 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer start zk server on port: 42019 2> 144624 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 144632 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 144664 INFO (zkConnectionManagerCallback-229-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 144664 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 144665 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 144686 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 144699 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 144722 INFO (zkConnectionManagerCallback-231-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 144723 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 144723 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 144756 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml 2> 144784 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml 2> 144805 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml 2> 144819 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt 2> 144829 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt 2> 144841 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml 2> 144854 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml 2> 144869 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json 2> 144885 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt 2> 144897 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt 2> 144909 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt 2> 144921 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise 2> 148318 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 148319 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 148320 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 148340 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 148346 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3237e41b{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 148354 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@45540c3e{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:45297} 2> 148355 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@86332bd{STARTING}[10.0.19,sto=0] @148590ms 2> 148357 ERROR (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 148357 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 148358 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 148359 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 148360 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 148360 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:04:40.458296Z 2> 148362 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/control-001 (source: servlet config: solr.solr.home) 2> 148367 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 148385 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 148397 INFO (zkConnectionManagerCallback-233-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 148398 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 148398 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 148508 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 148510 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/control-001/solr.xml 2> 149686 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.t.SimplePropagator Always-on trace id generation enabled. 2> 149736 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42019/solr 2> 149738 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 149753 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 149765 INFO (zkConnectionManagerCallback-243-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 149765 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 149766 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 149882 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 149893 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 149904 INFO (zkConnectionManagerCallback-245-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 149905 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 150114 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 150143 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 
2> 150165 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:45297_ 2> 150171 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077803529764868-127.0.0.1:45297_-n_0000000000) starting 2> 150470 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:45297_ 2> 150473 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:45297_ 2> 150493 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 150530 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 151327 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/control-001/cores 2> 151821 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 152077 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/tempDir-001/control/data, hostPort=45297, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/control-001/cores, replicaType=NRT} 2> 152095 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 152113 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 152136 INFO (zkConnectionManagerCallback-258-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 152136 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 152137 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 152147 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 152155 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42019/solr ready 2> 152242 INFO (OverseerThreadFactory-250-thread-1) [n: c:control_collection s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection 2> 152418 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"control_collection_shard1_replica_n1", 2> "node_name":"127.0.0.1:45297_", 2> "base_url":"http://127.0.0.1:45297", 2> "collection":"control_collection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 152541 INFO (zkCallback-244-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 153] for collection [control_collection] has occurred - updating... (live nodes size: [1]) 2> 152598 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c: s: r: x:control_collection_shard1_replica_n1 t:null-109] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=control_collection&version=2&replicaType=NRT&coreNodeName=core_node2&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&shard=shard1&wt=javabin 2> 152688 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 152755 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.s.IndexSchema Schema name=test 2> 152936 WARN (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.D.s.WordDelimiterFilterFactory Solr loaded a deprecated plugin/analysis class [solr.WordDelimiterFilterFactory]. Please consult documentation how to replace it accordingly. 
2> 154564 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 154788 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 154804 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/control-001/cores/control_collection_shard1_replica_n1/data/] 2> 154830 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 154889 WARN (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 155422 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 155491 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 155491 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 155509 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 155509 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 155521 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, 
maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 155548 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 155551 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 155555 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 155573 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735481625706496 2> 155616 INFO (searcherExecutor-260-thread-1-processing-control_collection_shard1_replica_n1 null-109 core_node2 127.0.0.1:45297_ control_collection shard1) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 155618 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0} for registerTerm 2> 155619 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1 2> 155658 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 155658 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 155658 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:45297/control_collection_shard1_replica_n1/ 2> 155661 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 155664 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.SyncStrategy http://127.0.0.1:45297/control_collection_shard1_replica_n1/ has no replicas 2> 155665 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72077803529764868-core_node2-n_0000000000 2> 155690 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:45297/control_collection_shard1_replica_n1/ shard1 2> 155729 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-109] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 155757 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c: s: r: x: t:null-109] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3166 2> 155783 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s: r: x: t:null-108] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 155856 INFO (zkCallback-244-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 177] for collection [control_collection] has occurred - updating... (live nodes size: [1]) 2> 155856 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 177] for collection [control_collection] has occurred - updating... 
(live nodes size: [1]) 2> 155864 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s: r: x: t:null-108] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:45297_&wt=javabin&version=2} status=0 QTime=3679 2> 155869 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection 2> 156008 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 156015 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 156032 INFO (zkConnectionManagerCallback-269-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 156033 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 156033 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 156044 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 156052 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42019/solr ready 2> 156057 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false 2> 156098 INFO (OverseerThreadFactory-250-thread-2) [n: c:collection1 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1 2> 156101 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 156328 WARN (OverseerThreadFactory-250-thread-2) [n: c:collection1 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores. 2> 156342 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:collection1 s: r: x: t:null-110] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. 
Check all shard replicas 2> 156347 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:collection1 s: r: x: t:null-110] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&wt=javabin&version=2} status=0 QTime=277 2> 156356 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active slice count: 2 expected: 2 2> 156357 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0 2> 156357 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=4 2> 158118 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 159281 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001 of type NRT for shard2 2> 159309 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 159309 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 159310 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 159327 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 159334 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@58ad96b0{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 159339 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@27f64de8{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:40707} 2> 159341 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@46d0a8c2{STARTING}[10.0.19,sto=0] @159577ms 2> 159343 ERROR (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
2> 159344 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 159344 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 159345 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 159346 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 159346 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:04:51.444440Z 2> 159348 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001 (source: servlet config: solr.solr.home) 2> 159352 WARN (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 159370 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 159382 INFO (zkConnectionManagerCallback-272-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 159382 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 159383 WARN (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 159494 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 159496 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/solr.xml 2> 162608 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001 of type NRT for shard1 2> 162630 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 162631 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 162632 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 162651 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 162658 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@11c811bb{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 162662 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@231656cb{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:37133} 2> 162663 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@4609ed2a{STARTING}[10.0.19,sto=0] @162898ms 2> 162665 ERROR (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 162665 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 162666 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 162666 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 162667 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 162667 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:04:54.765465Z 2> 162669 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001 (source: servlet config: solr.solr.home) 2> 162672 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 162688 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 162703 INFO (zkConnectionManagerCallback-277-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 162704 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 162705 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 162815 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 162818 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/solr.xml 2> 164605 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42019/solr 2> 164608 WARN (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 164625 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 164653 INFO (zkConnectionManagerCallback-287-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 164653 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 164654 WARN (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 164771 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 164783 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 164796 INFO (zkConnectionManagerCallback-289-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 164796 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 164855 WARN (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 164867 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 164890 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 164914 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:40707_ as DOWN 2> 164934 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:40707_ 2> 164947 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) 2> 164953 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) 2> 164954 INFO (zkCallback-268-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) 2> 164969 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42019/solr 2> 164971 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 164989 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 164993 WARN (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! 
authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 165001 INFO (zkConnectionManagerCallback-298-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 165003 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 165003 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 165120 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 165133 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 165150 INFO (zkConnectionManagerCallback-300-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 165150 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 165207 WARN (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 165216 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2) 2> 165237 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 165257 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:37133_ as DOWN 2> 165268 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:37133_ 2> 165281 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 165281 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 165282 INFO (zkCallback-268-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 165284 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 165302 WARN (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 165977 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001 of type NRT for shard2 2> 166000 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 166001 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 166002 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 166020 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 166027 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@467365d4{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 166028 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores 2> 166033 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@25705235{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:32885} 2> 166035 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@761e3cbc{STARTING}[10.0.19,sto=0] @166270ms 2> 166038 ERROR (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 166038 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 166039 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 166040 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 166040 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 166041 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:04:58.139251Z 2> 166045 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001 (source: servlet config: solr.solr.home) 2> 166050 WARN (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 166074 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 166087 INFO (zkConnectionManagerCallback-309-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 166087 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 166088 WARN (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 166202 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 166205 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/solr.xml 2> 166207 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores 2> 166642 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 166901 INFO (closeThreadPool-270-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/tempDir-001/jetty1, hostPort=40707, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores, replicaType=NRT} 2> 166905 INFO (closeThreadPool-270-thread-1) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:40707_ 2> 166928 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 167163 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/tempDir-001/jetty2, hostPort=37133, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores, replicaType=NRT} 2> 167170 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:37133_ 2> 169369 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 4 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001 of type NRT for shard1 2> 169385 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty 
(configured port=0, binding port=0) 2> 169386 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 169387 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 169408 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 169414 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@74705c4d{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 169418 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@510f304{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:43017} 2> 169419 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@582d86f1{STARTING}[10.0.19,sto=0] @169654ms 2> 169421 ERROR (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 169422 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 169422 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 169422 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 169423 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 169423 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:05:01.521721Z 2> 169425 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001 (source: servlet config: solr.solr.home) 2> 169428 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 169448 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 169460 INFO (zkConnectionManagerCallback-317-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 169461 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 169461 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 169469 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42019/solr 2> 169470 WARN (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 169485 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 169496 INFO (zkConnectionManagerCallback-322-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 169497 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 169497 WARN (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 169574 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 169576 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/solr.xml 2> 169611 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 169622 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 169653 INFO (zkConnectionManagerCallback-327-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 169654 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 169731 WARN (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 169740 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3) 2> 169770 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 169796 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:32885_ as DOWN 2> 169815 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:32885_ 2> 169832 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 169832 INFO (zkCallback-268-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 169833 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 169834 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 169834 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(3) -> (4) 2> 169855 WARN (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 170658 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores 2> 171240 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 171489 INFO (closeThreadPool-270-thread-3) [n:127.0.0.1:32885_ c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/tempDir-001/jetty3, hostPort=32885, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores, replicaType=NRT} 2> 171494 INFO (closeThreadPool-270-thread-3) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:32885_ 2> 172097 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42019/solr 2> 172099 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 172113 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 172127 INFO (zkConnectionManagerCallback-339-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 172128 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 172128 WARN (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 172242 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 
2> 172253 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 172267 INFO (zkConnectionManagerCallback-341-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 172267 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 172310 WARN (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 172319 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 172339 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 172355 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:43017_ as DOWN 2> 172363 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:43017_ 2> 172375 INFO (zkCallback-268-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 172375 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 172376 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 172376 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 172376 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 172378 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 172392 WARN (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 173049 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores 2> 173590 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 173853 INFO (closeThreadPool-270-thread-2) [n:127.0.0.1:43017_ c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/tempDir-001/jetty4, hostPort=43017, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores} 2> 173856 INFO (closeThreadPool-270-thread-2) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:43017_ 2> 173973 INFO (OverseerThreadFactory-250-thread-3) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:43017_ for creating new replica of shard shard1 for collection collection1 2> 173975 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:40707_ for creating new replica of shard shard2 for collection collection1 2> 173992 INFO (OverseerThreadFactory-250-thread-3) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 173996 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 174010 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard1_replica_n1", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "collection":"collection1", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 174028 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 244] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 174037 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard2_replica_n2", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"collection1", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 174060 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c: s: r: x:collection1_shard1_replica_n1 t:null-115] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 174161 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 174201 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 248] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 174544 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.s.IndexSchema Schema name=test 2> 174553 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c: s: r: x:collection1_shard2_replica_n2 t:null-116] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection1_shard2_replica_n2&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 174649 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 174719 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.s.IndexSchema Schema name=test 2> 174806 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 175026 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 175030 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores/collection1_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores/collection1_shard1_replica_n1/data/] 2> 175039 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 175058 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: 
[TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 175087 WARN (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 175269 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_n2' using configuration from configset conf1, trusted=true 2> 175275 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/collection1_shard2_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/collection1_shard2_replica_n2/data/] 2> 175324 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 175345 WARN (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 175690 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 175750 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 175751 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 175772 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 175772 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 175786 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping 
class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 175808 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 175812 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 175818 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 175819 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735502855176192 2> 175865 INFO (searcherExecutor-351-thread-1-processing-collection1_shard1_replica_n1 null-115 core_node3 127.0.0.1:43017_ collection1 shard1) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 175871 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node3=0}, version=0} for registerTerm 2> 175872 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1 2> 175909 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 175909 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 175910 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:43017/collection1_shard1_replica_n1/ 2> 175913 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 175916 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.SyncStrategy http://127.0.0.1:43017/collection1_shard1_replica_n1/ has no replicas 2> 175916 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72077803529764882-core_node3-n_0000000000 2> 175938 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:43017/collection1_shard1_replica_n1/ shard1 2> 175942 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 176013 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 176013 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 176037 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 176037 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 176052 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 176055 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 266] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 176066 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-115] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 176071 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 176074 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 176081 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 176083 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735503132000256 2> 176098 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c: s: r: x: t:null-115] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2041 2> 176130 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s: r: x: t:null-111] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:43017_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=2229 2> 176131 INFO (searcherExecutor-354-thread-1-processing-collection1_shard2_replica_n2 null-116 core_node4 127.0.0.1:40707_ collection1 shard2) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 176135 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node4=0}, version=0} for registerTerm 2> 176135 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard2 2> 176170 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 176170 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 176170 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40707/collection1_shard2_replica_n2/ 2> 176174 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 176177 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.SyncStrategy http://127.0.0.1:40707/collection1_shard2_replica_n2/ has no replicas 2> 176178 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard2/leader after winning as /collections/collection1/leader_elect/shard2/election/72077803529764874-core_node4-n_0000000000 2> 176188 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 284] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 176189 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 284] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 176202 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40707/collection1_shard2_replica_n2/ shard2 2> 176317 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 287] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 176317 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 287] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 176317 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 287] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 176327 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-116] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 176368 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c: s: r: x: t:null-116] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection1_shard2_replica_n2&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1817 2> 176394 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s: r: x: t:null-114] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:40707_&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=2493 2> 176450 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 293] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 176450 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 293] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 176450 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 293] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 176451 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 293] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 177985 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 177992 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000006 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 178008 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:32885_ for creating new replica of shard shard2 for collection collection1 2> 178011 INFO (OverseerThreadFactory-250-thread-5) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:37133_ for creating new replica of shard shard1 for collection collection1 2> 178027 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 178038 INFO (OverseerThreadFactory-250-thread-5) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 
2> 178048 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "collection":"collection1", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 178074 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard1_replica_n6", 2> "node_name":"127.0.0.1:37133_", 2> "base_url":"http://127.0.0.1:37133", 2> "collection":"collection1", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 178200 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 306] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 178200 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 306] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 178200 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 306] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 178200 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 306] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 178200 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 306] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 178243 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c: s: r: x:collection1_shard1_replica_n6 t:null-117] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection1_shard1_replica_n6&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 178244 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c: s: r: x:collection1_shard2_replica_n5 t:null-118] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&name=collection1_shard2_replica_n5&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 178338 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 178338 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 178392 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.s.IndexSchema Schema name=test 2> 178393 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.s.IndexSchema Schema name=test 2> 178400 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 310] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 178401 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 310] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 178401 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 310] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 178400 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 310] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 178593 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 178594 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 178781 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n6' using configuration from configset conf1, trusted=true 2> 178782 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_n5' using configuration from configset conf1, trusted=true 2> 178785 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores/collection1_shard2_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores/collection1_shard2_replica_n5/data/] 2> 178785 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores/collection1_shard1_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores/collection1_shard1_replica_n6/data/] 2> 178806 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 178808 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 178841 WARN (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 178841 WARN (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RequestHandlers INVALID paramSet a 
in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 179346 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 179390 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 179403 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 179403 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 179424 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 179424 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 179435 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 179449 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 179452 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 179457 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 179458 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735506670944256 2> 179464 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 179464 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 179488 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.CommitTracker Hard AutoCommit: disabled 
2> 179489 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 179495 INFO (searcherExecutor-365-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 179499 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node3=0, core_node8=0}, version=1} for registerTerm 2> 179499 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1 2> 179506 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 179529 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.ZkController Core needs to recover:collection1_shard1_replica_n6 2> 179536 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 179537 INFO (updateExecutor-294-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.DefaultSolrCoreState Running recovery 2> 179539 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 179545 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 179548 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735506765316096 2> 179556 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 179557 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 179559 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c: s: r: x: t:null-117] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection1_shard1_replica_n6&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1317 2> 179583 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s: r: x: t:null-112] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:37133_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=5682 2> 179591 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-119] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=12 2> 179591 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-119] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=13 2> 179592 INFO (searcherExecutor-367-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 179596 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node7=0, core_node4=0}, version=1} for registerTerm 2> 179597 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard2 2> 179609 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1_shard1_replica_n6] 2> 179621 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null} 2> 179623 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.ZkController Core needs to recover:collection1_shard2_replica_n5 2> 179623 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard1_replica_n6] as recovering, leader is [http://127.0.0.1:43017/collection1_shard1_replica_n1/] and I am [http://127.0.0.1:37133/collection1_shard1_replica_n6/] 2> 179629 INFO (updateExecutor-318-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.DefaultSolrCoreState Running recovery 2> 179634 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 179634 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 179646 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:43017]; [WaitForState: action=PREPRECOVERY&core=collection1_shard1_replica_n1&nodeName=127.0.0.1:37133_&coreNodeName=core_node8&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 179655 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c: s: r: x: t:null-118] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&name=collection1_shard2_replica_n5&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1413 2> 179659 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x:collection1_shard1_replica_n1 t:null-120] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node8, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 179661 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-121] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=3 2> 179661 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-121] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=3 2> 179670 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Begin buffering updates. 
core=[collection1_shard2_replica_n5] 2> 179677 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 179677 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard2_replica_n5] as recovering, leader is [http://127.0.0.1:40707/collection1_shard2_replica_n2/] and I am [http://127.0.0.1:32885/collection1_shard2_replica_n5/] 2> 179688 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:40707]; [WaitForState: action=PREPRECOVERY&core=collection1_shard2_replica_n2&nodeName=127.0.0.1:32885_&coreNodeName=core_node7&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 179691 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s: r: x: t:null-113] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:32885_&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=5790 2> 179699 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x:collection1_shard2_replica_n2 t:null-122] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node7, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 179700 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 4 active replicas in collection: collection1 2> 179803 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 332] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 179803 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 332] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 179803 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 332] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 179803 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 332] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 179803 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 332] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 179803 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 332] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 179803 INFO (zkCallback-268-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 332] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 179915 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x: t:null-120] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:37133_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard1_replica_n6", 2> "node_name":"127.0.0.1:37133_", 2> "base_url":"http://127.0.0.1:37133", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179915 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-122] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179916 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-122] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179916 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x: t:null-120] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:37133_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard1_replica_n6", 2> "node_name":"127.0.0.1:37133_", 2> "base_url":"http://127.0.0.1:37133", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179918 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x: t:null-120] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:37133_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard1_replica_n6", 2> "node_name":"127.0.0.1:37133_", 2> "base_url":"http://127.0.0.1:37133", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179918 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-122] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179918 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x: t:null-120] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:37133_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard1_replica_n6", 2> "node_name":"127.0.0.1:37133_", 2> "base_url":"http://127.0.0.1:37133", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179918 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-122] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 179919 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x: t:null-120] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:37133_&onlyIfLeaderActive=true&core=collection1_shard1_replica_n1&coreNodeName=core_node8&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=261 2> 179919 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-122] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:32885_&onlyIfLeaderActive=true&core=collection1_shard2_replica_n2&coreNodeName=core_node7&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=221 2> 180013 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000005 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 180019 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000007 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 180423 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:43017/collection1_shard1_replica_n1/] - recoveringAfterStartup=[true] 2> 180423 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:40707/collection1_shard2_replica_n2/] - recoveringAfterStartup=[true] 2> 180440 WARN (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 180440 WARN (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 180441 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 180441 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 180442 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 180442 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 180442 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:43017/collection1_shard1_replica_n1/]. 2> 180442 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:40707/collection1_shard2_replica_n2/]. 
2> 180587 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-124] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 180589 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-124] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 12 2> 180589 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-123] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 180589 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-123] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 10 2> 180596 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-124] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 78 2> 180596 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-123] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 78 2> 180645 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-123] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 180645 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-124] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 180645 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-123] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 180645 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-124] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update 
params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 180649 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-123] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 194 2> 180649 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-124] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 194 2> 180701 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-125] o.a.s.c.S.Request webapp= path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=9 2> 180701 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-126] o.a.s.c.S.Request webapp= path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=9 2> 180708 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.h.IndexFetcher Leader's generation: 1 2> 180708 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.h.IndexFetcher Leader's generation: 1 2> 180709 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.h.IndexFetcher Leader's version: 0 2> 180709 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.h.IndexFetcher Leader's version: 0 2> 180709 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.h.IndexFetcher Follower's generation: 1 2> 180709 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.h.IndexFetcher Follower's generation: 1 2> 180709 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.h.IndexFetcher Follower's version: 0 2> 180709 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) 
[n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.h.IndexFetcher Follower's version: 0 2> 180713 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy No replay needed. 2> 180713 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy No replay needed. 2> 180723 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 180723 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 180723 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 180723 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 180729 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 180729 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735508003684352 2> 180729 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 
2> 180730 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735508004732928 2> 180738 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=1180.0 2> 180738 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=1103.0 2> 180739 INFO (recoveryExecutor-296-thread-1-processing-collection1_shard1_replica_n6 null-117 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-117] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=1182.0 2> 180739 INFO (recoveryExecutor-320-thread-1-processing-collection1_shard2_replica_n5 null-118 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-118] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=1105.0 2> 180846 INFO (zkCallback-268-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 340] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 180846 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 340] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 180846 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 340] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 180846 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 340] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 180846 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 340] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 180846 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 340] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 180846 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 340] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 180874 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Starting test 2> 180881 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection:collection1 failOnTimeout:true timeout:330SECONDS 2> 180888 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Recoveries finished - collection:collection1 2> 180988 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-127] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=1}, version=1} for ensureHighestTermsAreNotZero 2> 180989 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-127] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1788735508198719488)} 0 87 2> 181100 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-128] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&_version_=-1788735508352860160&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{deleteByQuery=*:* (-1788735508352860160)} 0 13 2> 181110 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-128] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node7=1, core_node4=1}, version=2} for ensureHighestTermsAreNotZero 2> 181110 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-128] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=*:* (-1788735508352860160)} 0 53 2> 181145 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-128] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&_version_=-1788735508303577088&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=*:* (-1788735508303577088)} 0 19 2> 181155 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-128] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node3=1, core_node8=1}, version=2} for ensureHighestTermsAreNotZero 2> 181156 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-128] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1788735508303577088)} 0 151 2> 181191 INFO (qtp600427849-517) [n:127.0.0.1:40707_ 
c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-129] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&wt=javabin&version=2} hits=0 status=0 QTime=5 2> 181241 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-130] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 181267 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-131] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&wt=javabin&version=2} hits=0 status=0 QTime=4 2> 181353 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-132] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&wt=javabin&version=2} hits=0 status=0 QTime=4 2> 181417 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-133] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1 (1788735508688404480)]} 0 44 2> 181527 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-134] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1 (1788735508795359232)]} 0 21 2> 181539 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-134] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[1 (1788735508795359232)]} 0 60 2> 181546 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-134] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1]} 0 83 2> 181592 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-135] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=23 2> 181607 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-136] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=5 2> 181627 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-137] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 181638 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-138] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 181705 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-139] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-139&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113758&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=8 2> 181705 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-139] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-139&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113758&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=6 2> 181722 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-139] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-139 hits=0 status=0 QTime=63 2> 181764 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-140] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-140&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113830&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 181766 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-140] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-140&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113830&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 181771 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-140] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-140 hits=0 status=0 QTime=39 2> 181820 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-141] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-141&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113886&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 181824 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-141] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-141&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113886&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=4 2> 181829 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-141] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-141 hits=0 status=0 QTime=41 2> 181873 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-142] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-142&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113940&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 
181877 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-142] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-142&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871113940&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 181882 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-142] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-142 hits=0 status=0 QTime=40 2> 181913 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-143] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1 (1788735509226323968)]} 0 20 2> 181981 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-144] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1 (1788735509294481408)]} 0 8 2> 181985 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-144] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:37133/collection1_shard1_replica_n6/&wt=javabin&version=2}{add=[1 (1788735509294481408)]} 0 28 2> 181989 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-144] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1]} 0 64 2> 182045 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-145] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=29 2> 182067 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-146] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 182086 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-147] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 182099 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-148] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 182132 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-149] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-149&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114218&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182137 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 
x:collection1_shard1_replica_n1 t:null-149] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-149&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114218&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182142 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-149] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-149 hits=0 status=0 QTime=22 2> 182172 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-150] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-150&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114254&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182187 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-150] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-150&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114254&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182191 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-150] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-150 hits=0 status=0 QTime=35 2> 182232 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-151] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-151&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114312&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182239 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-151] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-151&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114312&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182244 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-151] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-151 hits=0 status=0 QTime=31 2> 182269 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-152] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-152&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114352&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=4 2> 182293 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-152] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-152&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114352&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=8 2> 182300 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-152] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-152 hits=0 status=0 QTime=46 2> 182325 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-153] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1 (1788735509673017344)]} 0 7 2> 182426 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-154] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1 (1788735509763194880)]} 0 8 2> 182431 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-154] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:32885/collection1_shard2_replica_n5/&wt=javabin&version=2}{add=[1 (1788735509763194880)]} 0 56 2> 182435 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-154] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1]} 0 94 2> 182464 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-155] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 182484 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-156] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 182512 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-157] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 182531 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-158] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 182567 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-159] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-159&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114651&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=6 2> 182568 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-159] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-159&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114651&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182571 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-159] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-159 hits=0 status=0 QTime=19 2> 182595 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-160] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-160&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114678&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182619 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-160] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-160&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114678&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182624 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-160] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-160 hits=0 status=0 QTime=44 2> 182650 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-161] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-161&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114736&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182653 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-161] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-161&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114736&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182657 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-161] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-161 hits=0 status=0 QTime=18 2> 182690 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-162] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-162&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114767&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182692 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-162] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-162&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114767&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 
182695 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-162] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-162 hits=0 status=0 QTime=27 2> 182711 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-163] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1 (1788735510078816256)]} 0 6 2> 182747 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-164] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1 (1788735510092447744)]} 0 6 2> 182751 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-164] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1 (1788735510092447744)]} 0 32 2> 182774 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-165] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 182786 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-166] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 182804 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-167] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 182815 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-168] o.a.s.c.S.Request webapp= path=/select params={q=id:1&distrib=false&sanity_check=non_distrib_id_1_lookup&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 182862 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-169] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-169&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114935&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182863 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-169] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-169&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114935&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182868 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-169] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-169 hits=0 status=0 QTime=31 2> 182889 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-170] o.a.s.c.S.Request 
webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-170&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114976&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182892 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-170] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-170&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871114976&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182898 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-170] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-170 hits=0 status=0 QTime=20 2> 182959 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-171] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-171&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871115029&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 182959 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-171] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-171&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871115029&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 182969 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-171] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-171 hits=0 status=0 QTime=38 2> 183014 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-172] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-172&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871115076&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 183017 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-172] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-172&rows=10&version=2&q=id:1&omitHeader=false&NOW=1705871115076&sanity_check=distrib_id_1_lookup&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 183022 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-172] o.a.s.c.S.Request webapp= path=/select params={q=id:1&sanity_check=distrib_id_1_lookup&wt=javabin&version=2} rid=null-172 hits=0 status=0 QTime=44 2> 183080 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-173] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[1 (1788735510423797760)]} 0 46 2> 183130 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 
x:collection1_shard1_replica_n6 t:null-174] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1 (1788735510486712320)]} 0 13 2> 183134 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-174] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1 (1788735510486712320)]} 0 42 2> 183165 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-175] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[2 (1788735510540189696)]} 0 19 2> 183242 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-176] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[2 (1788735510577938432)]} 0 23 2> 183251 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-176] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[2 (1788735510577938432)]} 0 69 2> 183276 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-177] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[3 (1788735510668115968)]} 0 8 2> 183322 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-178] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[3 (1788735510685941760)]} 0 17 2> 183327 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-178] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[3 (1788735510685941760)]} 0 42 2> 183346 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-179] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[4 (1788735510740467712)]} 0 10 2> 183397 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-180] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[4 (1788735510758293504)]} 0 17 2> 183401 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-180] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[4 (1788735510758293504)]} 0 47 2> 183421 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-181] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[5 (1788735510817013760)]} 0 10 2> 183495 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 
s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-182] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[5 (1788735510868393984)]} 0 10 2> 183500 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-182] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[5 (1788735510868393984)]} 0 41 2> 183518 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-183] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[6 (1788735510922919936)]} 0 7 2> 183557 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-184] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[6 (1788735510939697152)]} 0 15 2> 183561 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-184] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[6 (1788735510939697152)]} 0 34 2> 183621 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-185] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[7 (1788735511016243200)]} 0 22 2> 183668 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-186] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[7 (1788735511049797632)]} 0 6 2> 183672 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-186] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[7 (1788735511049797632)]} 0 40 2> 183695 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-187] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[8 (1788735511105372160)]} 0 10 2> 183721 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-188] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[8 (1788735511125295104)]} 0 6 2> 183727 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-188] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[8 (1788735511125295104)]} 0 23 2> 183742 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-189] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[9 (1788735511157800960)]} 0 7 2> 183776 INFO (qtp558875332-573) [n:127.0.0.1:32885_ 
c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-190] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[9 (1788735511173529600)]} 0 9 2> 183782 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-190] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[9 (1788735511173529600)]} 0 32 2> 183803 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-191] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[10 (1788735511221764096)]} 0 12 2> 183845 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-192] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[10 (1788735511238541312)]} 0 7 2> 183849 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-192] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[10 (1788735511238541312)]} 0 37 2> 183869 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-193] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[11 (1788735511286775808)]} 0 12 2> 183908 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-194] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[11 (1788735511312990208)]} 0 9 2> 183913 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-194] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[11 (1788735511312990208)]} 0 30 2> 183936 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-195] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[12 (1788735511360176128)]} 0 9 2> 183975 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-196] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[12 (1788735511389536256)]} 0 7 2> 183982 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-196] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[12 (1788735511389536256)]} 0 26 2> 184001 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-197] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[13 (1788735511427284992)]} 0 9 2> 184044 INFO (qtp657049334-533) 
[n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-198] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[13 (1788735511449305088)]} 0 7 2> 184049 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-198] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[13 (1788735511449305088)]} 0 38 2> 184075 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-199] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[14 (1788735511506976768)]} 0 9 2> 184870 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-200] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[14 (1788735511524802560)]} 0 773 2> 184874 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-200] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[14 (1788735511524802560)]} 0 789 2> 184890 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-201] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[15 (1788735512362614784)]} 0 6 2> 184923 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-202] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[15 (1788735512383586304)]} 0 6 2> 184927 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-202] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[15 (1788735512383586304)]} 0 27 2> 184945 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-203] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[16 (1788735512420286464)]} 0 9 2> 184981 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-204] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[16 (1788735512440209408)]} 0 9 2> 184986 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-204] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[16 (1788735512440209408)]} 0 29 2> 185015 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-205] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[17 (1788735512491589632)]} 0 16 2> 
185056 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-206] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[17 (1788735512514658304)]} 0 11 2> 185099 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-206] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[17 (1788735512514658304)]} 0 74 2> 185120 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-207] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[100 (1788735512605884416)]} 0 12 2> 185155 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-208] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[100 (1788735512630001664)]} 0 6 2> 185160 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-208] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100 (1788735512630001664)]} 0 31 2> 185173 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-209] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[101 (1788735512660410368)]} 0 5 2> 185207 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-210] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[101 (1788735512675090432)]} 0 10 2> 185212 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-210] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[101 (1788735512675090432)]} 0 30 2> 185230 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-211] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[102 (1788735512720179200)]} 0 8 2> 185255 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-212] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[102 (1788735512733810688)]} 0 5 2> 185259 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-212] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[102 (1788735512733810688)]} 0 20 2> 185272 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-213] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update 
params={wt=javabin&version=2&CONTROL=TRUE}{add=[103 (1788735512764219392)]} 0 5 2> 185299 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-214] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[103 (1788735512780996608)]} 0 5 2> 185308 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-214] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[103 (1788735512780996608)]} 0 24 2> 185323 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-215] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[104 (1788735512816648192)]} 0 7 2> 185354 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-216] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[104 (1788735512833425408)]} 0 5 2> 185358 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-216] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[104 (1788735512833425408)]} 0 26 2> 185375 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-217] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[105 (1788735512872222720)]} 0 7 2> 185406 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-218] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[105 (1788735512892145664)]} 0 5 2> 185409 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-218] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[105 (1788735512892145664)]} 0 20 2> 185433 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-219] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[106 (1788735512931991552)]} 0 5 2> 185472 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-220] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[106 (1788735512961351680)]} 0 5 2> 185476 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-220] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[106 (1788735512961351680)]} 0 21 2> 185493 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-221] 
o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[107 (1788735512995954688)]} 0 6 2> 185518 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-222] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[107 (1788735513009586176)]} 0 5 2> 185522 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-222] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[107 (1788735513009586176)]} 0 21 2> 185538 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-223] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[108 (1788735513043140608)]} 0 6 2> 185562 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-224] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[108 (1788735513057820672)]} 0 4 2> 185566 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-224] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[108 (1788735513057820672)]} 0 18 2> 185580 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-225] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[109 (1788735513087180800)]} 0 6 2> 185604 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-226] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[109 (1788735513100812288)]} 0 4 2> 185607 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-226] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[109 (1788735513100812288)]} 0 18 2> 185620 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-227] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[110 (1788735513129123840)]} 0 5 2> 185642 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-228] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[110 (1788735513140658176)]} 0 4 2> 185645 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-228] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[110 (1788735513140658176)]} 0 18 2> 185657 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 
x:control_collection_shard1_replica_n1 t:null-229] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[111 (1788735513167921152)]} 0 5 2> 185683 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-230] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[111 (1788735513182601216)]} 0 4 2> 185686 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-230] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[111 (1788735513182601216)]} 0 21 2> 185700 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-231] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[112 (1788735513213009920)]} 0 5 2> 185771 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-232] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[112 (1788735513254952960)]} 0 7 2> 185775 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-232] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[112 (1788735513254952960)]} 0 66 2> 185791 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-233] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[113 (1788735513308430336)]} 0 6 2> 185823 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-234] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[113 (1788735513330450432)]} 0 5 2> 185830 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-234] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[113 (1788735513330450432)]} 0 30 2> 185868 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-235] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[114 (1788735513389170688)]} 0 29 2> 185915 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-236] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[114 (1788735513414336512)]} 0 7 2> 185922 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-236] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[114 (1788735513414336512)]} 0 41 2> 185944 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ 
c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-237] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[115 (1788735513468862464)]} 0 8 2> 185974 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-238] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[115 (1788735513484591104)]} 0 7 2> 185978 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-238] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[115 (1788735513484591104)]} 0 24 2> 186010 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-239] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[116 (1788735513538068480)]} 0 5 2> 186042 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-240] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[116 (1788735513560088576)]} 0 4 2> 186052 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-240] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[116 (1788735513560088576)]} 0 26 2> 186065 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-241] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[117 (1788735513595740160)]} 0 4 2> 186094 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-242] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[117 (1788735513609371648)]} 0 5 2> 186100 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-242] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[117 (1788735513609371648)]} 0 27 2> 186116 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-243] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[118 (1788735513649217536)]} 0 4 2> 186147 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-244] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[118 (1788735513662849024)]} 0 12 2> 186153 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-244] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[118 (1788735513662849024)]} 0 29 2> 186181 INFO 
(qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-245] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[119 (1788735513716326400)]} 0 20 2> 186215 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-246] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[119 (1788735513735200768)]} 0 9 2> 186219 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-246] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[119 (1788735513735200768)]} 0 27 2> 186232 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-247] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[120 (1788735513769803776)]} 0 6 2> 186257 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-248] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[120 (1788735513784483840)]} 0 4 2> 186260 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-248] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[120 (1788735513784483840)]} 0 19 2> 186273 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-249] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[121 (1788735513813843968)]} 0 5 2> 186299 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-250] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[121 (1788735513828524032)]} 0 4 2> 186305 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-250] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[121 (1788735513828524032)]} 0 23 2> 186324 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-251] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[122 (1788735513867321344)]} 0 5 2> 186352 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-252] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[122 (1788735513882001408)]} 0 5 2> 186356 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-252] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[122 
(1788735513882001408)]} 0 24 2> 186371 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-253] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[123 (1788735513916604416)]} 0 6 2> 186394 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-254] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[123 (1788735513929187328)]} 0 5 2> 186398 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-254] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[123 (1788735513929187328)]} 0 19 2> 186410 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-255] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[124 (1788735513957498880)]} 0 4 2> 186432 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-256] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[124 (1788735513970081792)]} 0 4 2> 186436 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-256] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[124 (1788735513970081792)]} 0 18 2> 186449 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-257] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[125 (1788735513997344768)]} 0 5 2> 186477 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-258] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[125 (1788735514012024832)]} 0 4 2> 186480 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-258] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[125 (1788735514012024832)]} 0 23 2> 186493 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-259] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[126 (1788735514044530688)]} 0 5 2> 186519 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-260] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[126 (1788735514057113600)]} 0 6 2> 186523 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-260] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update 
params={wt=javabin&version=2}{add=[126 (1788735514057113600)]} 0 22 2> 186541 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-261] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[127 (1788735514093813760)]} 0 9 2> 186568 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-262] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[127 (1788735514111639552)]} 0 4 2> 186572 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-262] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[127 (1788735514111639552)]} 0 21 2> 186586 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-263] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[128 (1788735514140999680)]} 0 5 2> 186612 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-264] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[128 (1788735514156728320)]} 0 5 2> 186616 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-264] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[128 (1788735514156728320)]} 0 21 2> 186629 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-265] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[129 (1788735514187137024)]} 0 5 2> 186650 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-266] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[129 (1788735514199719936)]} 0 4 2> 186654 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-266] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[129 (1788735514199719936)]} 0 18 2> 186668 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-267] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[130 (1788735514229080064)]} 0 5 2> 186691 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-268] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[130 (1788735514240614400)]} 0 5 2> 186695 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-268] 
o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[130 (1788735514240614400)]} 0 19 2> 186707 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-269] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[131 (1788735514268925952)]} 0 5 2> 186735 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-270] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[131 (1788735514284654592)]} 0 5 2> 186739 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-270] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[131 (1788735514284654592)]} 0 23 2> 186753 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-271] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[132 (1788735514317160448)]} 0 5 2> 186790 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-272] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[132 (1788735514341277696)]} 0 6 2> 186794 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-272] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[132 (1788735514341277696)]} 0 22 2> 186807 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-273] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[133 (1788735514373783552)]} 0 5 2> 186837 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-274] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[133 (1788735514389512192)]} 0 5 2> 186845 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-274] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[133 (1788735514389512192)]} 0 28 2> 186858 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-275] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[134 (1788735514427260928)]} 0 6 2> 186886 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-276] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[134 (1788735514440892416)]} 0 4 2> 186899 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 
x:collection1_shard2_replica_n2 t:null-276] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[134 (1788735514440892416)]} 0 33 2> 186914 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-277] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[135 (1788735514485981184)]} 0 5 2> 186943 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-278] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[135 (1788735514499612672)]} 0 5 2> 186947 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-278] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[135 (1788735514499612672)]} 0 25 2> 186963 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-279] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[136 (1788735514537361408)]} 0 5 2> 186995 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-280] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[136 (1788735514552041472)]} 0 4 2> 186999 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-280] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[136 (1788735514552041472)]} 0 26 2> 187014 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-281] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[137 (1788735514590838784)]} 0 5 2> 187042 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-282] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[137 (1788735514604470272)]} 0 5 2> 187048 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-282] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[137 (1788735514604470272)]} 0 26 2> 187062 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-283] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[138 (1788735514641170432)]} 0 5 2> 187112 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-284] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[138 (1788735514665287680)]} 0 19 2> 187117 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ 
c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-284] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[138 (1788735514665287680)]} 0 37 2> 187143 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-285] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[139 (1788735514725056512)]} 0 6 2> 187174 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-286] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[139 (1788735514740785152)]} 0 7 2> 187177 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-286] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[139 (1788735514740785152)]} 0 25 2> 187192 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-287] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[140 (1788735514776436736)]} 0 6 2> 187216 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-288] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[140 (1788735514791116800)]} 0 4 2> 187220 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-288] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[140 (1788735514791116800)]} 0 19 2> 187232 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-289] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[141 (1788735514819428352)]} 0 5 2> 187255 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-290] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[141 (1788735514832011264)]} 0 4 2> 187259 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-290] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[141 (1788735514832011264)]} 0 19 2> 187274 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-291] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[142 (1788735514863468544)]} 0 7 2> 187300 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-292] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[142 (1788735514877100032)]} 0 5 2> 187303 INFO 
(qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-292] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[142 (1788735514877100032)]} 0 22 2> 187317 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-293] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[143 (1788735514908557312)]} 0 5 2> 187344 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-294] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[143 (1788735514923237376)]} 0 5 2> 187347 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-294] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[143 (1788735514923237376)]} 0 22 2> 187478 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-295] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[144 (1788735515071086592)]} 0 18 2> 187528 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-296] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[144 (1788735515107786752)]} 0 6 2> 187534 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-296] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[144 (1788735515107786752)]} 0 35 2> 187548 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-297] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[145 (1788735515150778368)]} 0 4 2> 187572 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-298] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[145 (1788735515163361280)]} 0 5 2> 187576 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-298] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[145 (1788735515163361280)]} 0 21 2> 187591 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-299] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[146 (1788735515195867136)]} 0 5 2> 187625 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-300] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[146 
(1788735515213692928)]} 0 6 2> 187632 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-300] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[146 (1788735515213692928)]} 0 29 2> 187653 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-301] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[147 (1788735515254587392)]} 0 11 2> 187674 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-302] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[147 (1788735515273461760)]} 0 4 2> 187678 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-302] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[147 (1788735515273461760)]} 0 17 2> 187690 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-303] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[148 (1788735515300724736)]} 0 5 2> 187713 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-304] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[148 (1788735515313307648)]} 0 4 2> 187716 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-304] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[148 (1788735515313307648)]} 0 18 2> 187729 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-305] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[149 (1788735515340570624)]} 0 5 2> 187752 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-306] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[149 (1788735515354202112)]} 0 4 2> 187756 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-306] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[149 (1788735515354202112)]} 0 19 2> 188078 INFO (searcherExecutor-260-thread-1-processing-control_collection_shard1_replica_n1 null-307 core_node2 127.0.0.1:45297_ control_collection shard1) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-307] o.a.s.c.SolrCore Registered new searcher autowarm time: 1 ms 2> 188081 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-307] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update 
params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 317
2> 188368 INFO (searcherExecutor-365-thread-1-processing-collection1_shard1_replica_n6 null-308 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-308] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms
2> 188369 INFO (searcherExecutor-351-thread-1-processing-collection1_shard1_replica_n1 null-308 core_node3 127.0.0.1:43017_ collection1 shard1) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-308] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms
2> 188375 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-308] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 235
2> 188384 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-308] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 260
2> 188550 INFO (searcherExecutor-367-thread-1-processing-collection1_shard2_replica_n5 null-308 core_node7 127.0.0.1:32885_ collection1 shard2) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-308] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms
2> 188551 INFO (searcherExecutor-354-thread-1-processing-collection1_shard2_replica_n2 null-308 core_node4 127.0.0.1:40707_ collection1 shard2) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-308] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms
2> 188554 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-308] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 160
2> 188562 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-308] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={_stateVer_=collection1:11&waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 471
2> 188603 ERROR (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-309] o.a.s.h.RequestHandlerBase Client exception
2> => org.apache.solr.common.SolrException: Sorting on a tokenized field that is not a SortableTextField is not supported in cloud mode.
2> at org.apache.solr.handler.component.QueryComponent.prepareGrouping(QueryComponent.java:300)
2> org.apache.solr.common.SolrException: Sorting on a tokenized field that is not a SortableTextField is not supported in cloud mode.
2> at org.apache.solr.handler.component.QueryComponent.prepareGrouping(QueryComponent.java:300) ~[main/:?]
2> at org.apache.solr.handler.component.QueryComponent.prepare(QueryComponent.java:222) ~[main/:?]
2> at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:431) ~[main/:?]
2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?]
2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) [main/:?]
2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?]
2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?]
2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?]
2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?]
2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?]
2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?]
2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?]
2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?]
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19]
2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser]
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:822) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [jetty-io-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [jetty-io-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19]
2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19]
2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?]
2> 188622 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-309] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2&group.field=a_t&group=true} status=400 QTime=41
2> 188653 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes
2> 188665 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper
2> 188684 INFO (zkConnectionManagerCallback-383-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected
2> 188684 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
2> 188685 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes
2> 188700 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper...
(0) -> (5) 2> 188711 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42019/solr ready 2> 188711 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Created commonCloudSolrClient with updatesToLeaders=true and parallelUpdates=true 2> 188864 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-310] o.a.s.c.S.Request webapp= path=/select params={facet.field=t_sortable&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&facet.missing=false&rid=null-310&rows=10&version=2&f.t_sortable.facet.limit=160&q=*:*&omitHeader=false&f.t_sortable.facet.mincount=0&NOW=1705871120831&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=87 2> 188864 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-310] o.a.s.c.S.Request webapp= path=/select params={facet.field=t_sortable&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&facet.missing=false&rid=null-310&rows=10&version=2&f.t_sortable.facet.limit=160&q=*:*&omitHeader=false&f.t_sortable.facet.mincount=0&NOW=1705871120831&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=61 2> 188926 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-310] o.a.s.c.S.Request webapp= path=/select params={facet.field=t_sortable&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&facet.missing=false&rid=null-310&version=2&q=*:*&omitHeader=false&NOW=1705871120831&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&facet=false&wt=javabin} status=0 QTime=12 2> 188972 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-310] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=t_sortable&_stateVer_=collection1:11&facet.missing=false&facet=true&wt=javabin&version=2} rid=null-310 hits=67 status=0 QTime=240 2> 189021 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-311] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=t_sortable+desc&rid=null-311&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121101&isShard=true&wt=javabin} hits=35 status=0 QTime=8 2> 189033 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-311] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=t_sortable+desc&rid=null-311&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121101&isShard=true&wt=javabin} hits=32 status=0 QTime=7 2> 189054 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-311] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,eoe_sortable,t_sortable&shards.purpose=64&rid=null-311&version=2&q=*:*&omitHeader=false&NOW=1705871121101&ids=2,3,5,6,7,9&isShard=true&wt=javabin} status=0 QTime=5 2> 189057 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 
t:null-311] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,eoe_sortable,t_sortable&shards.purpose=64&rid=null-311&version=2&q=*:*&omitHeader=false&NOW=1705871121101&ids=1,13,4,10&isShard=true&wt=javabin} status=0 QTime=4 2> 189063 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-311] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&fl=*,eoe_sortable,t_sortable&sort=t_sortable+desc&wt=javabin&version=2} rid=null-311 hits=67 status=0 QTime=60 2> 189148 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-312] o.a.s.c.S.Request webapp= path=/select params={df=text&group.distributed.first=true&distrib=false&_stateVer_=collection1:11&fl=id,score&shards.purpose=2048&start=0&collection=collection1&rid=null-312&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121174&isShard=true&wt=javabin&group.field=t_sortable&group=true} status=0 QTime=54 2> 189148 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-312] o.a.s.c.S.Request webapp= path=/select params={df=text&group.distributed.first=true&distrib=false&_stateVer_=collection1:11&fl=id,score&shards.purpose=2048&start=0&collection=collection1&rid=null-312&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121174&isShard=true&wt=javabin&group.field=t_sortable&group=true} status=0 QTime=47 2> 189237 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-312] o.a.s.c.S.Request webapp= path=/select params={group.topgroups.t_sortable=to+come+to+the+aid+of+their+country.&group.topgroups.t_sortable=how+now+brown+cow&group.topgroups.t_sortable=the+quick+fox+jumped+over+the+lazy+dog&group.topgroups.t_sortable=%01&group.topgroups.t_sortable=now+is+the+time+for+all+good+men&group.topgroups.t_sortable=all+the+kings+horses+and+all+the+kings+men&group.topgroups.t_sortable=this+too+shall+pass&group.topgroups.t_sortable=An+eye+for+eye+only+ends+up+making+the+whole+world+blind.&group.topgroups.t_sortable=Great+works+are+performed,+not+by+strength,+but+by+perseverance.&group.topgroups.t_sortable=no+eggs+on+wall,+lesson+learned&df=text&distrib=false&_stateVer_=collection1:11&fl=id,score&shards.purpose=4&start=0&collection=collection1&rid=null-312&rows=10&version=2&group.distributed.second=true&q=*:*&omitHeader=false&NOW=1705871121174&isShard=true&wt=javabin&group.field=t_sortable&group=true} status=0 QTime=56 2> 189238 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-312] o.a.s.c.S.Request webapp= path=/select 
params={group.topgroups.t_sortable=to+come+to+the+aid+of+their+country.&group.topgroups.t_sortable=how+now+brown+cow&group.topgroups.t_sortable=the+quick+fox+jumped+over+the+lazy+dog&group.topgroups.t_sortable=%01&group.topgroups.t_sortable=now+is+the+time+for+all+good+men&group.topgroups.t_sortable=all+the+kings+horses+and+all+the+kings+men&group.topgroups.t_sortable=this+too+shall+pass&group.topgroups.t_sortable=An+eye+for+eye+only+ends+up+making+the+whole+world+blind.&group.topgroups.t_sortable=Great+works+are+performed,+not+by+strength,+but+by+perseverance.&group.topgroups.t_sortable=no+eggs+on+wall,+lesson+learned&df=text&distrib=false&_stateVer_=collection1:11&fl=id,score&shards.purpose=4&start=0&collection=collection1&rid=null-312&rows=10&version=2&group.distributed.second=true&q=*:*&omitHeader=false&NOW=1705871121174&isShard=true&wt=javabin&group.field=t_sortable&group=true} status=0 QTime=60 2> 189267 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-312] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-312&version=2&q=*:*&omitHeader=false&NOW=1705871121174&ids=11,1,12,13,8,10&isShard=true&wt=javabin&group.field=t_sortable} status=0 QTime=5 2> 189269 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-312] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-312&version=2&q=*:*&omitHeader=false&NOW=1705871121174&ids=2,3,5,17&isShard=true&wt=javabin&group.field=t_sortable} status=0 QTime=3 2> 189327 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-312] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&collection=collection1&wt=javabin&version=2&group.field=t_sortable&group=true} rid=null-312 status=0 QTime=251 2> 189373 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-313] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&sort=id+desc&wt=javabin&version=2} hits=35 status=0 QTime=18 2> 189400 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-314] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&sort=id+desc&wt=javabin&version=2} hits=35 status=0 QTime=5 2> 189444 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-315] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&sort=id+desc&wt=javabin&version=2} hits=32 status=0 QTime=6 2> 189476 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-316] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sanity_check=is_empty&sort=id+desc&wt=javabin&version=2} hits=32 status=0 QTime=4 2> 189509 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-317] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_ti1+desc&wt=javabin&version=2} hits=67 status=0 QTime=12 2> 189538 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 
s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-318] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_ti1+desc&rid=null-318&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121622&isShard=true&wt=javabin} hits=35 status=0 QTime=7 2> 189538 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-318] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_ti1+desc&rid=null-318&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121622&isShard=true&wt=javabin} hits=32 status=0 QTime=4 2> 189567 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-318] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871121622&ids=146,107&isShard=true&rid=null-318&wt=javabin&version=2} status=0 QTime=3 2> 189567 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-318] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871121622&ids=112,2,137,17,7,117,9,130&isShard=true&rid=null-318&wt=javabin&version=2} status=0 QTime=7 2> 189576 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-318] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_ti1+desc&wt=javabin&version=2} rid=null-318 hits=67 status=0 QTime=51 2> 189592 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-319] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_ti1+asc&wt=javabin&version=2} hits=67 status=0 QTime=4 2> 189670 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-320] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_ti1+asc&collection=collection1&rid=null-320&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121719&isShard=true&wt=javabin} hits=35 status=0 QTime=7 2> 189671 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-320] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_ti1+asc&collection=collection1&rid=null-320&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121719&isShard=true&wt=javabin} hits=32 status=0 QTime=4 2> 189703 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-320] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-320&version=2&q=*:*&omitHeader=false&NOW=1705871121719&ids=1,100,101,114,8,109&isShard=true&wt=javabin} status=0 QTime=9 2> 189707 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-320] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-320&version=2&q=*:*&omitHeader=false&NOW=1705871121719&ids=145,115,139,141&isShard=true&wt=javabin} status=0 QTime=10 2> 189714 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-320] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_ti1+asc&collection=collection1&wt=javabin&version=2} rid=null-320 hits=67 status=0 QTime=93 2> 189746 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-321] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_f1+desc&wt=javabin&version=2} hits=67 status=0 QTime=19 2> 189776 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-322] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_f1+desc&rid=null-322&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121860&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 189783 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-322] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_f1+desc&rid=null-322&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121860&isShard=true&wt=javabin} hits=32 status=0 QTime=5 2> 189806 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-322] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871121860&ids=102,109,10&isShard=true&rid=null-322&wt=javabin&version=2} status=0 QTime=3 2> 189811 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-322] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871121860&ids=144,136,116,9,119,131,142&isShard=true&rid=null-322&wt=javabin&version=2} status=0 QTime=6 2> 189818 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-322] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_f1+desc&wt=javabin&version=2} rid=null-322 hits=67 status=0 QTime=56 2> 189833 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-323] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_f1+asc&wt=javabin&version=2} hits=67 status=0 QTime=4 2> 189865 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-324] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_f1+asc&collection=collection1&rid=null-324&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121951&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 189868 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-324] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_f1+asc&collection=collection1&rid=null-324&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871121951&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 189884 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-324] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-324&version=2&q=*:*&omitHeader=false&NOW=1705871121951&ids=111,100,13,135,113,118&isShard=true&wt=javabin} status=0 QTime=4 2> 189889 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-324] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-324&version=2&q=*:*&omitHeader=false&NOW=1705871121951&ids=145,127,130,141&isShard=true&wt=javabin} status=0 QTime=6 2> 189895 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-324] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_f1+asc&collection=collection1&wt=javabin&version=2} rid=null-324 hits=67 status=0 QTime=42 2> 189929 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-325] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_tf1+desc&wt=javabin&version=2} hits=67 status=0 QTime=7 2> 190000 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-326] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tf1+desc&rid=null-326&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122080&isShard=true&wt=javabin} hits=32 status=0 QTime=5 2> 190001 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-326] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tf1+desc&rid=null-326&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122080&isShard=true&wt=javabin} hits=35 status=0 QTime=9 2> 190020 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-326] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122080&ids=17,6,117,119,142&isShard=true&rid=null-326&wt=javabin&version=2} status=0 QTime=5 2> 190024 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-326] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122080&ids=133,113,148,129,118&isShard=true&rid=null-326&wt=javabin&version=2} status=0 QTime=6 2> 190030 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-326] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_tf1+desc&wt=javabin&version=2} rid=null-326 hits=67 status=0 QTime=49 2> 190051 INFO (qtp2065109343-466) 
[n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-327] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_tf1+asc&wt=javabin&version=2} hits=67 status=0 QTime=5 2> 190080 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-328] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tf1+asc&rid=null-328&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122166&isShard=true&wt=javabin} hits=35 status=0 QTime=3 2> 190084 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-328] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tf1+asc&rid=null-328&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122166&isShard=true&wt=javabin} hits=32 status=0 QTime=4 2> 190104 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-328] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122166&ids=2,147,105,7&isShard=true&rid=null-328&wt=javabin&version=2} status=0 QTime=5 2> 190107 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-328] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122166&ids=12,101,13,124,16,109&isShard=true&rid=null-328&wt=javabin&version=2} status=0 QTime=5 2> 190117 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-328] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_tf1+asc&wt=javabin&version=2} rid=null-328 hits=67 status=0 QTime=50 2> 190140 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-329] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_d1+desc&wt=javabin&version=2} hits=67 status=0 QTime=13 2> 190187 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-330] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_d1+desc&rid=null-330&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122262&isShard=true&wt=javabin} hits=35 status=0 QTime=11 2> 190191 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-330] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_d1+desc&rid=null-330&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122262&isShard=true&wt=javabin} hits=32 status=0 QTime=4 2> 190208 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-330] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122262&ids=3,116,130&isShard=true&rid=null-330&wt=javabin&version=2} status=0 QTime=3 2> 
190216 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-330] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122262&ids=132,12,1,101,123,124,16&isShard=true&rid=null-330&wt=javabin&version=2} status=0 QTime=14 2> 190222 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-330] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_d1+desc&wt=javabin&version=2} rid=null-330 hits=67 status=0 QTime=58 2> 190234 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-331] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_d1+asc&wt=javabin&version=2} hits=67 status=0 QTime=3 2> 190258 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-332] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_d1+asc&rid=null-332&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122344&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 190261 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-332] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_d1+asc&rid=null-332&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122344&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 190280 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-332] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122344&ids=110,100,133,113,138,129&isShard=true&rid=null-332&wt=javabin&version=2} status=0 QTime=5 2> 190287 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-332] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122344&ids=103,126,127,120&isShard=true&rid=null-332&wt=javabin&version=2} status=0 QTime=4 2> 190294 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-332] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_d1+asc&wt=javabin&version=2} rid=null-332 hits=67 status=0 QTime=49 2> 190307 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-333] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_td1+desc&wt=javabin&version=2} hits=67 status=0 QTime=3 2> 190339 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-334] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_td1+desc&collection=collection1&rid=null-334&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122419&isShard=true&wt=javabin} hits=35 status=0 QTime=3 2> 190348 INFO (qtp657049334-528) 
[n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-334] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_td1+desc&collection=collection1&rid=null-334&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122419&isShard=true&wt=javabin} hits=32 status=0 QTime=8 2> 190366 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-334] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-334&version=2&q=*:*&omitHeader=false&NOW=1705871122419&ids=121,134,17,7,130,142&isShard=true&wt=javabin} status=0 QTime=5 2> 190376 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-334] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-334&version=2&q=*:*&omitHeader=false&NOW=1705871122419&ids=111,13,8,118&isShard=true&wt=javabin} status=0 QTime=8 2> 190381 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-334] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_td1+desc&collection=collection1&wt=javabin&version=2} rid=null-334 hits=67 status=0 QTime=61 2> 190395 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-335] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_td1+asc&wt=javabin&version=2} hits=67 status=0 QTime=3 2> 190421 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-336] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_td1+asc&rid=null-336&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122508&isShard=true&wt=javabin} hits=35 status=0 QTime=3 2> 190425 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-336] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_td1+asc&rid=null-336&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122508&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 190446 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-336] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122508&ids=1,16,138,107,10&isShard=true&rid=null-336&wt=javabin&version=2} status=0 QTime=3 2> 190481 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-336] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122508&ids=115,104,126,105,128&isShard=true&rid=null-336&wt=javabin&version=2} status=0 QTime=6 2> 190487 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-336] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&_stateVer_=collection1:11&sort=n_td1+asc&wt=javabin&version=2} rid=null-336 hits=67 status=0 QTime=77 2> 190536 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-337] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_l1+desc&wt=javabin&version=2} hits=67 status=0 QTime=20 2> 190589 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-338] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_l1+desc&rid=null-338&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122663&isShard=true&wt=javabin} hits=32 status=0 QTime=12 2> 190589 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-338] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_l1+desc&rid=null-338&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122663&isShard=true&wt=javabin} hits=35 status=0 QTime=12 2> 190611 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-338] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122663&ids=111,133,13,4,15,8&isShard=true&rid=null-338&wt=javabin&version=2} status=0 QTime=3 2> 190618 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-338] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122663&ids=121,112,103,119&isShard=true&rid=null-338&wt=javabin&version=2} status=0 QTime=5 2> 190625 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-338] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_l1+desc&wt=javabin&version=2} rid=null-338 hits=67 status=0 QTime=60 2> 190647 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-339] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_l1+asc&wt=javabin&version=2} hits=67 status=0 QTime=8 2> 190674 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-340] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_l1+asc&rid=null-340&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122760&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 190681 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-340] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_l1+asc&rid=null-340&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122760&isShard=true&wt=javabin} hits=35 status=0 QTime=11 2> 190696 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-340] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122760&ids=11,12,123,135,118,108,109&isShard=true&rid=null-340&wt=javabin&version=2} status=0 QTime=3 2> 190698 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-340] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122760&ids=3,126,17&isShard=true&rid=null-340&wt=javabin&version=2} status=0 QTime=3 2> 190702 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-340] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_l1+asc&wt=javabin&version=2} rid=null-340 hits=67 status=0 QTime=40 2> 190712 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-341] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_tl1+desc&wt=javabin&version=2} hits=67 status=0 QTime=2 2> 190733 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-342] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tl1+desc&collection=collection1&rid=null-342&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122820&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 190734 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-342] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tl1+desc&collection=collection1&rid=null-342&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122820&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 190745 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-342] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-342&version=2&q=*:*&omitHeader=false&NOW=1705871122820&ids=110,1,123,146,16&isShard=true&wt=javabin} status=0 QTime=3 2> 190748 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-342] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-342&version=2&q=*:*&omitHeader=false&NOW=1705871122820&ids=134,2,147,5,142&isShard=true&wt=javabin} status=0 QTime=3 2> 190752 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-342] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_tl1+desc&collection=collection1&wt=javabin&version=2} rid=null-342 hits=67 status=0 QTime=29 2> 190763 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-343] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_tl1+asc&wt=javabin&version=2} hits=67 status=0 QTime=3 2> 190788 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-344] o.a.s.c.S.Request webapp= 
path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tl1+asc&rid=null-344&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122873&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 190789 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-344] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tl1+asc&rid=null-344&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122873&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 190822 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-344] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122873&ids=3,136,137,139,9,119,130&isShard=true&rid=null-344&wt=javabin&version=2} status=0 QTime=3 2> 190824 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-344] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871122873&ids=101,8,10&isShard=true&rid=null-344&wt=javabin&version=2} status=0 QTime=3 2> 190828 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-344] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_tl1+asc&wt=javabin&version=2} rid=null-344 hits=67 status=0 QTime=53 2> 190844 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-345] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_dt1+desc&wt=javabin&version=2} hits=67 status=0 QTime=2 2> 190874 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-346] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_dt1+desc&collection=collection1&rid=null-346&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122962&isShard=true&wt=javabin} hits=35 status=0 QTime=3 2> 190877 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-346] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_dt1+desc&collection=collection1&rid=null-346&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871122962&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 190894 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-346] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-346&version=2&q=*:*&omitHeader=false&NOW=1705871122962&ids=132,100,125,114&isShard=true&wt=javabin} status=0 QTime=3 2> 190897 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-346] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-346&version=2&q=*:*&omitHeader=false&NOW=1705871122962&ids=121,147,115,117,128,130&isShard=true&wt=javabin} status=0 QTime=3 2> 190902 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-346] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_dt1+desc&collection=collection1&wt=javabin&version=2} rid=null-346 hits=67 status=0 QTime=38 2> 190924 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-347] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_dt1+asc&wt=javabin&version=2} hits=67 status=0 QTime=4 2> 190950 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-348] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_dt1+asc&rid=null-348&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123037&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 190954 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-348] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_dt1+asc&rid=null-348&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123037&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 190965 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-348] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871123037&ids=2,5,137,6,119&isShard=true&rid=null-348&wt=javabin&version=2} status=0 QTime=3 2> 190972 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-348] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871123037&ids=101,123,14,138,10&isShard=true&rid=null-348&wt=javabin&version=2} status=0 QTime=6 2> 190976 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-348] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_dt1+asc&wt=javabin&version=2} rid=null-348 hits=67 status=0 QTime=37 2> 190988 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-349] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_tdt1+desc&wt=javabin&version=2} hits=67 status=0 QTime=3 2> 191011 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-350] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tdt1+desc&collection=collection1&rid=null-350&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123096&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 191019 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-350] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tdt1+desc&collection=collection1&rid=null-350&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123096&isShard=true&wt=javabin} hits=32 status=0 QTime=5 2> 191030 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-350] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-350&version=2&q=*:*&omitHeader=false&NOW=1705871123096&ids=1,14,124,108&isShard=true&wt=javabin} status=0 QTime=3 2> 191038 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-350] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-350&version=2&q=*:*&omitHeader=false&NOW=1705871123096&ids=3,137,6,117,131,142&isShard=true&wt=javabin} status=0 QTime=9 2> 191043 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-350] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_tdt1+desc&collection=collection1&wt=javabin&version=2} rid=null-350 hits=67 status=0 QTime=45 2> 191055 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-351] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_tdt1+asc&wt=javabin&version=2} hits=67 status=0 QTime=2 2> 191071 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-352] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tdt1+asc&collection=collection1&rid=null-352&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123162&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 191074 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-352] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tdt1+asc&collection=collection1&rid=null-352&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123162&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 191085 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-352] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-352&version=2&q=*:*&omitHeader=false&NOW=1705871123162&ids=122,100,113,15,138&isShard=true&wt=javabin} status=0 QTime=3 2> 191089 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-352] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-352&version=2&q=*:*&omitHeader=false&NOW=1705871123162&ids=127,9,119,141,120&isShard=true&wt=javabin} status=0 QTime=4 2> 191094 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-352] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&_stateVer_=collection1:11&sort=n_tdt1+asc&collection=collection1&wt=javabin&version=2} rid=null-352 hits=67 status=0 QTime=30 2> 191107 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-353] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=a_i1+desc&wt=javabin&version=2} hits=67 status=0 QTime=5 2> 191131 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-354] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=a_i1+desc&collection=collection1&rid=null-354&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123216&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 191131 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-354] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=a_i1+desc&collection=collection1&rid=null-354&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123216&isShard=true&wt=javabin} hits=35 status=0 QTime=5 2> 191150 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-354] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-354&version=2&q=*:*&omitHeader=false&NOW=1705871123216&ids=2,3,5,7,9&isShard=true&wt=javabin} status=0 QTime=8 2> 191151 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-354] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-354&version=2&q=*:*&omitHeader=false&NOW=1705871123216&ids=1,12,13,8,10&isShard=true&wt=javabin} status=0 QTime=6 2> 191155 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-354] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=a_i1+desc&collection=collection1&wt=javabin&version=2} rid=null-354 hits=67 status=0 QTime=37 2> 191168 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-355] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=a_i1+asc&wt=javabin&version=2} hits=67 status=0 QTime=3 2> 191186 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-356] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=a_i1+asc&rid=null-356&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123275&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 191191 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-356] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=a_i1+asc&rid=null-356&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123275&isShard=true&wt=javabin} hits=32 status=0 QTime=5 2> 191205 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 
x:collection1_shard1_replica_n6 t:null-356] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871123275&ids=11,12,1,13,4&isShard=true&rid=null-356&wt=javabin&version=2} status=0 QTime=6 2> 191205 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-356] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871123275&ids=2,3,6,7,9&isShard=true&rid=null-356&wt=javabin&version=2} status=0 QTime=4 2> 191210 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-356] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=a_i1+asc&wt=javabin&version=2} rid=null-356 hits=67 status=0 QTime=33 2> 191239 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-357] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&fl=*,score&sort=a_i1+desc&wt=javabin&version=2} hits=67 status=0 QTime=21 2> 191260 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-358] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&sort=a_i1+desc&rid=null-358&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123348&isShard=true&wt=javabin} hits=35 status=0 QTime=4 2> 191266 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-358] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&sort=a_i1+desc&rid=null-358&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123348&isShard=true&wt=javabin} hits=32 status=0 QTime=7 2> 191284 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-358] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&rid=null-358&version=2&q=*:*&omitHeader=false&NOW=1705871123348&ids=2,3,5,7,9&isShard=true&wt=javabin} status=0 QTime=10 2> 191286 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-358] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&rid=null-358&version=2&q=*:*&omitHeader=false&NOW=1705871123348&ids=1,12,13,8,10&isShard=true&wt=javabin} status=0 QTime=9 2> 191291 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-358] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&fl=*,score&sort=a_i1+desc&wt=javabin&version=2} rid=null-358 hits=67 status=0 QTime=42 2> 191322 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-359] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&fl=*,score&sort=n_tl1+asc&wt=javabin&version=2} hits=67 status=0 QTime=5 2> 191360 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-360] 
o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&sort=n_tl1+asc&rid=null-360&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123432&isShard=true&wt=javabin} hits=35 status=0 QTime=15 2> 191360 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-360] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&sort=n_tl1+asc&rid=null-360&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123432&isShard=true&wt=javabin} hits=32 status=0 QTime=12 2> 191372 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-360] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&rid=null-360&version=2&q=*:*&omitHeader=false&NOW=1705871123432&ids=101,8,10&isShard=true&wt=javabin} status=0 QTime=3 2> 191377 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-360] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&rid=null-360&version=2&q=*:*&omitHeader=false&NOW=1705871123432&ids=3,136,137,139,9,119,130&isShard=true&wt=javabin} status=0 QTime=4 2> 191383 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-360] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&fl=*,score&sort=n_tl1+asc&wt=javabin&version=2} rid=null-360 hits=67 status=0 QTime=49 2> 191396 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-361] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&sort=n_tl1+desc&wt=javabin&version=2} hits=67 status=0 QTime=5 2> 191416 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-362] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tl1+desc&collection=collection1&rid=null-362&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123504&isShard=true&wt=javabin} hits=35 status=0 QTime=3 2> 191420 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-362] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=4&start=0&fsv=true&sort=n_tl1+desc&collection=collection1&rid=null-362&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871123504&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 191436 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-362] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-362&version=2&q=*:*&omitHeader=false&NOW=1705871123504&ids=110,1,123,146,16&isShard=true&wt=javabin} status=0 QTime=3 2> 191443 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-362] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-362&version=2&q=*:*&omitHeader=false&NOW=1705871123504&ids=134,2,147,5,142&isShard=true&wt=javabin} status=0 QTime=7 2> 191449 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-362] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&sort=n_tl1+desc&collection=collection1&wt=javabin&version=2} rid=null-362 hits=67 status=0 QTime=43 2> 191584 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-363] o.a.s.c.S.Request webapp= path=/select params={q={!func}a_i1&distrib=false&wt=javabin&version=2} hits=67 status=0 QTime=115 2> 191630 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-364] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-364&rows=10&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123717&isShard=true&wt=javabin} hits=35 status=0 QTime=3 2> 191639 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-364] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-364&rows=10&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123717&isShard=true&wt=javabin} hits=32 status=0 QTime=6 2> 191668 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-364] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-364&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123717&ids=1,12,13,8,10&isShard=true&wt=javabin} status=0 QTime=16 2> 191668 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-364] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rid=null-364&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123717&ids=2,3,5,7,9&isShard=true&wt=javabin} status=0 QTime=12 2> 191677 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-364] o.a.s.c.S.Request webapp= path=/select params={q={!func}a_i1&_stateVer_=collection1:11&collection=collection1&wt=javabin&version=2} rid=null-364 hits=67 status=0 QTime=59 2> 191707 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-365] o.a.s.c.S.Request webapp= path=/select params={q={!func}a_i1&distrib=false&fl=*,score&wt=javabin&version=2} hits=67 status=0 QTime=5 2> 191732 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-366] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-366&rows=10&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123821&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 191735 INFO (qtp1128963290-593) 
[n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-366] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-366&rows=10&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123821&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 191748 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-366] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&collection=collection1&rid=null-366&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123821&ids=1,12,13,8,10&isShard=true&wt=javabin} status=0 QTime=3 2> 191751 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-366] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&collection=collection1&rid=null-366&version=2&q={!func}a_i1&omitHeader=false&NOW=1705871123821&ids=2,3,5,7,9&isShard=true&wt=javabin} status=0 QTime=2 2> 191758 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-366] o.a.s.c.S.Request webapp= path=/select params={q={!func}a_i1&_stateVer_=collection1:11&fl=*,score&collection=collection1&wt=javabin&version=2} rid=null-366 hits=67 status=0 QTime=35 2> 191787 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-367] o.a.s.c.S.Request webapp= path=/select params={q=quick&distrib=false&wt=javabin&version=2} hits=2 status=0 QTime=13 2> 191810 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-368] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-368&rows=10&version=2&q=quick&omitHeader=false&NOW=1705871123894&isShard=true&wt=javabin} hits=1 status=0 QTime=5 2> 191838 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-368] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-368&rows=10&version=2&q=quick&omitHeader=false&NOW=1705871123894&isShard=true&wt=javabin} hits=1 status=0 QTime=30 2> 191851 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-368] o.a.s.c.S.Request webapp= path=/select params={q=quick&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871123894&ids=5&isShard=true&rid=null-368&wt=javabin&version=2} status=0 QTime=3 2> 191855 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-368] o.a.s.c.S.Request webapp= path=/select params={q=quick&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871123894&ids=4&isShard=true&rid=null-368&wt=javabin&version=2} status=0 QTime=3 2> 191859 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-368] o.a.s.c.S.Request webapp= path=/select 
params={q=quick&_stateVer_=collection1:11&wt=javabin&version=2} rid=null-368 hits=2 status=0 QTime=63 2> 191872 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-369] o.a.s.c.S.Request webapp= path=/select params={q=all&distrib=false&fl=id&start=0&wt=javabin&version=2} hits=2 status=0 QTime=5 2> 191895 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-370] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16452&start=0&fsv=true&rid=null-370&rows=10&version=2&q=all&omitHeader=false&NOW=1705871123978&isShard=true&wt=javabin} hits=0 status=0 QTime=5 2> 191899 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-370] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16452&start=0&fsv=true&rid=null-370&rows=10&version=2&q=all&omitHeader=false&NOW=1705871123978&isShard=true&wt=javabin} hits=2 status=0 QTime=6 2> 191903 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-370] o.a.s.c.S.Request webapp= path=/select params={q=all&_stateVer_=collection1:11&fl=id&start=0&wt=javabin&version=2} rid=null-370 hits=2 status=0 QTime=23 2> 191916 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-371] o.a.s.c.S.Request webapp= path=/select params={q=all&distrib=false&fl=foofoofoo&start=0&wt=javabin&version=2} hits=2 status=0 QTime=3 2> 191938 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-372] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-372&rows=10&version=2&q=all&omitHeader=false&NOW=1705871124023&isShard=true&wt=javabin} hits=0 status=0 QTime=4 2> 191941 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-372] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-372&rows=10&version=2&q=all&omitHeader=false&NOW=1705871124023&isShard=true&wt=javabin} hits=2 status=0 QTime=3 2> 191954 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-372] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=foofoofoo&fl=id&shards.purpose=64&start=0&rid=null-372&version=2&q=all&omitHeader=false&NOW=1705871124023&ids=1,8&isShard=true&wt=javabin} status=0 QTime=3 2> 191958 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-372] o.a.s.c.S.Request webapp= path=/select params={q=all&_stateVer_=collection1:11&fl=foofoofoo&start=0&wt=javabin&version=2} rid=null-372 hits=2 status=0 QTime=33 2> 191968 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-373] o.a.s.c.S.Request webapp= path=/select params={q=all&distrib=false&fl=id&start=100&wt=javabin&version=2} hits=2 status=0 QTime=3 2> 191995 INFO 
(qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-374] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16452&start=0&fsv=true&collection=collection1&rid=null-374&rows=110&version=2&q=all&omitHeader=false&NOW=1705871124074&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 191997 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-374] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16452&start=0&fsv=true&collection=collection1&rid=null-374&rows=110&version=2&q=all&omitHeader=false&NOW=1705871124074&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 192005 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-374] o.a.s.c.S.Request webapp= path=/select params={q=all&_stateVer_=collection1:11&fl=id&start=100&collection=collection1&wt=javabin&version=2} rid=null-374 hits=2 status=0 QTime=29 2> 192016 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-375] o.a.s.c.S.Request webapp= path=/select params={q=quick&distrib=false&fl=*,score&wt=javabin&version=2} hits=2 status=0 QTime=4 2> 192034 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-376] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-376&rows=10&version=2&q=quick&omitHeader=false&NOW=1705871124122&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 192037 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-376] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-376&rows=10&version=2&q=quick&omitHeader=false&NOW=1705871124122&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 192047 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-376] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&collection=collection1&rid=null-376&version=2&q=quick&omitHeader=false&NOW=1705871124122&ids=5&isShard=true&wt=javabin} status=0 QTime=3 2> 192050 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-376] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&collection=collection1&rid=null-376&version=2&q=quick&omitHeader=false&NOW=1705871124122&ids=4&isShard=true&wt=javabin} status=0 QTime=3 2> 192053 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-376] o.a.s.c.S.Request webapp= path=/select params={q=quick&_stateVer_=collection1:11&fl=*,score&collection=collection1&wt=javabin&version=2} rid=null-376 hits=2 status=0 QTime=30 2> 192065 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-377] o.a.s.c.S.Request 
webapp= path=/select params={q=all&distrib=false&fl=*,score&start=1&wt=javabin&version=2} hits=2 status=0 QTime=4 2> 192085 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-378] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-378&rows=11&version=2&q=all&omitHeader=false&NOW=1705871124173&isShard=true&wt=javabin} hits=0 status=0 QTime=2 2> 192090 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-378] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-378&rows=11&version=2&q=all&omitHeader=false&NOW=1705871124173&isShard=true&wt=javabin} hits=2 status=0 QTime=3 2> 192107 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-378] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=*,score&shards.purpose=64&start=1&collection=collection1&rid=null-378&version=2&q=all&omitHeader=false&NOW=1705871124173&ids=1&isShard=true&wt=javabin} status=0 QTime=3 2> 192111 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-378] o.a.s.c.S.Request webapp= path=/select params={q=all&_stateVer_=collection1:11&fl=*,score&start=1&collection=collection1&wt=javabin&version=2} rid=null-378 hits=2 status=0 QTime=36 2> 192121 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-379] o.a.s.c.S.Request webapp= path=/select params={q=all&distrib=false&fl=*,score&start=100&wt=javabin&version=2} hits=2 status=0 QTime=3 2> 192139 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-380] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-380&rows=110&version=2&q=all&omitHeader=false&NOW=1705871124225&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 192145 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-380] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-380&rows=110&version=2&q=all&omitHeader=false&NOW=1705871124225&isShard=true&wt=javabin} hits=2 status=0 QTime=5 2> 192149 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-380] o.a.s.c.S.Request webapp= path=/select params={q=all&_stateVer_=collection1:11&fl=*,score&start=100&wt=javabin&version=2} rid=null-380 hits=2 status=0 QTime=22 2> 192608 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-381] o.a.s.c.S.Request webapp= path=/select params={q=now+their+fox+sat+had+put&distrib=false&hl=true&fl=*,score&hl.fl=a_t&wt=javabin&version=2} hits=7 status=0 QTime=452 2> 192643 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-382] o.a.s.c.S.Request webapp= 
path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-382&rows=10&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124717&isShard=true&hl.fl=a_t&wt=javabin} hits=5 status=0 QTime=12 2> 192644 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-382] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-382&rows=10&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124717&isShard=true&hl.fl=a_t&wt=javabin} hits=2 status=0 QTime=9 2> 192723 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-382] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=true&fl=*,score&shards.purpose=192&collection=collection1&rid=null-382&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124717&ids=1,4&isShard=true&hl.fl=a_t&wt=javabin} status=0 QTime=34 2> 192727 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-382] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=true&fl=*,score&shards.purpose=192&collection=collection1&rid=null-382&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124717&ids=3,5,6,7,9&isShard=true&hl.fl=a_t&wt=javabin} status=0 QTime=39 2> 192734 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-382] o.a.s.c.S.Request webapp= path=/select params={q=now+their+fox+sat+had+put&_stateVer_=collection1:11&hl=true&fl=*,score&hl.fl=a_t&collection=collection1&wt=javabin&version=2} rid=null-382 hits=7 status=0 QTime=115 2> 192795 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-383] o.a.s.c.S.Request webapp= path=/select params={q=now+their+fox+sat+had+put&distrib=false&hl=true&fl=foofoofoo&hl.fl=a_t&wt=javabin&version=2} hits=7 status=0 QTime=37 2> 192850 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-384] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-384&rows=10&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124908&isShard=true&hl.fl=a_t&wt=javabin} hits=2 status=0 QTime=26 2> 192858 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-384] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-384&rows=10&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124908&isShard=true&hl.fl=a_t&wt=javabin} hits=5 status=0 QTime=38 2> 192898 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-384] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&hl=true&fl=foofoofoo&fl=id&shards.purpose=192&collection=collection1&rid=null-384&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124908&ids=1,4&isShard=true&hl.fl=a_t&wt=javabin} status=0 QTime=23 2> 192910 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-384] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=true&fl=foofoofoo&fl=id&shards.purpose=192&collection=collection1&rid=null-384&version=2&q=now+their+fox+sat+had+put&omitHeader=false&NOW=1705871124908&ids=3,5,6,7,9&isShard=true&hl.fl=a_t&wt=javabin} status=0 QTime=38 2> 192914 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-384] o.a.s.c.S.Request webapp= path=/select params={q=now+their+fox+sat+had+put&_stateVer_=collection1:11&hl=true&fl=foofoofoo&hl.fl=a_t&collection=collection1&wt=javabin&version=2} rid=null-384 hits=7 status=0 QTime=103 2> 192934 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-385] o.a.s.c.S.Request webapp= path=/select params={q=matchesnothing&distrib=false&fl=*,score&wt=javabin&version=2} hits=0 status=0 QTime=6 2> 192964 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-386] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-386&rows=10&version=2&q=matchesnothing&omitHeader=false&NOW=1705871125040&isShard=true&wt=javabin} hits=0 status=0 QTime=12 2> 192973 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-386] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-386&rows=10&version=2&q=matchesnothing&omitHeader=false&NOW=1705871125040&isShard=true&wt=javabin} hits=0 status=0 QTime=3 2> 192976 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-386] o.a.s.c.S.Request webapp= path=/select params={q=matchesnothing&_stateVer_=collection1:11&fl=*,score&collection=collection1&wt=javabin&version=2} rid=null-386 hits=0 status=0 QTime=34 2> 193013 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-387] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=a_t&distrib=false&rows=100&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=30 2> 193067 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-388] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-388&version=2&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871125148&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=7 2> 193068 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-388] o.a.s.c.S.Request webapp= path=/select 
params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-388&version=2&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871125148&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=11 2> 193118 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-388] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-388&version=2&q=*:*&omitHeader=false&NOW=1705871125148&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet=false&wt=javabin} status=0 QTime=7 2> 193118 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-388] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-388&version=2&q=*:*&omitHeader=false&NOW=1705871125148&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet=false&wt=javabin} status=0 QTime=11 2> 193133 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-388] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=a_t&_stateVer_=collection1:11&rows=100&facet=true&wt=javabin&version=2} rid=null-388 hits=67 status=0 QTime=83 2> 193164 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-389] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&distrib=false&rows=100&facet=true&wt=javabin&version=2&facet.sort=count} hits=67 status=0 QTime=6 2> 193214 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-390] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&collection=collection1&rows=100&rid=null-390&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125295&isShard=true&facet=true&wt=javabin&facet.sort=count} hits=32 status=0 QTime=4 2> 193214 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-390] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&collection=collection1&rows=100&rid=null-390&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125295&isShard=true&facet=true&wt=javabin&facet.sort=count} hits=35 status=0 QTime=8 2> 193235 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-390] o.a.s.c.S.Request webapp= path=/select 
params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-390&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125295&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet=false&wt=javabin&facet.sort=count} status=0 QTime=7 2> 193238 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-390] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-390&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125295&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet=false&wt=javabin&facet.sort=count} status=0 QTime=6 2> 193255 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-390] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&_stateVer_=collection1:11&collection=collection1&rows=100&facet=true&wt=javabin&version=2&facet.sort=count} rid=null-390 hits=67 status=0 QTime=58 2> 193288 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-391] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&distrib=false&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2&facet.sort=count} hits=67 status=0 QTime=8 2> 193330 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-392] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=1&rows=100&rid=null-392&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125416&isShard=true&facet=true&wt=javabin&facet.sort=count} hits=35 status=0 QTime=4 2> 193334 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-392] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=1&rows=100&rid=null-392&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125416&isShard=true&facet=true&wt=javabin&facet.sort=count} hits=32 status=0 QTime=5 2> 193354 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-392] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-392&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125416&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet.mincount=2&facet=false&wt=javabin&facet.sort=count} status=0 QTime=7 2> 193357 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-392] o.a.s.c.S.Request webapp= path=/select 
params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-392&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125416&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet.mincount=2&facet=false&wt=javabin&facet.sort=count} status=0 QTime=7 2> 193372 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-392] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&_stateVer_=collection1:11&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2&facet.sort=count} rid=null-392 hits=67 status=0 QTime=55 2> 193403 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-393] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&distrib=false&rows=100&facet=true&wt=javabin&version=2&facet.sort=index} hits=67 status=0 QTime=9 2> 193468 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-394] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-394&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125532&isShard=true&facet=true&wt=javabin&facet.sort=index} hits=35 status=0 QTime=26 2> 193474 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-394] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-394&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125532&isShard=true&facet=true&wt=javabin&facet.sort=index} hits=32 status=0 QTime=29 2> 193492 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-394] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-394&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125532&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet=false&wt=javabin&facet.sort=index} status=0 QTime=7 2> 193493 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-394] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-394&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125532&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet=false&wt=javabin&facet.sort=index} status=0 QTime=6 2> 193508 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-394] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&_stateVer_=collection1:11&rows=100&facet=true&wt=javabin&version=2&facet.sort=index} rid=null-394 hits=67 status=0 QTime=74 2> 193536 INFO 
(qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-395] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&distrib=false&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2&facet.sort=index} hits=67 status=0 QTime=5 2> 193575 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-396] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=1&collection=collection1&rows=100&rid=null-396&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125662&isShard=true&facet=true&wt=javabin&facet.sort=index} hits=35 status=0 QTime=4 2> 193577 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-396] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=1&collection=collection1&rows=100&rid=null-396&version=2&q=*:*&facet.limit=-1&f.a_t.facet.limit=-1&omitHeader=false&NOW=1705871125662&isShard=true&facet=true&wt=javabin&facet.sort=index} hits=32 status=0 QTime=3 2> 193594 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-396] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-396&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125662&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet.mincount=2&facet=false&wt=javabin&facet.sort=index} status=0 QTime=6 2> 193597 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-396] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-396&version=2&q=*:*&facet.limit=-1&omitHeader=false&NOW=1705871125662&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet.mincount=2&facet=false&wt=javabin&facet.sort=index} status=0 QTime=7 2> 193611 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-396] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=-1&facet.field=a_t&_stateVer_=collection1:11&facet.mincount=2&collection=collection1&rows=100&facet=true&wt=javabin&version=2&facet.sort=index} rid=null-396 hits=67 status=0 QTime=47 2> 193636 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-397] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=1&facet.field=a_t&distrib=false&rows=100&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=4 2> 193677 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-398] o.a.s.c.S.Request webapp= path=/select 
params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-398&version=2&q=*:*&facet.limit=1&f.a_t.facet.limit=11&omitHeader=false&NOW=1705871125764&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=3 2> 193681 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-398] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-398&version=2&q=*:*&facet.limit=1&f.a_t.facet.limit=11&omitHeader=false&NOW=1705871125764&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=3 2> 193714 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-398] o.a.s.c.S.Request webapp= path=/select params={df=text&facet.field={!terms%3D$a_t__terms}a_t&distrib=false&_stateVer_=collection1:11&shards.purpose=96&rows=100&rid=null-398&version=2&q=*:*&facet.limit=1&a_t__terms=humpti,dumpi&omitHeader=false&NOW=1705871125764&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet=true&wt=javabin} status=0 QTime=17 2> 193714 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-398] o.a.s.c.S.Request webapp= path=/select params={df=text&facet.field={!terms%3D$a_t__terms}a_t&distrib=false&_stateVer_=collection1:11&shards.purpose=96&rows=100&rid=null-398&version=2&q=*:*&facet.limit=1&a_t__terms=all,men&omitHeader=false&NOW=1705871125764&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet=true&wt=javabin} status=0 QTime=19 2> 193730 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-398] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=1&facet.field=a_t&_stateVer_=collection1:11&rows=100&facet=true&wt=javabin&version=2} rid=null-398 hits=67 status=0 QTime=64 2> 193761 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-399] o.a.s.c.S.Request webapp= path=/select params={facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&distrib=false&rows=100&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=6 2> 193808 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-400] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&collection=collection1&rows=100&rid=null-400&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871125889&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=5 2> 193810 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-400] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&collection=collection1&rows=100&rid=null-400&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871125889&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=4 2> 193827 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-400] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-400&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871125889&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet=false&wt=javabin} status=0 QTime=6 2> 193830 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-400] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-400&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871125889&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet=false&wt=javabin} status=0 QTime=7 2> 193845 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-400] o.a.s.c.S.Request webapp= path=/select params={facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&_stateVer_=collection1:11&collection=collection1&rows=100&facet=true&wt=javabin&version=2} rid=null-400 hits=67 status=0 QTime=54 2> 193869 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-401] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=a_t&distrib=false&rows=100&facet=true&wt=javabin&version=2&facet.offset=1} hits=67 status=0 QTime=4 2> 193914 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-402] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-402&version=2&q=*:*&f.a_t.facet.limit=161&omitHeader=false&NOW=1705871125999&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=4 2> 193916 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-402] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=100&rid=null-402&version=2&q=*:*&f.a_t.facet.limit=161&omitHeader=false&NOW=1705871125999&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=3 2> 193947 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-402] o.a.s.c.S.Request webapp= path=/select 
params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-402&version=2&q=*:*&omitHeader=false&NOW=1705871125999&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet=false&wt=javabin&facet.offset=1} status=0 QTime=6 2> 193953 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-402] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-402&version=2&q=*:*&omitHeader=false&NOW=1705871125999&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet=false&wt=javabin&facet.offset=1} status=0 QTime=5 2> 193976 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-402] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=a_t&_stateVer_=collection1:11&rows=100&facet=true&wt=javabin&version=2&facet.offset=1} rid=null-402 hits=67 status=0 QTime=76 2> 194001 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-403] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=a_t&distrib=false&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=4 2> 194045 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-404] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=1&rows=100&rid=null-404&version=2&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871126129&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=3 2> 194048 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-404] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=1&rows=100&rid=null-404&version=2&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871126129&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=3 2> 194073 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-404] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-404&version=2&q=*:*&omitHeader=false&NOW=1705871126129&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet.mincount=2&facet=false&wt=javabin} status=0 QTime=5 2> 194095 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-404] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-404&version=2&q=*:*&omitHeader=false&NOW=1705871126129&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet.mincount=2&facet=false&wt=javabin} status=0 QTime=24 2> 
194112 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-404] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=a_t&_stateVer_=collection1:11&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2} rid=null-404 hits=67 status=0 QTime=81 2> 194141 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-405] o.a.s.c.S.Request webapp= path=/select params={facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&facet.field=a_t&distrib=false&rows=100&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=6 2> 194185 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-406] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&collection=collection1&rows=100&rid=null-406&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871126266&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=6 2> 194196 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-406] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&collection=collection1&rows=100&rid=null-406&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871126266&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=14 2> 194218 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-406] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-406&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871126266&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet=false&wt=javabin} status=0 QTime=8 2> 194222 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-406] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-406&version=2&facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871126266&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet=false&wt=javabin} status=0 QTime=6 2> 194253 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-406] o.a.s.c.S.Request webapp= path=/select params={facet.query=quick&facet.query=all&facet.query=*:*&q=*:*&facet.field=a_t&_stateVer_=collection1:11&collection=collection1&rows=100&facet=true&wt=javabin&version=2} rid=null-406 hits=67 status=0 QTime=85 2> 194352 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-407] o.a.s.c.S.Request webapp= path=/select 
params={facet.query={!key%3Dmyquick}quick&facet.query={!key%3Dmyall+ex%3Da}all&facet.query=*:*&q=*:*&facet.field={!key%3Dmykey+ex%3Da}a_t&facet.field={!key%3Dother+ex%3Db}a_t&facet.field={!key%3Dagain+ex%3Da,b}a_t&facet.field=a_t&distrib=false&fq={!tag%3Da}id_i1:[1+TO+7]&fq={!tag%3Db}id_i1:[3+TO+9]&rows=100&facet=true&wt=javabin&version=2} hits=5 status=0 QTime=73 2> 194408 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-408] o.a.s.c.S.Request webapp= path=/select params={facet.field={!key%3Dmykey+ex%3Da}a_t&facet.field={!key%3Dother+ex%3Db}a_t&facet.field={!key%3Dagain+ex%3Da,b}a_t&facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&fq={!tag%3Da}id_i1:[1+TO+7]&fq={!tag%3Db}id_i1:[3+TO+9]&rows=100&rid=null-408&version=2&facet.query={!key%3Dmyquick}quick&facet.query={!key%3Dmyall+ex%3Da}all&facet.query=*:*&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871126463&isShard=true&facet=true&wt=javabin} hits=1 status=0 QTime=27 2> 194410 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-408] o.a.s.c.S.Request webapp= path=/select params={facet.field={!key%3Dmykey+ex%3Da}a_t&facet.field={!key%3Dother+ex%3Db}a_t&facet.field={!key%3Dagain+ex%3Da,b}a_t&facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&fq={!tag%3Da}id_i1:[1+TO+7]&fq={!tag%3Db}id_i1:[3+TO+9]&rows=100&rid=null-408&version=2&facet.query={!key%3Dmyquick}quick&facet.query={!key%3Dmyall+ex%3Da}all&facet.query=*:*&q=*:*&f.a_t.facet.limit=160&omitHeader=false&NOW=1705871126463&isShard=true&facet=true&wt=javabin} hits=4 status=0 QTime=32 2> 194424 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-408] o.a.s.c.S.Request webapp= path=/select params={facet.field={!key%3Dmykey+ex%3Da}a_t&facet.field={!key%3Dother+ex%3Db}a_t&facet.field={!key%3Dagain+ex%3Da,b}a_t&facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&fq={!tag%3Da}id_i1:[1+TO+7]&fq={!tag%3Db}id_i1:[3+TO+9]&rows=100&rid=null-408&version=2&facet.query={!key%3Dmyquick}quick&facet.query={!key%3Dmyall+ex%3Da}all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871126463&ids=4&isShard=true&facet=false&wt=javabin} status=0 QTime=3 2> 194427 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-408] o.a.s.c.S.Request webapp= path=/select params={facet.field={!key%3Dmykey+ex%3Da}a_t&facet.field={!key%3Dother+ex%3Db}a_t&facet.field={!key%3Dagain+ex%3Da,b}a_t&facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&fq={!tag%3Da}id_i1:[1+TO+7]&fq={!tag%3Db}id_i1:[3+TO+9]&rows=100&rid=null-408&version=2&facet.query={!key%3Dmyquick}quick&facet.query={!key%3Dmyall+ex%3Da}all&facet.query=*:*&q=*:*&omitHeader=false&NOW=1705871126463&ids=3,5,6,7&isShard=true&facet=false&wt=javabin} status=0 QTime=3 2> 194433 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-408] o.a.s.c.S.Request webapp= path=/select 
params={facet.query={!key%3Dmyquick}quick&facet.query={!key%3Dmyall+ex%3Da}all&facet.query=*:*&q=*:*&facet.field={!key%3Dmykey+ex%3Da}a_t&facet.field={!key%3Dother+ex%3Db}a_t&facet.field={!key%3Dagain+ex%3Da,b}a_t&facet.field=a_t&_stateVer_=collection1:11&fq={!tag%3Da}id_i1:[1+TO+7]&fq={!tag%3Db}id_i1:[3+TO+9]&rows=100&facet=true&wt=javabin&version=2} rid=null-408 hits=5 status=0 QTime=68 2> 194465 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-409] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=10&facet.field={!ex%3Dt1}SubjectTerms_mfacet&distrib=false&fq={!tag%3Dt1}SubjectTerms_mfacet:(test+1)&facet.mincount=1&facet=true&wt=javabin&version=2} hits=0 status=0 QTime=18 2> 194571 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-410] o.a.s.c.S.Request webapp= path=/select params={facet.field={!ex%3Dt1}SubjectTerms_mfacet&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&f.SubjectTerms_mfacet.facet.mincount=1&shards.purpose=16404&start=0&fsv=true&fq={!tag%3Dt1}SubjectTerms_mfacet:(test+1)&collection=collection1&rid=null-410&rows=10&version=2&q=*:*&facet.limit=10&omitHeader=false&f.SubjectTerms_mfacet.facet.limit=25&NOW=1705871126584&isShard=true&facet=true&wt=javabin} hits=0 status=0 QTime=76 2> 194572 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-410] o.a.s.c.S.Request webapp= path=/select params={facet.field={!ex%3Dt1}SubjectTerms_mfacet&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&f.SubjectTerms_mfacet.facet.mincount=1&shards.purpose=16404&start=0&fsv=true&fq={!tag%3Dt1}SubjectTerms_mfacet:(test+1)&collection=collection1&rid=null-410&rows=10&version=2&q=*:*&facet.limit=10&omitHeader=false&f.SubjectTerms_mfacet.facet.limit=25&NOW=1705871126584&isShard=true&facet=true&wt=javabin} hits=0 status=0 QTime=13 2> 194595 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-410] o.a.s.c.S.Request webapp= path=/select params={df=text&facet.field={!terms%3D$SubjectTerms_mfacet__terms+ex%3Dt1}SubjectTerms_mfacet&distrib=false&_stateVer_=collection1:11&shards.purpose=32&fq={!tag%3Dt1}SubjectTerms_mfacet:(test+1)&collection=collection1&rid=null-410&rows=0&version=2&q=*:*&facet.limit=10&omitHeader=false&NOW=1705871126584&isShard=true&facet.mincount=1&facet=true&wt=javabin&SubjectTerms_mfacet__terms=test3,mathematical+analysis,mathematical+models} hits=0 status=0 QTime=10 2> 194599 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-410] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=10&facet.field={!ex%3Dt1}SubjectTerms_mfacet&_stateVer_=collection1:11&fq={!tag%3Dt1}SubjectTerms_mfacet:(test+1)&facet.mincount=1&collection=collection1&facet=true&wt=javabin&version=2} rid=null-410 hits=0 status=0 QTime=113 2> 194612 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-411] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=ignore_exception__missing_but_valid_field_t&distrib=false&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=3 2> 194653 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 
t:null-412] o.a.s.c.S.Request webapp= path=/select params={facet.field=ignore_exception__missing_but_valid_field_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&rows=100&rid=null-412&version=2&q=*:*&f.ignore_exception__missing_but_valid_field_t.facet.limit=160&omitHeader=false&f.ignore_exception__missing_but_valid_field_t.facet.mincount=1&NOW=1705871126742&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=3 2> 194656 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-412] o.a.s.c.S.Request webapp= path=/select params={facet.field=ignore_exception__missing_but_valid_field_t&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&rows=100&rid=null-412&version=2&q=*:*&f.ignore_exception__missing_but_valid_field_t.facet.limit=160&omitHeader=false&f.ignore_exception__missing_but_valid_field_t.facet.mincount=1&NOW=1705871126742&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=3 2> 194671 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-412] o.a.s.c.S.Request webapp= path=/select params={facet.field=ignore_exception__missing_but_valid_field_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-412&version=2&q=*:*&omitHeader=false&NOW=1705871126742&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet.mincount=2&facet=false&wt=javabin} status=0 QTime=6 2> 194672 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-412] o.a.s.c.S.Request webapp= path=/select params={facet.field=ignore_exception__missing_but_valid_field_t&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-412&version=2&q=*:*&omitHeader=false&NOW=1705871126742&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet.mincount=2&facet=false&wt=javabin} status=0 QTime=5 2> 194685 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-412] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=ignore_exception__missing_but_valid_field_t&_stateVer_=collection1:11&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2} rid=null-412 hits=67 status=0 QTime=41 2> 194737 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-413] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=oddField_s&distrib=false&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=5 2> 194782 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-414] o.a.s.c.S.Request webapp= path=/select params={facet.field=oddField_s&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.oddField_s.facet.limit=160&rows=100&rid=null-414&f.oddField_s.facet.mincount=1&version=2&q=*:*&omitHeader=false&NOW=1705871126867&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=2 2> 194784 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-414] 
o.a.s.c.S.Request webapp= path=/select params={facet.field=oddField_s&df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.oddField_s.facet.limit=160&rows=100&rid=null-414&f.oddField_s.facet.mincount=1&version=2&q=*:*&omitHeader=false&NOW=1705871126867&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=5 2> 194800 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-414] o.a.s.c.S.Request webapp= path=/select params={facet.field=oddField_s&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-414&version=2&q=*:*&omitHeader=false&NOW=1705871126867&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&facet.mincount=2&facet=false&wt=javabin} status=0 QTime=6 2> 194803 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-414] o.a.s.c.S.Request webapp= path=/select params={facet.field=oddField_s&df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rows=100&rid=null-414&version=2&q=*:*&omitHeader=false&NOW=1705871126867&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&facet.mincount=2&facet=false&wt=javabin} status=0 QTime=5 2> 194816 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-414] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.field=oddField_s&_stateVer_=collection1:11&facet.mincount=2&rows=100&facet=true&wt=javabin&version=2} rid=null-414 hits=67 status=0 QTime=48 2> 194900 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-415] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&stats=true&sort=a_i1+desc&wt=javabin&version=2&stats.field=a_i1} hits=67 status=0 QTime=54 2> 194931 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-416] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=516&start=0&fsv=true&sort=a_i1+desc&rid=null-416&rows=10&version=2&q=*:*&stats=true&omitHeader=false&NOW=1705871127014&isShard=true&wt=javabin&stats.field=a_i1} hits=35 status=0 QTime=6 2> 194932 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-416] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&shards.purpose=516&start=0&fsv=true&sort=a_i1+desc&rid=null-416&rows=10&version=2&q=*:*&stats=true&omitHeader=false&NOW=1705871127014&isShard=true&wt=javabin&stats.field=a_i1} hits=32 status=0 QTime=4 2> 194949 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-416] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rid=null-416&version=2&q=*:*&stats=false&omitHeader=false&NOW=1705871127014&ids=1,12,13,8,10&isShard=true&wt=javabin&stats.field=a_i1} status=0 QTime=4 2> 194949 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-416] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&rid=null-416&version=2&q=*:*&stats=false&omitHeader=false&NOW=1705871127014&ids=2,3,5,7,9&isShard=true&wt=javabin&stats.field=a_i1} status=0 QTime=6 2> 194954 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-416] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&stats=true&sort=a_i1+desc&wt=javabin&version=2&stats.field=a_i1} rid=null-416 hits=67 status=0 QTime=38 2> 194968 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-417] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=5&facet.field=a_t&distrib=false&facet.shard.limit=5&rows=0&facet=true&wt=javabin&version=2} hits=67 status=0 QTime=4 2> 194989 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-418] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=0&rid=null-418&version=2&q=*:*&facet.limit=5&f.a_t.facet.limit=5&omitHeader=false&NOW=1705871127075&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=3 2> 194995 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-418] o.a.s.c.S.Request webapp= path=/select params={facet.field=a_t&df=text&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=0&rid=null-418&version=2&q=*:*&facet.limit=5&f.a_t.facet.limit=5&omitHeader=false&NOW=1705871127075&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=6 2> 195007 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-418] o.a.s.c.S.Request webapp= path=/select params={df=text&facet.field={!terms%3D$a_t__terms}a_t&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&shards.purpose=32&rows=0&rid=null-418&version=2&q=*:*&facet.limit=5&a_t__terms=all,men,blind,dog,egg&omitHeader=false&NOW=1705871127075&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=3 2> 195010 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-418] o.a.s.c.S.Request webapp= path=/select params={df=text&facet.field={!terms%3D$a_t__terms}a_t&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&shards.purpose=32&rows=0&rid=null-418&version=2&q=*:*&facet.limit=5&a_t__terms=humpti,dumpi,again,aid,brown&omitHeader=false&NOW=1705871127075&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=4 2> 195013 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-418] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=5&facet.field=a_t&_stateVer_=collection1:11&facet.shard.limit=5&rows=0&facet=true&wt=javabin&version=2} rid=null-418 hits=67 status=0 QTime=35 2> 195024 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-419] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=5&facet.field={!key%3D'a+b/c+\'+\}+foo'}a_t&distrib=false&facet.shard.limit=5&rows=0&facet=true&wt=javabin&version=2} hits=67 status=0 
QTime=4 2> 195040 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-420] o.a.s.c.S.Request webapp= path=/select params={facet.field={!key%3D'a+b/c+\'+\}+foo'}a_t&df=text&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=0&rid=null-420&version=2&q=*:*&facet.limit=5&f.a_t.facet.limit=5&omitHeader=false&NOW=1705871127128&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=3 2> 195043 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-420] o.a.s.c.S.Request webapp= path=/select params={facet.field={!key%3D'a+b/c+\'+\}+foo'}a_t&df=text&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&fl=id&fl=score&shards.purpose=16404&start=0&fsv=true&f.a_t.facet.mincount=0&rows=0&rid=null-420&version=2&q=*:*&facet.limit=5&f.a_t.facet.limit=5&omitHeader=false&NOW=1705871127128&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=3 2> 195054 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-420] o.a.s.c.S.Request webapp= path=/select params={df=text&facet.field={!terms%3D$'a+b/c+\'+}+foo__terms'+key%3D'a+b/c+\'+\}+foo'}a_t&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&shards.purpose=32&rows=0&rid=null-420&a+b/c+'+}+foo__terms=all,men,blind,dog,egg&version=2&q=*:*&facet.limit=5&omitHeader=false&NOW=1705871127128&isShard=true&facet=true&wt=javabin} hits=35 status=0 QTime=3 2> 195057 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-420] o.a.s.c.S.Request webapp= path=/select params={df=text&facet.field={!terms%3D$'a+b/c+\'+}+foo__terms'+key%3D'a+b/c+\'+\}+foo'}a_t&distrib=false&_stateVer_=collection1:11&facet.shard.limit=5&shards.purpose=32&rows=0&rid=null-420&a+b/c+'+}+foo__terms=humpti,dumpi,again,aid,brown&version=2&q=*:*&facet.limit=5&omitHeader=false&NOW=1705871127128&isShard=true&facet=true&wt=javabin} hits=32 status=0 QTime=3 2> 195059 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-420] o.a.s.c.S.Request webapp= path=/select params={q=*:*&facet.limit=5&facet.field={!key%3D'a+b/c+\'+\}+foo'}a_t&_stateVer_=collection1:11&facet.shard.limit=5&rows=0&facet=true&wt=javabin&version=2} rid=null-420 hits=67 status=0 QTime=29 2> 195109 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-421] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[100 (1788735523048652800)]} 0 34 2> 195157 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-422] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[100 (1788735523091644416)]} 0 16 2> 195160 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-422] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100 (1788735523091644416)]} 0 44 2> 195177 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 
t:null-423] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100 (1788735523144073216)]} 0 10 2> 195218 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-424] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[100 (1788735523176579072)]} 0 10 2> 195226 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-424] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[100 (1788735523176579072)]} 0 29 2> 195229 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-424] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100]} 0 44 2> 195238 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-425] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100 (1788735523215376384)]} 0 4 2> 195265 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-426] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[100 (1788735523235299328)]} 0 4 2> 195268 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-426] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:37133/collection1_shard1_replica_n6/&wt=javabin&version=2}{add=[100 (1788735523235299328)]} 0 15 2> 195271 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-426] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100]} 0 26 2> 195279 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-427] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100 (1788735523258368000)]} 0 3 2> 195309 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-428] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[100 (1788735523281436672)]} 0 3 2> 195312 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-428] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:32885/collection1_shard2_replica_n5/&wt=javabin&version=2}{add=[100 (1788735523281436672)]} 0 18 2> 195316 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-428] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100]} 0 30 2> 195328 INFO (qtp2065109343-470) 
[n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-429] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100 (1788735523309748224)]} 0 4 2> 195349 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-430] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[100 (1788735523321282560)]} 0 3 2> 195363 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-430] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[100 (1788735523321282560)]} 0 28 2> 195548 INFO (searcherExecutor-260-thread-1-processing-control_collection_shard1_replica_n1 null-431 core_node2 127.0.0.1:45297_ control_collection shard1) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-431] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 195557 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-431] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 179 2> 195625 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-432] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 9 2> 195633 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-432] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 33 2> 195793 INFO (searcherExecutor-365-thread-1-processing-collection1_shard1_replica_n6 null-432 core_node8 127.0.0.1:37133_ collection1 shard1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-432] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 195798 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-432] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 158 2> 195806 INFO (searcherExecutor-351-thread-1-processing-collection1_shard1_replica_n1 null-432 core_node3 127.0.0.1:43017_ collection1 shard1) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-432] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 195823 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-432] 
o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={_stateVer_=collection1:11&waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 247 2> 195844 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-433] o.a.s.c.S.Request webapp= path=/select params={q=duplicate&distrib=false&hl=true&hl.fl=a_t&wt=javabin&version=2} hits=1 status=0 QTime=12 2> 195862 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-434] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-434&rows=10&version=2&q=duplicate&omitHeader=false&NOW=1705871127949&isShard=true&hl.fl=a_t&wt=javabin} hits=0 status=0 QTime=3 2> 195866 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-434] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-434&rows=10&version=2&q=duplicate&omitHeader=false&NOW=1705871127949&isShard=true&hl.fl=a_t&wt=javabin} hits=1 status=0 QTime=4 2> 195881 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-434] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=true&shards.purpose=192&rid=null-434&version=2&q=duplicate&omitHeader=false&NOW=1705871127949&ids=100&isShard=true&hl.fl=a_t&wt=javabin} status=0 QTime=8 2> 195885 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-434] o.a.s.c.S.Request webapp= path=/select params={q=duplicate&_stateVer_=collection1:11&hl=true&hl.fl=a_t&wt=javabin&version=2} rid=null-434 hits=1 status=0 QTime=34 2> 195935 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-435] o.a.s.c.S.Request webapp= path=/select params={q=fox+duplicate+horses&distrib=false&hl=true&hl.fl=a_t&wt=javabin&version=2} hits=4 status=0 QTime=38 2> 195965 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-436] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-436&rows=10&version=2&q=fox+duplicate+horses&omitHeader=false&NOW=1705871128043&isShard=true&hl.fl=a_t&wt=javabin} hits=1 status=0 QTime=12 2> 195965 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-436] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-436&rows=10&version=2&q=fox+duplicate+horses&omitHeader=false&NOW=1705871128043&isShard=true&hl.fl=a_t&wt=javabin} hits=3 status=0 QTime=9 2> 196023 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-436] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11&hl=true&shards.purpose=192&collection=collection1&rid=null-436&version=2&q=fox+duplicate+horses&omitHeader=false&NOW=1705871128043&ids=5&isShard=true&hl.fl=a_t&wt=javabin} status=0 QTime=42 2> 196072 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-436] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&hl=true&shards.purpose=192&collection=collection1&rid=null-436&version=2&q=fox+duplicate+horses&omitHeader=false&NOW=1705871128043&ids=100,4,8&isShard=true&hl.fl=a_t&wt=javabin} status=0 QTime=37 2> 196079 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-436] o.a.s.c.S.Request webapp= path=/select params={q=fox+duplicate+horses&_stateVer_=collection1:11&hl=true&hl.fl=a_t&collection=collection1&wt=javabin&version=2} rid=null-436 hits=4 status=0 QTime=134 2> 196095 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-437] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&rows=100&wt=javabin&version=2} hits=67 status=0 QTime=4 2> 196157 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-438] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rows=100&rid=null-438&version=2&q=*:*&omitHeader=false&NOW=1705871128236&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 196161 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-438] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rows=100&rid=null-438&version=2&q=*:*&omitHeader=false&NOW=1705871128236&isShard=true&wt=javabin} hits=32 status=0 QTime=3 2> 196183 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-438] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-438&version=2&q=*:*&omitHeader=false&NOW=1705871128236&ids=112,115,116,117,119,17,120,121,2,3,126,5,127,6,128,7,9,130,131,134,136,137,139,140,141,142,143,144,145,147,103,104,149,105,106&isShard=true&wt=javabin} status=0 QTime=6 2> 196185 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-438] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&shards.purpose=64&collection=collection1&rows=100&rid=null-438&version=2&q=*:*&omitHeader=false&NOW=1705871128236&ids=110,111,113,114,118,10,11,12,13,14,15,16,122,1,123,124,125,4,129,8,132,133,135,138,100,101,146,102,148,107,108,109&isShard=true&wt=javabin} status=0 QTime=5 2> 196200 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-438] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&collection=collection1&rows=100&wt=javabin&version=2} rid=null-438 hits=67 status=0 QTime=61 2> 196454 INFO (qtp2065109343-469) [n:127.0.0.1:45297_ c:control_collection s:shard1 
r:core_node2 x:control_collection_shard1_replica_n1 t:null-439] o.a.s.c.S.Request webapp= path=/select params={q=now+their+fox+sat+had+put&distrib=false&fl=*,score&wt=javabin&version=2&debugQuery=true} hits=7 status=0 QTime=233 2> 196487 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-440] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=false&debug=timing&debug=track&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-440&rows=10&version=2&q=now+their+fox+sat+had+put&omitHeader=false&requestPurpose=GET_TOP_IDS,SET_TERM_STATS&NOW=1705871128567&isShard=true&wt=javabin&debugQuery=false} hits=5 status=0 QTime=6 2> 196498 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-440] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=false&debug=timing&debug=track&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-440&rows=10&version=2&q=now+their+fox+sat+had+put&omitHeader=false&requestPurpose=GET_TOP_IDS,SET_TERM_STATS&NOW=1705871128567&isShard=true&wt=javabin&debugQuery=false} hits=2 status=0 QTime=14 2> 196634 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-440] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=timing&debug=track&fl=*,score&shards.purpose=16704&rid=null-440&version=2&q=now+their+fox+sat+had+put&omitHeader=false&requestPurpose=GET_FIELDS,GET_DEBUG,SET_TERM_STATS&NOW=1705871128567&ids=1,4&isShard=true&wt=javabin&debugQuery=true} status=0 QTime=18 2> 196652 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-440] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=timing&debug=track&fl=*,score&shards.purpose=16704&rid=null-440&version=2&q=now+their+fox+sat+had+put&omitHeader=false&requestPurpose=GET_FIELDS,GET_DEBUG,SET_TERM_STATS&NOW=1705871128567&ids=3,5,6,7,9&isShard=true&wt=javabin&debugQuery=true} status=0 QTime=41 2> 196661 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-440] o.a.s.c.S.Request webapp= path=/select params={q=now+their+fox+sat+had+put&_stateVer_=collection1:11&fl=*,score&wt=javabin&version=2&debugQuery=true} rid=null-440 hits=7 status=0 QTime=192 2> 196681 INFO (qtp2065109343-470) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-441] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&distrib=false&wt=javabin&version=2&debugQuery=true} hits=5 status=0 QTime=9 2> 196701 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-442] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=false&debug=timing&debug=track&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-442&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&requestPurpose=GET_TOP_IDS,SET_TERM_STATS&NOW=1705871128787&isShard=true&wt=javabin&debugQuery=false} hits=3 status=0 QTime=4 2> 196702 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-442] o.a.s.c.S.Request 
webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=false&debug=timing&debug=track&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1&rid=null-442&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&requestPurpose=GET_TOP_IDS,SET_TERM_STATS&NOW=1705871128787&isShard=true&wt=javabin&debugQuery=false} hits=2 status=0 QTime=3 2> 196721 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-442] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=timing&debug=track&shards.purpose=16704&collection=collection1&rid=null-442&version=2&q=id_i1:[1+TO+5]&omitHeader=false&requestPurpose=GET_FIELDS,GET_DEBUG,SET_TERM_STATS&NOW=1705871128787&ids=1,4&isShard=true&wt=javabin&debugQuery=true} status=0 QTime=10 2> 196721 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-442] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&debug=timing&debug=track&shards.purpose=16704&collection=collection1&rid=null-442&version=2&q=id_i1:[1+TO+5]&omitHeader=false&requestPurpose=GET_FIELDS,GET_DEBUG,SET_TERM_STATS&NOW=1705871128787&ids=2,3,5&isShard=true&wt=javabin&debugQuery=true} status=0 QTime=8 2> 197763 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-442] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&_stateVer_=collection1:11&collection=collection1&wt=javabin&version=2&debugQuery=true} rid=null-442 hits=5 status=0 QTime=1073 2> 197774 INFO (qtp2065109343-467) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-443] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&distrib=false&debug=timing&wt=javabin&version=2} hits=5 status=0 QTime=3 2> 197797 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-444] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=false&debug=timing&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-444&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129884&isShard=true&wt=javabin&debugQuery=false} hits=3 status=0 QTime=2 2> 197801 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-444] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=false&debug=timing&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-444&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129884&isShard=true&wt=javabin&debugQuery=false} hits=2 status=0 QTime=4 2> 197812 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-444] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=timing&debug=timing&_stateVer_=collection1:11&shards.purpose=16704&rid=null-444&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129884&ids=1,4&isShard=true&wt=javabin} status=0 QTime=2 2> 197815 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-444] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&debug=timing&debug=timing&_stateVer_=collection1:11&shards.purpose=16704&rid=null-444&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129884&ids=2,3,5&isShard=true&wt=javabin} status=0 QTime=3 2> 197830 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-444] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&debug=timing&_stateVer_=collection1:11&wt=javabin&version=2} rid=null-444 hits=5 status=0 QTime=44 2> 197848 INFO (qtp2065109343-471) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-445] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&distrib=false&debug=results&wt=javabin&version=2} hits=5 status=0 QTime=4 2> 197871 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-446] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-446&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129959&isShard=true&wt=javabin&debugQuery=false} hits=3 status=0 QTime=3 2> 197873 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-446] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-446&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129959&isShard=true&wt=javabin&debugQuery=false} hits=2 status=0 QTime=2 2> 197884 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-446] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=results&debug=results&_stateVer_=collection1:11&shards.purpose=16704&rid=null-446&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129959&ids=2,3,5&isShard=true&wt=javabin} status=0 QTime=3 2> 197885 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-446] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=results&debug=results&_stateVer_=collection1:11&shards.purpose=16704&rid=null-446&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871129959&ids=1,4&isShard=true&wt=javabin} status=0 QTime=3 2> 197889 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-446] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&debug=results&_stateVer_=collection1:11&wt=javabin&version=2} rid=null-446 hits=5 status=0 QTime=28 2> 197898 INFO (qtp2065109343-466) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-447] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&distrib=false&debug=query&wt=javabin&version=2} hits=5 status=0 QTime=2 2> 197916 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-448] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-448&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871130005&isShard=true&wt=javabin&debugQuery=false} hits=3 status=0 QTime=2 2> 197918 INFO 
(qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-448] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-448&rows=10&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871130005&isShard=true&wt=javabin&debugQuery=false} hits=2 status=0 QTime=2 2> 197928 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-448] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=query&debug=query&_stateVer_=collection1:11&shards.purpose=16704&rid=null-448&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871130005&ids=2,3,5&isShard=true&wt=javabin} status=0 QTime=2 2> 197930 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-448] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&debug=query&debug=query&_stateVer_=collection1:11&shards.purpose=16704&rid=null-448&version=2&q=id_i1:[1+TO+5]&omitHeader=false&NOW=1705871130005&ids=1,4&isShard=true&wt=javabin} status=0 QTime=2 2> 197934 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-448] o.a.s.c.S.Request webapp= path=/select params={q=id_i1:[1+TO+5]&debug=query&_stateVer_=collection1:11&wt=javabin&version=2} rid=null-448 hits=5 status=0 QTime=27 2> 197953 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-449] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-449&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130042&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 197955 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-449] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-449&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130042&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 197965 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-449] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871130042&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-449&wt=javabin&version=2} status=0 QTime=2 2> 197971 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-449] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&wt=javabin&version=2} rid=null-449 hits=67 status=0 QTime=27 2> 197987 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-450] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-450&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130077&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 197991 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-450] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-450&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130077&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 198013 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-450] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871130077&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-450&wt=javabin&version=2} status=0 QTime=3 2> 198025 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-450] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-450 hits=67 status=0 QTime=46 2> 198044 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-451] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-451&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130129&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 198047 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-451] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-451&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130129&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 198072 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-451] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871130129&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-451&wt=javabin&version=2} status=0 QTime=4 2> 198080 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-451] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-451 hits=67 status=0 QTime=49 2> 198097 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-452] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-452&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130184&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 198099 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-452] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-452&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130184&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 198113 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-452] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871130184&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-452&wt=javabin&version=2} status=0 QTime=4 2> 198121 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-452] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-452 hits=67 status=0 QTime=35 2> 198135 INFO 
(qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-453] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-453&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130225&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 198137 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-453] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-453&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871130225&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 198159 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-453] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871130225&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-453&wt=javabin&version=2} status=0 QTime=3 2> 198166 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-453] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-453 hits=67 status=0 QTime=39 2> 198223 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-454] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[300 (1788735526300286976), 301 (1788735526323355648)]} 0 18 2> 198226 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-454] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={_stateVer_=collection1:11&commitWithin=10&wt=javabin&version=2}{add=[300 (1788735526300286976), 301 (1788735526323355648)]} 0 51 2> 198341 INFO (searcherExecutor-365-thread-1-processing-127.0.0.1:37133_ collection1_shard1_replica_n6 collection1 shard1 core_node8) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 198341 INFO (searcherExecutor-365-thread-1-processing-127.0.0.1:37133_ collection1_shard1_replica_n6 collection1 shard1 core_node8) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 198342 INFO (searcherExecutor-351-thread-1-processing-127.0.0.1:43017_ collection1_shard1_replica_n1 collection1 shard1 core_node3) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 2 ms 2> 198343 INFO (searcherExecutor-351-thread-1-processing-127.0.0.1:43017_ collection1_shard1_replica_n1 collection1 shard1 core_node3) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 198369 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-455] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=1 2> 198392 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 
r:core_node7 x:collection1_shard2_replica_n5 t:null-456] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=2 2> 198414 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-457] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=34 status=0 QTime=2 2> 198435 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-458] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=34 status=0 QTime=1 2> 198457 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-459] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=1 2> 198476 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-460] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=1 2> 198496 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-461] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=34 status=0 QTime=1 2> 198517 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-462] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=34 status=0 QTime=2 2> 198578 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-463] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{delete=[300 (-1788735526691405824)]} 0 14 2> 198585 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-463] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{delete=[300 (-1788735526691405824)]} 0 56 2> 198653 INFO (searcherExecutor-351-thread-1-processing-127.0.0.1:43017_ collection1_shard1_replica_n1 collection1 shard1 core_node3) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 198654 INFO (searcherExecutor-351-thread-1-processing-127.0.0.1:43017_ collection1_shard1_replica_n1 collection1 shard1 core_node3) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 198659 INFO (searcherExecutor-365-thread-1-processing-127.0.0.1:37133_ collection1_shard1_replica_n6 collection1 shard1 core_node8) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 198659 INFO (searcherExecutor-365-thread-1-processing-127.0.0.1:37133_ collection1_shard1_replica_n6 collection1 shard1 core_node8) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 198674 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 
x:collection1_shard2_replica_n2 t:null-464] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=1 2> 198695 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-465] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=1 2> 198715 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-466] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=33 status=0 QTime=2 2> 198740 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-467] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=33 status=0 QTime=3 2> 198763 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-468] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=1 2> 198785 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-469] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=2 2> 198806 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-470] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=33 status=0 QTime=1 2> 198827 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-471] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=33 status=0 QTime=2 2> 198876 INFO (searcherExecutor-354-thread-1-processing-127.0.0.1:40707_ collection1_shard2_replica_n2 collection1 shard2 core_node4) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 198890 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-472] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&_version_=-1788735527000735744&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{deleteByQuery=id:301 (-1788735527000735744)} 0 24 2> 198898 INFO (searcherExecutor-367-thread-1-processing-127.0.0.1:32885_ collection1_shard2_replica_n5 collection1 shard2 core_node7) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 198904 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-472] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:301 (-1788735527000735744)} 0 60 2> 198925 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-472] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update 
params={update.distrib=FROMLEADER&_version_=-1788735526994444288&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:301 (-1788735526994444288)} 0 13 2> 198928 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-472] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{deleteByQuery=id:301 (-1788735526994444288)} 0 90 2> 198980 INFO (searcherExecutor-351-thread-1-processing-127.0.0.1:43017_ collection1_shard1_replica_n1 collection1 shard1 core_node3) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 198980 INFO (searcherExecutor-351-thread-1-processing-127.0.0.1:43017_ collection1_shard1_replica_n1 collection1 shard1 core_node3) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 198998 INFO (searcherExecutor-365-thread-1-processing-127.0.0.1:37133_ collection1_shard1_replica_n6 collection1 shard1 core_node8) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 198998 INFO (searcherExecutor-365-thread-1-processing-127.0.0.1:37133_ collection1_shard1_replica_n6 collection1 shard1 core_node8) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.c.AbstractBasicDistributedZkTestBase Hook detected newSearcher 2> 199016 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-473] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=2 2> 199048 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-474] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=2 2> 199081 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-475] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=32 status=0 QTime=2 2> 199125 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-476] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=32 status=0 QTime=1 2> 199148 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-477] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=1 2> 199169 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-478] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=35 status=0 QTime=2 2> 199189 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-479] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=32 status=0 QTime=2 2> 199208 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-480] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&distrib=false&wt=javabin&version=2} hits=32 status=0 QTime=2 2> 199214 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractBasicDistributedZkTestBase ### STARTING doOptimisticLockingAndUpdating 2> 199216 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 199237 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 199251 INFO (zkConnectionManagerCallback-385-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 199252 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 199252 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 1> / (2) 1> DATA: 1> 1> /zookeeper (2) 1> DATA: 1> 1> /zookeeper/config (0) 1> DATA: 1> 1> /solr (8) 1> /solr/configs (2) 1> /solr/configs/_default (6) 1> /solr/configs/_default/managed-schema.xml (0) 1> DATA: ...supressed... 1> /solr/configs/_default/protwords.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/solrconfig.xml (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang (39) 1> /solr/configs/_default/lang/contractions_it.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_no.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/contractions_ca.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stemdict_nl.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_hy.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_id.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_et.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_da.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_ga.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_hi.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_pt.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_ja.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_hu.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_el.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_ru.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_tr.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_ar.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/userdict_ja.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_it.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_eu.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_cz.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stoptags_ja.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/contractions_fr.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_de.txt (0) 1> DATA: ...supressed... 
1> /solr/configs/_default/lang/stopwords_fa.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/contractions_ga.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_ca.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_nl.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_bg.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_es.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_en.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_lv.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_th.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_ro.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_fr.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_sv.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_fi.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/hyphenations_ga.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/lang/stopwords_gl.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/synonyms.txt (0) 1> DATA: ...supressed... 1> /solr/configs/_default/stopwords.txt (0) 1> DATA: ...supressed... 1> /solr/configs/conf1 (11) 1> /solr/configs/conf1/currency.xml (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/enumsConfig.xml (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/protwords.txt (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/old_synonyms.txt (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/solrconfig.xml (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/synonyms.txt (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/stopwords.txt (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/schema.xml (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/solrconfig.snippet.randomindexconfig.xml (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/mapping-ISOLatin1Accent.txt (0) 1> DATA: ...supressed... 1> /solr/configs/conf1/open-exchange-rates.json (0) 1> DATA: 1> { 1> "disclaimer": "This data is not real, it was synthetically created to match currency.xml. It is modeled after the data format available from openexchangerates.org. 
See https://openexchangerates.org/documentation for details 1> 1> 1> IMPORTANT NOTE: In order for tests to work, this data must be kept in sync with ./currency.xml", 1> 1> 1> "license": "http://www.apache.org/licenses/LICENSE-2.0", 1> "timestamp": 1332070464, 1> "base": "USD", 1> "rates": { 1> "USD": 1, 1> "JPY": 81.29, 1> "EUR": 2.5, 1> "GBP": 0.5, 1> "MXN": 2.0 1> } 1> } 1> 1> /solr/overseer (7) 1> /solr/overseer/queue-work (0) 1> /solr/overseer/collection-map-failure (0) 1> /solr/overseer/async_ids (0) 1> /solr/overseer/collection-queue-work (0) 1> /solr/overseer/collection-map-completed (0) 1> /solr/overseer/collection-map-running (0) 1> /solr/overseer/queue (0) 1> /solr/aliases.json (0) 1> /solr/collections (2) 1> /solr/collections/collection1 (6) 1> DATA: 1> { 1> "configName":"conf1"} 1> /solr/collections/collection1/shards (0) 1> /solr/collections/collection1/leaders (2) 1> /solr/collections/collection1/leaders/shard2 (1) 1> /solr/collections/collection1/leaders/shard2/leader (0) 1> DATA: 1> { 1> "node_name":"127.0.0.1:40707_", 1> "base_url":"http://127.0.0.1:40707", 1> "core":"collection1_shard2_replica_n2", 1> "core_node_name":"core_node4"} 1> /solr/collections/collection1/leaders/shard1 (1) 1> /solr/collections/collection1/leaders/shard1/leader (0) 1> DATA: 1> { 1> "node_name":"127.0.0.1:43017_", 1> "base_url":"http://127.0.0.1:43017", 1> "core":"collection1_shard1_replica_n1", 1> "core_node_name":"core_node3"} 1> /solr/collections/collection1/counter (0) 1> DATA: 1>  1> /solr/collections/collection1/state.json (0) 1> DATA: 1> {"collection1":{ 1> "pullReplicas":"0", 1> "configName":"conf1", 1> "replicationFactor":1, 1> "router":{"name":"compositeId"}, 1> "nrtReplicas":1, 1> "tlogReplicas":"0", 1> "shards":{ 1> "shard1":{ 1> "range":"80000000-ffffffff", 1> "state":"active", 1> "replicas":{ 1> "core_node3":{ 1> "core":"collection1_shard1_replica_n1", 1> "node_name":"127.0.0.1:43017_", 1> "type":"NRT", 1> "state":"active", 1> "leader":"true", 1> "force_set_state":"false", 1> "base_url":"http://127.0.0.1:43017"}, 1> "core_node8":{ 1> "core":"collection1_shard1_replica_n6", 1> "node_name":"127.0.0.1:37133_", 1> "type":"NRT", 1> "state":"active", 1> "force_set_state":"false", 1> "base_url":"http://127.0.0.1:37133"}}}, 1> "shard2":{ 1> "range":"0-7fffffff", 1> "state":"active", 1> "replicas":{ 1> "core_node4":{ 1> "core":"collection1_shard2_replica_n2", 1> "node_name":"127.0.0.1:40707_", 1> "type":"NRT", 1> "state":"active", 1> "leader":"true", 1> "force_set_state":"false", 1> "base_url":"http://127.0.0.1:40707"}, 1> "core_node7":{ 1> "core":"collection1_shard2_replica_n5", 1> "node_name":"127.0.0.1:32885_", 1> "type":"NRT", 1> "state":"active", 1> "force_set_state":"false", 1> "base_url":"http://127.0.0.1:32885"}}}}}} 1> /solr/collections/collection1/terms (2) 1> /solr/collections/collection1/terms/shard2 (0) 1> DATA: 1> { 1> "core_node7":1, 1> "core_node4":1} 1> /solr/collections/collection1/terms/shard1 (0) 1> DATA: 1> { 1> "core_node3":1, 1> "core_node8":1} 1> /solr/collections/collection1/leader_elect (2) 1> /solr/collections/collection1/leader_elect/shard2 (1) 1> /solr/collections/collection1/leader_elect/shard2/election (2) 1> /solr/collections/collection1/leader_elect/shard2/election/72077803529764874-core_node4-n_0000000000 (0) 1> /solr/collections/collection1/leader_elect/shard2/election/72077803529764880-core_node7-n_0000000001 (0) 1> /solr/collections/collection1/leader_elect/shard1 (1) 1> /solr/collections/collection1/leader_elect/shard1/election (2) 1> 
/solr/collections/collection1/leader_elect/shard1/election/72077803529764882-core_node3-n_0000000000 (0) 1> /solr/collections/collection1/leader_elect/shard1/election/72077803529764876-core_node8-n_0000000001 (0) 1> /solr/collections/control_collection (6) 1> DATA: 1> { 1> "configName":"conf1"} 1> /solr/collections/control_collection/shards (0) 1> /solr/collections/control_collection/leaders (1) 1> /solr/collections/control_collection/leaders/shard1 (1) 1> /solr/collections/control_collection/leaders/shard1/leader (0) 1> DATA: 1> { 1> "node_name":"127.0.0.1:45297_", 1> "base_url":"http://127.0.0.1:45297", 1> "core":"control_collection_shard1_replica_n1", 1> "core_node_name":"core_node2"} 1> /solr/collections/control_collection/counter (0) 1> DATA: 1>  1> /solr/collections/control_collection/state.json (0) 1> DATA: 1> {"control_collection":{ 1> "pullReplicas":"0", 1> "configName":"conf1", 1> "replicationFactor":1, 1> "router":{"name":"compositeId"}, 1> "nrtReplicas":1, 1> "tlogReplicas":"0", 1> "shards":{"shard1":{ 1> "range":"80000000-7fffffff", 1> "state":"active", 1> "replicas":{"core_node2":{ 1> "core":"control_collection_shard1_replica_n1", 1> "node_name":"127.0.0.1:45297_", 1> "type":"NRT", 1> "state":"active", 1> "leader":"true", 1> "force_set_state":"false", 1> "base_url":"http://127.0.0.1:45297"}}}}}} 1> /solr/collections/control_collection/terms (1) 1> /solr/collections/control_collection/terms/shard1 (0) 1> DATA: 1> { 1> "core_node2":1} 1> /solr/collections/control_collection/leader_elect (1) 1> /solr/collections/control_collection/leader_elect/shard1 (1) 1> /solr/collections/control_collection/leader_elect/shard1/election (1) 1> /solr/collections/control_collection/leader_elect/shard1/election/72077803529764868-core_node2-n_0000000000 (0) 1> /solr/live_nodes (5) 1> /solr/live_nodes/127.0.0.1:43017_ (0) 1> /solr/live_nodes/127.0.0.1:32885_ (0) 1> /solr/live_nodes/127.0.0.1:45297_ (0) 1> /solr/live_nodes/127.0.0.1:40707_ (0) 1> /solr/live_nodes/127.0.0.1:37133_ (0) 1> /solr/overseer_elect (2) 1> /solr/overseer_elect/leader (0) 1> DATA: 1> { 1> "id":"72077803529764868-127.0.0.1:45297_-n_0000000000"} 1> /solr/overseer_elect/election (5) 1> /solr/overseer_elect/election/72077803529764880-127.0.0.1:32885_-n_0000000003 (0) 1> /solr/overseer_elect/election/72077803529764874-127.0.0.1:40707_-n_0000000001 (0) 1> /solr/overseer_elect/election/72077803529764876-127.0.0.1:37133_-n_0000000002 (0) 1> /solr/overseer_elect/election/72077803529764882-127.0.0.1:43017_-n_0000000004 (0) 1> /solr/overseer_elect/election/72077803529764868-127.0.0.1:45297_-n_0000000000 (0) 1> /solr/security.json (0) 1> DATA: 1> {} 1> /solr/node_roles (3) 1> /solr/node_roles/coordinator (2) 1> /solr/node_roles/coordinator/off (5) 1> /solr/node_roles/coordinator/off/127.0.0.1:43017_ (0) 1> /solr/node_roles/coordinator/off/127.0.0.1:32885_ (0) 1> /solr/node_roles/coordinator/off/127.0.0.1:45297_ (0) 1> /solr/node_roles/coordinator/off/127.0.0.1:40707_ (0) 1> /solr/node_roles/coordinator/off/127.0.0.1:37133_ (0) 1> /solr/node_roles/coordinator/on (0) 1> /solr/node_roles/data (2) 1> /solr/node_roles/data/off (0) 1> /solr/node_roles/data/on (5) 1> /solr/node_roles/data/on/127.0.0.1:43017_ (0) 1> /solr/node_roles/data/on/127.0.0.1:32885_ (0) 1> /solr/node_roles/data/on/127.0.0.1:45297_ (0) 1> /solr/node_roles/data/on/127.0.0.1:40707_ (0) 1> /solr/node_roles/data/on/127.0.0.1:37133_ (0) 1> /solr/node_roles/overseer (3) 1> /solr/node_roles/overseer/disallowed (0) 1> /solr/node_roles/overseer/preferred (0) 1> 
/solr/node_roles/overseer/allowed (5) 1> /solr/node_roles/overseer/allowed/127.0.0.1:43017_ (0) 1> /solr/node_roles/overseer/allowed/127.0.0.1:32885_ (0) 1> /solr/node_roles/overseer/allowed/127.0.0.1:45297_ (0) 1> /solr/node_roles/overseer/allowed/127.0.0.1:40707_ (0) 1> /solr/node_roles/overseer/allowed/127.0.0.1:37133_ (0) 1> 2> 200142 INFO (qtp2065109343-468) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-481] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2&CONTROL=TRUE}{add=[1000 (1788735528350253056)]} 0 13 2> 200162 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-482] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1000 (1788735528367030272)]} 0 4 2> 200165 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-482] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1000 (1788735528367030272)]} 0 17 2> 200167 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-11 for ERROR logs matching regex: version conflict 2> 200261 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-483] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{} 0 77 2> 200290 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-483] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 200297 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-483] o.a.s.c.S.Request webapp= path=/update params={wt=javabin&version=2} status=409 QTime=117 2> 200323 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-484] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:37133/collection1_shard1_replica_n6/&wt=javabin&version=2}{} 0 7 2> 200332 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-484] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 200333 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-484] o.a.s.c.S.Request webapp= path=/update params={wt=javabin&version=2} status=409 QTime=25 2> 200359 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-485] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:32885/collection1_shard2_replica_n5/&wt=javabin&version=2}{} 0 6 2> 200366 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-485] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 200367 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-485] 
o.a.s.c.S.Request webapp= path=/update params={wt=javabin&version=2} status=409 QTime=24 2> 200375 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-486] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{} 0 2 2> 200387 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-11 after mutting 3 log messages 2> 200416 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-487] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1000 (1788735528628125696)]} 0 9 2> 200426 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-487] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[1000 (1788735528628125696)]} 0 29 2> 200436 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-487] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1000]} 0 46 2> 200475 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-488] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1000 (1788735528699428864)]} 0 3 2> 200477 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-488] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/collection1_shard2_replica_n2/&wt=javabin&version=2}{add=[1000 (1788735528699428864)]} 0 29 2> 200479 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-488] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1000]} 0 37 2> 200503 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-489] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1000 (1788735528729837568)]} 0 2 2> 200505 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-489] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:37133/collection1_shard1_replica_n6/&wt=javabin&version=2}{add=[1000 (1788735528729837568)]} 0 14 2> 200508 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-489] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1000]} 0 23 2> 200532 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-490] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update 
params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1000 (1788735528758149120)]} 0 2 2> 200534 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-490] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:32885/collection1_shard2_replica_n5/&wt=javabin&version=2}{add=[1000 (1788735528758149120)]} 0 14 2> 200537 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-490] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1000]} 0 24 2> 200555 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-491] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:43017/collection1_shard1_replica_n1/&wt=javabin&version=2}{add=[1000 (1788735528782266368)]} 0 3 2> 200558 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-491] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1000 (1788735528782266368)]} 0 15 2> 200591 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-492] o.a.s.c.S.Request webapp= path=/get params={distrib=false&qt=/get&omitHeader=false&shards.purpose=1&NOW=1705871132667&ids=1000&isShard=true&rid=null-492&wt=javabin&version=2&shards.qt=/get} status=0 QTime=6 2> 200597 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-492] o.a.s.c.S.Request webapp= path=/get params={qt=/get&id=1000&wt=javabin&version=2} rid=null-492 status=0 QTime=28 2> 200615 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-493] o.a.s.c.S.Request webapp= path=/get params={distrib=false&qt=/get&omitHeader=false&shards.purpose=1&NOW=1705871132703&ids=1000&isShard=true&rid=null-493&wt=javabin&version=2&shards.qt=/get} status=0 QTime=2 2> 200619 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-493] o.a.s.c.S.Request webapp= path=/get params={qt=/get&id=1000&wt=javabin&version=2} rid=null-493 status=0 QTime=13 2> 200634 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-494] o.a.s.c.S.Request webapp= path=/get params={distrib=false&qt=/get&omitHeader=false&shards.purpose=1&NOW=1705871132722&ids=1000&isShard=true&rid=null-494&wt=javabin&version=2&shards.qt=/get} status=0 QTime=2 2> 200637 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-494] o.a.s.c.S.Request webapp= path=/get params={qt=/get&id=1000&wt=javabin&version=2} rid=null-494 status=0 QTime=13 2> 200651 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-495] o.a.s.c.S.Request webapp= path=/get params={distrib=false&qt=/get&omitHeader=false&shards.purpose=1&NOW=1705871132740&ids=1000&isShard=true&rid=null-495&wt=javabin&version=2&shards.qt=/get} status=0 QTime=2 2> 200654 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 
r:core_node3 x:collection1_shard1_replica_n1 t:null-495] o.a.s.c.S.Request webapp= path=/get params={qt=/get&id=1000&wt=javabin&version=2} rid=null-495 status=0 QTime=12 2> 200668 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-496] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-496&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871132757&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 200678 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-496] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871132757&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-496&wt=javabin&version=2} status=0 QTime=3 2> 200684 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-496] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2&wt=javabin&version=2} rid=null-496 hits=35 status=0 QTime=25 2> 200701 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-497] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-497&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871132791&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 200711 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-497] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871132791&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-497&wt=javabin&version=2} status=0 QTime=3 2> 200718 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-497] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2&wt=javabin&version=2} rid=null-497 hits=35 status=0 QTime=24 2> 200735 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-498] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-498&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871132825&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 200746 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-498] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871132825&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-498&wt=javabin&version=2} status=0 QTime=3 2> 200752 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-498] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-498 hits=35 status=0 QTime=25 2> 200774 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-499] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-499&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871132856&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 200793 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-499] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871132856&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-499&wt=javabin&version=2} status=0 QTime=3 2> 200800 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-499] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2|http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-499 hits=35 status=0 QTime=42 2> 200812 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-500] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2&wt=javabin&version=2} rid=null-500 hits=35 status=0 QTime=4 2> 200827 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-501] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-501&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871132918&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 200838 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-501] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871132918&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-501&wt=javabin&version=2} status=0 QTime=3 2> 200844 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-501] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2&wt=javabin&version=2} rid=null-501 hits=35 status=0 QTime=24 2> 200863 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-502] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-502&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871132948&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 200881 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-502] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871132948&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-502&wt=javabin&version=2} status=0 QTime=4 2> 200892 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-502] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-502 hits=35 status=0 QTime=42 2> 200913 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-503] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-503&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871132996&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 200926 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-503] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871132996&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-503&wt=javabin&version=2} status=0 QTime=4 2> 200934 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-503] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2|http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-503 hits=35 status=0 QTime=36 2> 200951 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-504] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-504&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133038&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 200966 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-504] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133038&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-504&wt=javabin&version=2} status=0 QTime=3 2> 200974 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-504] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2&wt=javabin&version=2} rid=null-504 hits=35 status=0 QTime=33 2> 200991 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-505] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-505&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133078&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201001 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-505] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133078&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-505&wt=javabin&version=2} status=0 QTime=3 2> 201007 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-505] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2&wt=javabin&version=2} rid=null-505 hits=35 status=0 QTime=27 2> 201022 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-506] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-506&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133111&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 201034 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-506] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133111&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-506&wt=javabin&version=2} status=0 QTime=3 2> 201040 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-506] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-506 hits=35 status=0 QTime=27 2> 201054 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-507] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-507&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133145&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 201066 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-507] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133145&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-507&wt=javabin&version=2} status=0 QTime=4 2> 201074 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-507] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2|http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-507 hits=35 status=0 QTime=28 2> 201089 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-508] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2&wt=javabin&version=2} rid=null-508 hits=35 status=0 QTime=8 2> 201109 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-509] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-509&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133197&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201120 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-509] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133197&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-509&wt=javabin&version=2} status=0 QTime=3 2> 201126 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-509] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2&wt=javabin&version=2} rid=null-509 hits=35 status=0 QTime=28 2> 201146 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-510] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-510&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133231&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 201158 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-510] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133231&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-510&wt=javabin&version=2} status=0 QTime=4 2> 201165 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-510] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-510 hits=35 status=0 QTime=33 2> 201182 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-511] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-511&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133269&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201191 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-511] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133269&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-511&wt=javabin&version=2} status=0 QTime=3 2> 201198 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-511] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:40707/collection1_shard2_replica_n2|http://127.0.0.1:32885/collection1_shard2_replica_n5&wt=javabin&version=2} rid=null-511 hits=35 status=0 QTime=27 2> 201207 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-512] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard1&wt=javabin&version=2} rid=null-512 hits=32 status=0 QTime=2 2> 201227 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-513] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-513&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133315&isShard=true&wt=javabin} hits=32 status=0 QTime=5 2> 201237 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-513] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133315&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-513&wt=javabin&version=2} status=0 QTime=3 2> 201243 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-513] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6&wt=javabin&version=2} rid=null-513 hits=32 status=0 QTime=27 2> 201257 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-514] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-514&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133347&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 201269 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-514] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133347&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-514&wt=javabin&version=2} status=0 QTime=3 2> 201276 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-514] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-514 hits=32 status=0 QTime=27 2> 201289 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-515] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-515&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133380&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201299 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-515] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133380&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-515&wt=javabin&version=2} status=0 QTime=3 2> 201306 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-515] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6|http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-515 hits=32 status=0 QTime=23 2> 201318 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-516] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-516&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133409&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201327 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-516] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133409&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-516&wt=javabin&version=2} status=0 QTime=2 2> 201334 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-516] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard1&wt=javabin&version=2} rid=null-516 hits=32 status=0 QTime=23 2> 201347 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-517] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-517&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133438&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201356 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-517] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133438&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-517&wt=javabin&version=2} status=0 QTime=3 2> 201363 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-517] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6&wt=javabin&version=2} rid=null-517 hits=32 status=0 QTime=23 2> 201374 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-518] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-518&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133466&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201382 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-518] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133466&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-518&wt=javabin&version=2} status=0 QTime=2 2> 201388 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-518] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-518 hits=32 status=0 QTime=20 2> 201401 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-519] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-519&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133493&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201410 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-519] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133493&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-519&wt=javabin&version=2} status=0 QTime=2 2> 201430 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-519] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6|http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-519 hits=32 status=0 QTime=35 2> 201437 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-520] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard1&wt=javabin&version=2} rid=null-520 hits=32 status=0 QTime=1 2> 201451 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-521] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-521&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133543&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201460 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-521] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133543&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-521&wt=javabin&version=2} status=0 QTime=2 2> 201466 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-521] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6&wt=javabin&version=2} 
rid=null-521 hits=32 status=0 QTime=21 2> 201480 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-522] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-522&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133569&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201489 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-522] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133569&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-522&wt=javabin&version=2} status=0 QTime=2 2> 201495 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-522] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-522 hits=32 status=0 QTime=24 2> 201507 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-523] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-523&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133599&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201525 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-523] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133599&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-523&wt=javabin&version=2} status=0 QTime=2 2> 201531 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-523] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6|http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-523 hits=32 status=0 QTime=30 2> 201542 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-524] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-524&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133634&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201551 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-524] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133634&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-524&wt=javabin&version=2} status=0 QTime=2 2> 201556 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-524] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard1&wt=javabin&version=2} rid=null-524 hits=32 status=0 QTime=20 2> 201567 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-525] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-525&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133659&isShard=true&wt=javabin} hits=32 status=0 
QTime=1 2> 201576 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-525] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133659&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-525&wt=javabin&version=2} status=0 QTime=2 2> 201582 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-525] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6&wt=javabin&version=2} rid=null-525 hits=32 status=0 QTime=20 2> 201595 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-526] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-526&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133685&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201612 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-526] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133685&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-526&wt=javabin&version=2} status=0 QTime=2 2> 201617 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-526] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-526 hits=32 status=0 QTime=30 2> 201628 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-527] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-527&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133720&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201638 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-527] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133720&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-527&wt=javabin&version=2} status=0 QTime=3 2> 201662 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-527] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=http://127.0.0.1:37133/collection1_shard1_replica_n6|http://127.0.0.1:43017/collection1_shard1_replica_n1&wt=javabin&version=2} rid=null-527 hits=32 status=0 QTime=40 2> 201676 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-528] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-528&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133768&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201678 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-528] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-528&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133768&isShard=true&wt=javabin} 
hits=32 status=0 QTime=1 2> 201695 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-528] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133768&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-528&wt=javabin&version=2} status=0 QTime=2 2> 201700 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-528] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2,shard1&wt=javabin&version=2} rid=null-528 hits=67 status=0 QTime=31 2> 201712 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-529] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-529&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133805&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201714 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-529] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-529&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133805&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201722 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-529] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133805&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-529&wt=javabin&version=2} status=0 QTime=2 2> 201727 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-529] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2,shard1&wt=javabin&version=2} rid=null-529 hits=67 status=0 QTime=22 2> 201738 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-530] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-530&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133830&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201741 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-530] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-530&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133830&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 201749 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-530] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133830&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-530&wt=javabin&version=2} status=0 QTime=2 2> 201755 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-530] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2,shard1&wt=javabin&version=2} rid=null-530 hits=67 status=0 QTime=22 2> 201767 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-531] 
o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-531&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133858&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 201768 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-531] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-531&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133858&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201777 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-531] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133858&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-531&wt=javabin&version=2} status=0 QTime=3 2> 201783 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-531] o.a.s.c.S.Request webapp= path=/select params={q=*:*&shards=shard2,shard1&wt=javabin&version=2} rid=null-531 hits=67 status=0 QTime=23 2> 201796 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-532] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-532&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133888&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201798 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-532] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-532&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133888&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201807 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-532] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133888&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-532&wt=javabin&version=2} status=0 QTime=3 2> 201814 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-532] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-532 hits=67 status=0 QTime=25 2> 201827 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-533] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-533&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133918&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201830 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-533] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-533&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133918&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201838 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-533] o.a.s.c.S.Request webapp= path=/select 
params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133918&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&rid=null-533&wt=javabin&version=2} status=0 QTime=2 2> 201844 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-533] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-533 hits=67 status=0 QTime=24 2> 201856 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-534] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-534&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133947&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201858 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-534] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-534&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133947&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201868 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-534] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133947&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-534&wt=javabin&version=2} status=0 QTime=3 2> 201874 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-534] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-534 hits=67 status=0 QTime=25 2> 201886 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-535] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-535&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133978&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 201888 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-535] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-535&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871133978&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 201896 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-535] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871133978&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-535&wt=javabin&version=2} status=0 QTime=2 2> 201902 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-535] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-535 hits=67 status=0 QTime=22 2> 201905 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractBasicDistributedZkTestBase ### STARTING testMultipleCollections 2> 201939 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection collection2 2> 202066 WARN (OverseerThreadFactory-250-thread-4) [n: c:collection2 s: r: 
x: t:] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection2) without cores. 2> 202076 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection2 s: r: x: t:null-536] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 202082 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection2 s: r: x: t:null-536] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection2&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&wt=javabin&version=2} status=0 QTime=171 2> 202126 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000012 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 202140 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:40707_ for creating new replica of shard shard2 for collection collection2 2> 202156 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 202171 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection2_shard2_replica_n1", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"collection2", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 202291 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 366] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 202318 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c: s: r: x:collection2_shard2_replica_n1 t:null-538] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&name=collection2_shard2_replica_n1&action=CREATE&collection=collection2&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 202374 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 202400 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.s.IndexSchema Schema name=test 2> 202614 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 202804 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.CoreContainer Creating SolrCore 'collection2_shard2_replica_n1' using configuration from configset conf1, trusted=true 2> 202810 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/collection2_shard2_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/collection2_shard2_replica_n1/data/] 2> 202833 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=19, maxMergedSegmentMB=10.273061752319336, floorSegmentMB=0.749781608581543, forceMergeDeletesPctAllowed=12.565230667147118, segmentsPerTier=12.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.13744209775263264, deletesPctAllowed=45.32596601683406 2> 202857 WARN (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 203150 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 203150 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 203174 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 203174 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 203185 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 
x:collection2_shard2_replica_n1 t:null-538] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=35, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0] 2> 203203 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 203206 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 203212 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 203213 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735531579867136 2> 203255 INFO (searcherExecutor-387-thread-1-processing-collection2_shard2_replica_n1 null-538 core_node2 127.0.0.1:40707_ collection2 shard2) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 203262 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection2/terms/shard2 to Terms{values={core_node2=0}, version=0} for registerTerm 2> 203263 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection2/leaders/shard2 2> 203297 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
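
The test output above traces the Collections API work behind testMultipleCollections: a CREATE for collection2 with numShards=2, configset conf1 and an empty createNodeSet (hence the WARN that the collection is created without cores), followed by an ADDREPLICA that the Overseer routes to 127.0.0.1:40707_, which in turn triggers the core create and leader election for collection2_shard2_replica_n1 seen here. A minimal sketch of the equivalent client-side calls follows; the base URL, the /solr context path and the node name are assumptions about a standard install, since the test itself runs on randomized ports with an empty context path.

import json
from urllib.parse import urlencode
from urllib.request import urlopen

SOLR = "http://127.0.0.1:8983/solr"   # assumption: standard install, not the test's randomized ports

def collections_api(**params):
    # Call the Collections API (/admin/collections) and return the parsed JSON response.
    url = SOLR + "/admin/collections?" + urlencode({**params, "wt": "json"})
    with urlopen(url) as resp:
        return json.load(resp)

# CREATE with an empty createNodeSet writes only collection metadata, no cores yet --
# which is why CreateCollectionCmd warns "It is unusual to create a collection
# (collection2) without cores." above.
collections_api(action="CREATE", name="collection2", numShards=2,
                **{"collection.configName": "conf1", "nrtReplicas": 1, "createNodeSet": ""})

# ADDREPLICA then places a concrete NRT core on a chosen node; this is the step that
# produces the core-create and leader-election entries in the log.
collections_api(action="ADDREPLICA", collection="collection2", shard="shard2",
                type="NRT", node="127.0.0.1:8983_solr")   # node name format is an assumption
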
2> 203297 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 203297 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40707/collection2_shard2_replica_n1/ 2> 203300 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 203303 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.SyncStrategy http://127.0.0.1:40707/collection2_shard2_replica_n1/ has no replicas 2> 203303 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection2/leaders/shard2/leader after winning as /collections/collection2/leader_elect/shard2/election/72077803529764874-core_node2-n_0000000000 2> 203327 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40707/collection2_shard2_replica_n1/ shard2 2> 203441 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 384] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 203454 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-538] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 203485 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c: s: r: x: t:null-538] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&name=collection2_shard2_replica_n1&action=CREATE&collection=collection2&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1169 2> 203506 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s: r: x: t:null-537] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:40707_&action=ADDREPLICA&collection=collection2&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1403 2> 203553 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000014 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 203567 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:37133_ for creating new replica of shard shard1 for collection collection2 2> 203579 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 394] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 203579 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 394] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 203585 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 203603 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection2_shard1_replica_n3", 2> "node_name":"127.0.0.1:37133_", 2> "base_url":"http://127.0.0.1:37133", 2> "collection":"collection2", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 203726 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 399] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 203726 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 399] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 203727 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 399] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 203760 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x:collection2_shard1_replica_n3 t:null-540] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection2_shard1_replica_n3&action=CREATE&collection=collection2&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 203826 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 203840 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.s.IndexSchema Schema name=test 2> 203902 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 402] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 203902 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 402] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 204062 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 204226 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.CoreContainer Creating SolrCore 'collection2_shard1_replica_n3' using configuration from configset conf1, trusted=true 2> 204229 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores/collection2_shard1_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores/collection2_shard1_replica_n3/data/] 2> 204253 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 204272 WARN (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 204612 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 204613 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 204633 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 204633 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 204648 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 204666 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 204670 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 
s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 204676 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 204677 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735533114982400 2> 204724 INFO (searcherExecutor-393-thread-1-processing-collection2_shard1_replica_n3 null-540 core_node4 127.0.0.1:37133_ collection2 shard1) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 204727 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection2/terms/shard1 to Terms{values={core_node4=0}, version=0} for registerTerm 2> 204730 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection2/leaders/shard1 2> 204764 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 204765 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 204765 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:37133/collection2_shard1_replica_n3/ 2> 204768 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 204770 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.SyncStrategy http://127.0.0.1:37133/collection2_shard1_replica_n3/ has no replicas 2> 204771 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection2/leaders/shard1/leader after winning as /collections/collection2/leader_elect/shard1/election/72077803529764876-core_node4-n_0000000000 2> 204793 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:37133/collection2_shard1_replica_n3/ shard1 2> 204917 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 417] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 204921 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 417] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 204922 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 417] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 204938 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-540] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 204980 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-540] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection2_shard1_replica_n3&action=CREATE&collection=collection2&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1222 2> 205000 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection2 s: r: x: t:null-539] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:37133_&action=ADDREPLICA&collection=collection2&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1478 2> 205048 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000016 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 205062 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:32885_ for creating new replica of shard shard2 for collection collection2 2> 205062 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 427] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205062 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 427] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205062 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 427] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205065 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 427] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205081 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 
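
With leaders now registered for both shards of collection2 and the Overseer placing an additional shard2 replica on 127.0.0.1:32885_, the collection is close to fully active. CollectionsHandler only logs "Wait for new collection to be active for at most 45 seconds. Check all shard replicas" on the server side; a client can watch for the same condition by polling CLUSTERSTATUS until every replica reports state=active. A rough sketch, assuming the same base URL as above and the usual CLUSTERSTATUS response layout (cluster -> collections -> shards -> replicas):

import json, time
from urllib.parse import urlencode
from urllib.request import urlopen

SOLR = "http://127.0.0.1:8983/solr"   # assumption, as above

def all_replicas_active(collection, timeout=45):
    # Poll CLUSTERSTATUS until every replica of every shard reports state=active,
    # or the timeout (45 s, matching the handler's own limit) expires.
    deadline = time.time() + timeout
    while time.time() < deadline:
        qs = urlencode({"action": "CLUSTERSTATUS", "collection": collection, "wt": "json"})
        with urlopen(SOLR + "/admin/collections?" + qs) as resp:
            shards = json.load(resp)["cluster"]["collections"][collection]["shards"]
        states = [replica["state"]
                  for shard in shards.values()
                  for replica in shard["replicas"].values()]
        if states and all(s == "active" for s in states):
            return True
        time.sleep(1)
    return False

print(all_replicas_active("collection2"))
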
2> 205098 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection2_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "collection":"collection2", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 205219 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 432] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205219 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 432] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205219 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 432] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205219 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 432] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205219 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 432] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205252 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c: s: r: x:collection2_shard2_replica_n5 t:null-542] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=collection2_shard2_replica_n5&action=CREATE&collection=collection2&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 205323 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 205339 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.s.IndexSchema Schema name=test 2> 205390 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 435] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205390 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 435] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205390 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 435] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 205390 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 435] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 205580 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 205771 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.CoreContainer Creating SolrCore 'collection2_shard2_replica_n5' using configuration from configset conf1, trusted=true 2> 205775 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores/collection2_shard2_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores/collection2_shard2_replica_n5/data/] 2> 205805 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 205828 WARN (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 206051 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 206051 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 206071 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 206072 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 206083 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 
2> 206098 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 206102 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 206106 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 206107 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735534614446080 2> 206147 INFO (searcherExecutor-399-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 206148 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection2/terms/shard2 to Terms{values={core_node2=0, core_node6=0}, version=1} for registerTerm 2> 206150 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection2/leaders/shard2 2> 206174 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.ZkController Core needs to recover:collection2_shard2_replica_n5 2> 206175 INFO (updateExecutor-318-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.DefaultSolrCoreState Running recovery 2> 206178 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 206178 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 206200 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c: s: r: x: t:null-542] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=collection2_shard2_replica_n5&action=CREATE&collection=collection2&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=950 2> 206200 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-543] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=7 2> 206201 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-543] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=8 2> 206203 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection2_shard2_replica_n5] 2> 206206 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 206206 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Publishing state of core [collection2_shard2_replica_n5] as recovering, leader is [http://127.0.0.1:40707/collection2_shard2_replica_n1/] and I am [http://127.0.0.1:32885/collection2_shard2_replica_n5/] 2> 206219 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:40707]; [WaitForState: action=PREPRECOVERY&core=collection2_shard2_replica_n1&nodeName=127.0.0.1:32885_&coreNodeName=core_node6&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 206222 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection2 s: r: x: t:null-541] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:32885_&action=ADDREPLICA&collection=collection2&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1203 2> 206225 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x:collection2_shard2_replica_n1 t:null-544] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node6, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 206231 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-544] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard2, thisCore=collection2_shard2_replica_n1, 
leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection2_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 206232 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-544] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard2, thisCore=collection2_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection2_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 206233 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-544] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard2, thisCore=collection2_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection2_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 206233 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-544] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard2, thisCore=collection2_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection2_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 206277 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000018 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 206291 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:43017_ for creating new replica of shard shard1 for collection collection2 2> 206306 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 
2> 206322 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection2_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "collection":"collection2", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 206446 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 454] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206446 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 454] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206446 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 454] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206446 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 454] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206446 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 454] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206446 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 454] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206454 INFO (watches-290-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard2, thisCore=collection2_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection2_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 206455 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-544] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:32885_&onlyIfLeaderActive=true&core=collection2_shard2_replica_n1&coreNodeName=core_node6&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=231 2> 206483 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x:collection2_shard1_replica_n7 t:null-546] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection2_shard1_replica_n7&action=CREATE&collection=collection2&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 206546 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 206561 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.s.IndexSchema Schema name=test 2> 206624 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 457] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206624 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 457] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206624 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 457] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206624 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 457] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 206624 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 457] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 206761 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 206938 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.CoreContainer Creating SolrCore 'collection2_shard1_replica_n7' using configuration from configset conf1, trusted=true 2> 206942 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores/collection2_shard1_replica_n7], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores/collection2_shard1_replica_n7/data/] 2> 206958 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:40707/collection2_shard2_replica_n1/] - recoveringAfterStartup=[true] 2> 206961 WARN (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 206962 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 206963 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 206963 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:40707/collection2_shard2_replica_n1/]. 
2> 206969 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 206996 WARN (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 207026 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-547] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 22 2> 207058 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-547] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 207059 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-547] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 10 2> 207062 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-547] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 93 2> 207075 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-548] o.a.s.c.S.Request webapp= path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=3 2> 207081 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.h.IndexFetcher Leader's generation: 1 2> 207081 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.h.IndexFetcher Leader's version: 0 2> 207082 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.h.IndexFetcher Follower's generation: 1 2> 207082 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ 
collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.h.IndexFetcher Follower's version: 0 2> 207082 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy No replay needed. 2> 207094 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 207094 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 207099 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 207100 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735535655682048 2> 207110 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=931.0 2> 207110 INFO (recoveryExecutor-320-thread-1-processing-collection2_shard2_replica_n5 null-542 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-542] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=932.0 2> 207213 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 460] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 207213 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 460] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 207214 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 460] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 207213 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 460] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 207213 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 460] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 207273 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 207273 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 207293 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 207293 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 207307 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 208174 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 208177 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 208182 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 208183 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735536791289856 2> 208232 INFO (searcherExecutor-406-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 208234 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection2/terms/shard1 to Terms{values={core_node4=0, core_node8=0}, version=1} for registerTerm 2> 208238 INFO 
(qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection2/leaders/shard1 2> 208263 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.ZkController Core needs to recover:collection2_shard1_replica_n7 2> 208264 INFO (updateExecutor-335-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.DefaultSolrCoreState Running recovery 2> 208266 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 208267 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 208286 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c: s: r: x: t:null-546] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection2_shard1_replica_n7&action=CREATE&collection=collection2&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1805 2> 208290 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-549] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=6 2> 208291 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-549] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=7 2> 208293 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection2_shard1_replica_n7] 2> 208296 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null} 2> 208297 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Publishing state of core [collection2_shard1_replica_n7] as recovering, leader is [http://127.0.0.1:37133/collection2_shard1_replica_n3/] and I am [http://127.0.0.1:43017/collection2_shard1_replica_n7/] 2> 208312 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:37133]; [WaitForState: action=PREPRECOVERY&core=collection2_shard1_replica_n3&nodeName=127.0.0.1:43017_&coreNodeName=core_node8&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 208318 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection2 s: r: x: t:null-545] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:43017_&action=ADDREPLICA&collection=collection2&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=2073 2> 208320 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x:collection2_shard1_replica_n3 t:null-550] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node8, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 208320 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-550] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard1, thisCore=collection2_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection2_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 208321 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-550] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard1, thisCore=collection2_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection2_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 208322 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-550] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard1, thisCore=collection2_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection2_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 208323 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-550] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard1, thisCore=collection2_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection2_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 208350 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection collection3 2> 208352 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000020 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 208498 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 479] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 208498 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 479] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 208498 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 479] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 208499 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 479] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 208498 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 479] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 208498 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 479] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 208507 INFO (watches-301-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection2, shard=shard1, thisCore=collection2_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection2_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 208508 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-550] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:43017_&onlyIfLeaderActive=true&core=collection2_shard1_replica_n3&coreNodeName=core_node8&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=189 2> 208589 WARN (OverseerThreadFactory-250-thread-4) [n: c:collection3 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection3) without cores. 2> 208600 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s: r: x: t:null-551] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 208604 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s: r: x: t:null-551] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection3&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&wt=javabin&version=2} status=0 QTime=280 2> 208645 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000022 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 208664 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:40707_ for creating new replica of shard shard2 for collection collection3 2> 208681 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 208697 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection3_shard2_replica_n1", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"collection3", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 208817 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 493] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 208832 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c: s: r: x:collection3_shard2_replica_n1 t:null-553] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&name=collection3_shard2_replica_n1&action=CREATE&collection=collection3&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 208885 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 208896 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.s.IndexSchema Schema name=test 2> 209010 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:37133/collection2_shard1_replica_n3/] - recoveringAfterStartup=[true] 2> 209014 WARN (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 209014 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 209015 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 209015 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:37133/collection2_shard1_replica_n3/]. 
2> 209074 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-554] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 209077 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-554] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37133/collection2_shard1_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 22 2> 209116 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-554] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 209117 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-554] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37133/collection2_shard1_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 9 2> 209120 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-554] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 100 2> 209128 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 209132 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-555] o.a.s.c.S.Request webapp= path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 209135 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.h.IndexFetcher Leader's generation: 1 2> 209135 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.h.IndexFetcher Leader's version: 0 2> 209135 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.h.IndexFetcher Follower's generation: 1 2> 209136 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.h.IndexFetcher Follower's version: 0 2> 209136 INFO 
(recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy No replay needed. 2> 209145 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 209145 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 209151 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 209152 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735537807360000 2> 209162 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=894.0 2> 209162 INFO (recoveryExecutor-337-thread-1-processing-collection2_shard1_replica_n7 null-546 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-546] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=895.0 2> 209267 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 499] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 209267 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 499] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 209267 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 499] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 209267 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 499] for collection [collection2] has occurred - updating... 
(live nodes size: [5]) 2> 209267 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 499] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 209267 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 499] for collection [collection2] has occurred - updating... (live nodes size: [5]) 2> 209338 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.CoreContainer Creating SolrCore 'collection3_shard2_replica_n1' using configuration from configset conf1, trusted=true 2> 209342 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/collection3_shard2_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/collection3_shard2_replica_n1/data/] 2> 209379 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 209402 WARN (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 209616 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 209617 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 209635 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 209635 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 209646 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, 
maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 209666 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 209669 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 209674 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 209675 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735538355765248 2> 209711 INFO (searcherExecutor-413-thread-1-processing-collection3_shard2_replica_n1 null-553 core_node2 127.0.0.1:40707_ collection3 shard2) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 209720 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection3/terms/shard2 to Terms{values={core_node2=0}, version=0} for registerTerm 2> 209723 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection3/leaders/shard2 2> 209763 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 209763 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 209763 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40707/collection3_shard2_replica_n1/ 2> 209766 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 209769 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.SyncStrategy http://127.0.0.1:40707/collection3_shard2_replica_n1/ has no replicas 2> 209770 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection3/leaders/shard2/leader after winning as /collections/collection3/leader_elect/shard2/election/72077803529764874-core_node2-n_0000000000 2> 209793 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40707/collection3_shard2_replica_n1/ shard2 2> 209906 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 514] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 209915 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-553] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 209943 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c: s: r: x: t:null-553] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&name=collection3_shard2_replica_n1&action=CREATE&collection=collection3&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1114 2> 209963 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection3 s: r: x: t:null-552] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:40707_&action=ADDREPLICA&collection=collection3&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1347 2> 210010 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000024 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 210029 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:37133_ for creating new replica of shard shard1 for collection collection3 2> 210036 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 524] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 210036 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 524] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 210049 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 210067 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection3_shard1_replica_n3", 2> "node_name":"127.0.0.1:37133_", 2> "base_url":"http://127.0.0.1:37133", 2> "collection":"collection3", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 210188 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 529] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 210188 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 529] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 210188 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 529] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 210203 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c: s: r: x:collection3_shard1_replica_n3 t:null-557] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection3_shard1_replica_n3&action=CREATE&collection=collection3&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 210261 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 210277 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.s.IndexSchema Schema name=test 2> 210340 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 532] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 210340 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 532] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 210481 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 210672 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.CoreContainer Creating SolrCore 'collection3_shard1_replica_n3' using configuration from configset conf1, trusted=true 2> 210676 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores/collection3_shard1_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-2-001/cores/collection3_shard1_replica_n3/data/] 2> 210705 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 210761 WARN (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 211065 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 211065 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 211088 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 211088 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 211103 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 211137 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 211140 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 
s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 211147 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 211148 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735539900317696 2> 211187 INFO (searcherExecutor-419-thread-1-processing-collection3_shard1_replica_n3 null-557 core_node4 127.0.0.1:37133_ collection3 shard1) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 211191 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection3/terms/shard1 to Terms{values={core_node4=0}, version=0} for registerTerm 2> 211192 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection3/leaders/shard1 2> 211227 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 211227 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 211227 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:37133/collection3_shard1_replica_n3/ 2> 211231 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 211234 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.SyncStrategy http://127.0.0.1:37133/collection3_shard1_replica_n3/ has no replicas 2> 211234 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection3/leaders/shard1/leader after winning as /collections/collection3/leader_elect/shard1/election/72077803529764876-core_node4-n_0000000000 2> 211262 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:37133/collection3_shard1_replica_n3/ shard1 2> 211376 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 547] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 211376 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 547] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211376 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 547] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211385 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-557] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 211425 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c: s: r: x: t:null-557] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection3_shard1_replica_n3&action=CREATE&collection=collection3&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1225 2> 211443 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection3 s: r: x: t:null-556] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:37133_&action=ADDREPLICA&collection=collection3&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1465 2> 211492 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000026 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 211509 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:32885_ for creating new replica of shard shard2 for collection collection3 2> 211518 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 557] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211518 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 557] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211518 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 557] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211518 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 557] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211530 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 
2> 211548 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection3_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "collection":"collection3", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 211669 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 562] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211669 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 562] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211669 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 562] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211669 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 562] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211669 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 562] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211684 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c: s: r: x:collection3_shard2_replica_n5 t:null-559] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=collection3_shard2_replica_n5&action=CREATE&collection=collection3&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 211739 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 211753 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.s.IndexSchema Schema name=test 2> 211818 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 565] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211818 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 565] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211819 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 565] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 211818 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 565] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 211959 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 212128 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.CoreContainer Creating SolrCore 'collection3_shard2_replica_n5' using configuration from configset conf1, trusted=true 2> 212133 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores/collection3_shard2_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-3-001/cores/collection3_shard2_replica_n5/data/] 2> 212169 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 212188 WARN (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 212420 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 212420 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 212437 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 212438 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 212449 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 
2> 212465 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 212469 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 212473 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 212477 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735541293875200 2> 212512 INFO (searcherExecutor-425-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 212517 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection3/terms/shard2 to Terms{values={core_node2=0, core_node6=0}, version=1} for registerTerm 2> 212518 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection3/leaders/shard2 2> 212541 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.ZkController Core needs to recover:collection3_shard2_replica_n5 2> 212542 INFO (updateExecutor-318-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.DefaultSolrCoreState Running recovery 2> 212544 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 212544 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 212563 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-560] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=6 2> 212563 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-560] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=6 2> 212564 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c: s: r: x: t:null-559] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=collection3_shard2_replica_n5&action=CREATE&collection=collection3&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=882 2> 212565 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection3_shard2_replica_n5] 2> 212569 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 212569 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Publishing state of core [collection3_shard2_replica_n5] as recovering, leader is [http://127.0.0.1:40707/collection3_shard2_replica_n1/] and I am [http://127.0.0.1:32885/collection3_shard2_replica_n5/] 2> 212580 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:40707]; [WaitForState: action=PREPRECOVERY&core=collection3_shard2_replica_n1&nodeName=127.0.0.1:32885_&coreNodeName=core_node6&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 212584 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection3 s: r: x: t:null-558] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:32885_&action=ADDREPLICA&collection=collection3&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1126 2> 212585 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x:collection3_shard2_replica_n1 t:null-561] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node6, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 212589 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x: t:null-561] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard2, thisCore=collection3_shard2_replica_n1, 
leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection3_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 212590 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x: t:null-561] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard2, thisCore=collection3_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection3_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 212590 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x: t:null-561] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard2, thisCore=collection3_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection3_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 212591 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x: t:null-561] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard2, thisCore=collection3_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection3_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 212628 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000028 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 212643 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:43017_ for creating new replica of shard shard1 for collection collection3 2> 212655 INFO (OverseerThreadFactory-250-thread-4) [n: c:collection3 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 
2> 212670 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection3_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "collection":"collection3", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 212792 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 584] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212792 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 584] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212792 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 584] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212792 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 584] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212792 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 584] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212792 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 584] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212798 INFO (watches-290-thread-1) [n:127.0.0.1:40707_ c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard2, thisCore=collection3_shard2_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:32885_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"collection3_shard2_replica_n5", 2> "node_name":"127.0.0.1:32885_", 2> "base_url":"http://127.0.0.1:32885", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 212799 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x: t:null-561] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:32885_&onlyIfLeaderActive=true&core=collection3_shard2_replica_n1&coreNodeName=core_node6&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=215 2> 212804 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c: s: r: x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection3_shard1_replica_n7&action=CREATE&collection=collection3&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 212855 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 212867 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.s.IndexSchema Schema name=test 2> 212940 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 587] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212940 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 587] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212940 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 587] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212940 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 587] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 212940 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 587] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 213058 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 213259 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.CoreContainer Creating SolrCore 'collection3_shard1_replica_n7' using configuration from configset conf1, trusted=true 2> 213262 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores/collection3_shard1_replica_n7], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores/collection3_shard1_replica_n7/data/] 2> 213292 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 213302 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:40707/collection3_shard2_replica_n1/] - recoveringAfterStartup=[true] 2> 213305 WARN (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 213306 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 213306 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 213306 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:40707/collection3_shard2_replica_n1/]. 
2> 213314 WARN (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 213351 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-564] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection3_shard2_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 22 2> 213393 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-564] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 213394 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-564] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection3_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 10 2> 213397 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-564] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 86 2> 213412 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-565] o.a.s.c.S.Request webapp= path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 213415 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.h.IndexFetcher Leader's generation: 1 2> 213415 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.h.IndexFetcher Leader's version: 0 2> 213415 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.h.IndexFetcher Follower's generation: 1 2> 213416 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.h.IndexFetcher Follower's version: 0 2> 213416 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy No replay needed. 
2> 213425 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 213425 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 213430 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 213431 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735542294216704 2> 213443 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=899.0 2> 213444 INFO (recoveryExecutor-320-thread-1-processing-collection3_shard2_replica_n5 null-559 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-559] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=900.0 2> 213548 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 590] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213548 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 590] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213548 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 590] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213548 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 590] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213548 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 590] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 213627 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 213628 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 213646 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 213647 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 213658 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 213675 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 213678 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 213685 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 213687 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735542562652160 2> 213737 INFO (searcherExecutor-432-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 213740 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection3/terms/shard1 to Terms{values={core_node4=0, core_node8=0}, version=1} for registerTerm 2> 213745 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection3/leaders/shard1 2> 213767 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.ZkController Core needs to recover:collection3_shard1_replica_n7 2> 213768 INFO (updateExecutor-335-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ 
c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.DefaultSolrCoreState Running recovery 2> 213769 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 213769 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 213788 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-566] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=7 2> 213789 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-566] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=8 2> 213791 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection3_shard1_replica_n7] 2> 213791 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c: s: r: x: t:null-563] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection3_shard1_replica_n7&action=CREATE&collection=collection3&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=990 2> 213795 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null} 2> 213795 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Publishing state of core [collection3_shard1_replica_n7] as recovering, leader is [http://127.0.0.1:37133/collection3_shard1_replica_n3/] and I am [http://127.0.0.1:43017/collection3_shard1_replica_n7/] 2> 213806 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:37133]; [WaitForState: action=PREPRECOVERY&core=collection3_shard1_replica_n3&nodeName=127.0.0.1:43017_&coreNodeName=core_node8&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 213811 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection3 s: r: x: t:null-562] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:43017_&action=ADDREPLICA&collection=collection3&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1211 2> 213811 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x:collection3_shard1_replica_n3 t:null-567] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node8, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 213812 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-567] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard1, thisCore=collection3_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection3_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 213813 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-567] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard1, thisCore=collection3_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection3_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 213814 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-567] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard1, thisCore=collection3_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection3_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 213815 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-567] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard1, thisCore=collection3_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection3_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 213862 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-568] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection2/terms/shard2 to Terms{values={core_node2=1, core_node6=1}, version=2} for ensureHighestTermsAreNotZero 2> 213880 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-568] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&wt=javabin&version=2}{add=[10000000 (1788735542720987136)]} 0 18 2> 213883 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-568] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:37133/collection2_shard1_replica_n3/&wt=javabin&version=2}{add=[10000000 (1788735542720987136)]} 0 48 2> 213885 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-568] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[10000000]} 0 60 2> 213917 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 603] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213917 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 603] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213917 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 603] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213917 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 603] for collection [collection3] has occurred - updating... 
(live nodes size: [5]) 2> 213918 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 603] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213918 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 603] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 213934 INFO (watches-301-thread-3) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard1, thisCore=collection3_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection3_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 213934 INFO (watches-301-thread-2) [n:127.0.0.1:37133_ c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection3, shard=shard1, thisCore=collection3_shard1_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:43017_, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection3_shard1_replica_n7", 2> "node_name":"127.0.0.1:43017_", 2> "base_url":"http://127.0.0.1:43017", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 213935 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c: s: r: x: t:null-567] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:43017_&onlyIfLeaderActive=true&core=collection3_shard1_replica_n3&coreNodeName=core_node8&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=125 2> 213981 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-569] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&wt=javabin&version=2}{add=[10000001 (1788735542812213248)]} 0 43 2> 213990 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-569] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:32885/collection2_shard2_replica_n5/&wt=javabin&version=2}{add=[10000001 (1788735542812213248)]} 0 67 2> 213993 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-569] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[10000001]} 0 98 2> 214051 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-570] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection2/terms/shard1 to Terms{values={core_node4=1, core_node8=1}, version=2} for ensureHighestTermsAreNotZero 2> 214062 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ 
c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-570] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:37133/collection2_shard1_replica_n3/&wt=javabin&version=2}{add=[10000003 (1788735542911827968)]} 0 14 2> 214065 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-570] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&wt=javabin&version=2}{add=[10000003 (1788735542911827968)]} 0 47 2> 214068 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-570] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[10000003]} 0 58 2> 214087 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 214108 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 214131 INFO (zkConnectionManagerCallback-442-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 214132 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 214132 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 214146 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(0) -> (5) 2> 214158 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42019/solr ready 2> 214158 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Created solrClient for collection collection2 with updatesToLeaders=true and parallelUpdates=false 2> 214203 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-571] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:37133/collection2_shard1_replica_n3/&wt=javabin&version=2}{add=[10000004 (1788735543073308672)]} 0 18 2> 214214 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-571] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[10000004 (1788735543073308672)]} 0 41 2> 214249 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-572] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection3/terms/shard2 to Terms{values={core_node2=1, core_node6=1}, version=2} for ensureHighestTermsAreNotZero 2> 214259 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-572] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection3_shard2_replica_n1/&wt=javabin&version=2}{add=[20000000 (1788735543132028928)]} 0 10 2> 214262 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-572] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[20000000 (1788735543132028928)]} 0 38 2> 214290 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-573] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection3/terms/shard1 to Terms{values={core_node4=1, core_node8=1}, version=2} for ensureHighestTermsAreNotZero 2> 214304 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-573] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:37133/collection3_shard1_replica_n3/&wt=javabin&version=2}{add=[20000001 (1788735543177117696)]} 0 14 2> 214325 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-573] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[20000001 (1788735543177117696)]} 0 54 2> 214338 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 214360 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 214374 INFO (zkConnectionManagerCallback-447-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 214374 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 214375 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 214387 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 214396 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42019/solr ready 2> 214397 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Created solrClient for collection collection3 with updatesToLeaders=true and parallelUpdates=false 2> 214438 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-574] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/collection3_shard2_replica_n1/&wt=javabin&version=2}{add=[10000005 (1788735543320772608)]} 0 15 2> 214438 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:37133/collection3_shard1_replica_n3/] - recoveringAfterStartup=[true] 2> 214446 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-574] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[10000005 (1788735543320772608)]} 0 38 2> 214447 WARN (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 214448 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 214448 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 
2> 214448 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:37133/collection3_shard1_replica_n3/]. 2> 214581 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-575] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection3_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 88 2> 214612 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-575] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37133/collection3_shard1_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 147 2> 214641 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-575] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 214642 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-575] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37133/collection3_shard1_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 4 2> 214645 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000030 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 214737 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-575] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 284 2> 214750 INFO (searcherExecutor-393-thread-1-processing-collection2_shard1_replica_n3 null-576 core_node4 127.0.0.1:37133_ collection2 shard1) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-576] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 214751 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-577] o.a.s.c.S.Request webapp= path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 214754 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Leader's generation: 2 2> 214754 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Leader's version: 1705871146717 2> 214754 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Follower's generation: 1 2> 214757 INFO (searcherExecutor-406-thread-1-processing-collection2_shard1_replica_n7 null-576 core_node8 127.0.0.1:43017_ collection2 shard1) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-576] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 214757 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Follower's version: 0 2> 214758 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Starting replication process 2> 214760 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-576] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37133/collection2_shard1_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 257 2> 214763 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-576] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 
298 2> 214794 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-578] o.a.s.c.S.Request webapp= path=/replication params={generation=2&qt=/replication&wt=javabin&version=2&command=filelist} status=0 QTime=32 2> 214835 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Number of files in latest index in leader: 4 2> 214937 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=14, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=1.0183916091918945, noCFSRatio=1.0] 2> 214948 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.DefaultSolrCoreState New IndexWriter is ready to be used. 2> 214956 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Starting download (fullCopy=false) to MockDirectoryWrapper(ByteBuffersDirectory@ad96318 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@518ce8ef) 2> 214957 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher tmpIndexDir_type : class org.apache.lucene.tests.store.MockDirectoryWrapper , ByteBuffersDirectory@ad96318 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@518ce8ef 2> 214966 INFO (searcherExecutor-387-thread-1-processing-collection2_shard2_replica_n1 null-576 core_node2 127.0.0.1:40707_ collection2 shard2) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-576] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 214969 INFO (searcherExecutor-399-thread-1-processing-collection2_shard2_replica_n5 null-576 core_node6 127.0.0.1:32885_ collection2 shard2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-576] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 214979 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-576] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection2_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 209 2> 214982 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-576] 
o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 529 2> 215016 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-579] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 215016 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-579] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37133/collection3_shard1_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 4 2> 215032 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-580] o.a.s.c.S.Request webapp= path=/replication params={generation=2&qt=/replication&file=_0.cfe&checksum=true&wt=filestream&command=filecontent} status=0 QTime=23 2> 215083 INFO (searcherExecutor-419-thread-1-processing-collection3_shard1_replica_n3 null-579 core_node4 127.0.0.1:37133_ collection3 shard1) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-579] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 215087 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-579] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection3_shard2_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 86 2> 215126 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-581] o.a.s.c.S.Request webapp= path=/replication params={generation=2&qt=/replication&file=_0.si&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 215135 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-582] o.a.s.c.S.Request webapp= path=/replication params={generation=2&qt=/replication&file=_0.cfs&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 215146 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-583] o.a.s.c.S.Request webapp= path=/replication params={generation=2&qt=/replication&file=segments_2&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 215150 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Bytes downloaded: 5039, Bytes skipped downloading: 0 2> 215151 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.h.IndexFetcher Total time taken for download (fullCopy=false,bytesDownloaded=5039) : 0 secs (null bytes/sec) to MockDirectoryWrapper(ByteBuffersDirectory@ad96318 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@518ce8ef) 2> 
215172 INFO (searcherExecutor-413-thread-1-processing-collection3_shard2_replica_n1 null-579 core_node2 127.0.0.1:40707_ collection3 shard2) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-579] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 215184 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=45, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0] 2> 215199 INFO (searcherExecutor-425-thread-1-processing-collection3_shard2_replica_n5 null-579 core_node6 127.0.0.1:32885_ collection3 shard2) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-579] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 215200 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.DefaultSolrCoreState New IndexWriter is ready to be used. 2> 215202 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-579] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/collection3_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 105 2> 215204 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-579] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 216 2> 215216 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 215242 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 215246 INFO (searcherExecutor-432-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 215256 INFO (zkConnectionManagerCallback-453-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 215257 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 215258 WARN (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 215267 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Replaying buffered documents. 2> 215270 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 215269 WARN (recoveryExecutor-434-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.UpdateLog Starting log replay tlog{file=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-4-001/cores/collection3_shard1_replica_n7/data/tlog/buffer.tlog.0000308442129032203 refcount=2} active=true starting pos=0 inSortedOrder=false 2> 215288 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42019/solr ready 2> 215289 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Created solrClient for collection collection1 with updatesToLeaders=false and parallelUpdates=true 2> 215314 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-584] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-584&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147400&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 215331 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-584] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-584&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147400&isShard=true&wt=javabin} hits=32 status=0 QTime=17 2> 215355 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-584] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871147400&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-584&wt=javabin&version=2} status=0 QTime=4 2> 215363 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-584] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&wt=javabin&version=2} rid=null-584 hits=67 status=0 QTime=60 2> 215380 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-585] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-585&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147468&isShard=true&wt=javabin} hits=2 status=0 QTime=3 2> 215384 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-585] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-585&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147468&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 215397 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-585] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147468&ids=10000000,10000001&isShard=true&rid=null-585&wt=javabin&version=2} status=0 QTime=2 2> 215400 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-585] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147468&ids=10000003,10000004&isShard=true&rid=null-585&wt=javabin&version=2} status=0 QTime=2 2> 215403 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-585] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-585 hits=4 status=0 QTime=34 2> 215418 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-586] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-586&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147507&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 215422 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-586] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-586&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147507&isShard=true&wt=javabin} hits=1 status=0 QTime=1 2> 215431 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-586] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147507&ids=20000001&isShard=true&rid=null-586&wt=javabin&version=2} status=0 QTime=2 2> 215433 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-586] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147507&ids=20000000,10000005&isShard=true&rid=null-586&wt=javabin&version=2} status=0 QTime=1 2> 215436 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-586] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-586 hits=3 status=0 QTime=27 2> 215456 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-587] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-587&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147544&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 215460 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-587] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-587&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147544&isShard=true&wt=javabin} hits=2 status=0 QTime=3 2> 215463 INFO (qtp558875332-573) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-587] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-587&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147544&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 215466 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-587] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-587&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147544&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 215474 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-587] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147544&ids=20000001&isShard=true&collection=collection2,collection3&rid=null-587&wt=javabin&version=2} status=0 QTime=2 2> 215476 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-587] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147544&ids=10000003,10000004&isShard=true&collection=collection2,collection3&rid=null-587&wt=javabin&version=2} status=0 QTime=1 2> 215478 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-587] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147544&ids=20000000,10000005&isShard=true&collection=collection2,collection3&rid=null-587&wt=javabin&version=2} status=0 QTime=1 2> 215480 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:null-587] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147544&ids=10000000,10000001&isShard=true&collection=collection2,collection3&rid=null-587&wt=javabin&version=2} status=0 QTime=2 2> 215483 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-587] o.a.s.c.S.Request webapp= path=/select params={q=*:*&collection=collection2,collection3&wt=javabin&version=2} rid=null-587 hits=7 status=0 QTime=39 2> 215498 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-588] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-588&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147588&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 215500 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-588] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-588&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147588&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 215503 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-588] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-588&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147588&isShard=true&wt=javabin} hits=2 status=0 QTime=1 2> 215505 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-588] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-588&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147588&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 215507 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-588] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-588&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147588&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 215510 INFO (qtp1128963290-591) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-588] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-588&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147588&isShard=true&wt=javabin} hits=2 status=0 QTime=1 2> 215535 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-588] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871147588&ids=11,12,1,13,14,15,4,16,8,10&isShard=true&collection=collection1,collection2,collection3&rid=null-588&wt=javabin&version=2} status=0 QTime=3 2> 215541 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-588] o.a.s.c.S.Request webapp= path=/select params={q=*:*&collection=collection1,collection2,collection3&wt=javabin&version=2} rid=null-588 hits=74 status=0 QTime=52 2> 215616 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-589] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11|collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-589&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147686&isShard=true&wt=javabin} hits=35 status=0 QTime=2 2> 215627 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-589] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection1:11|collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-589&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147686&isShard=true&wt=javabin} hits=1 status=0 QTime=1 2> 215640 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-589] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11|collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-589&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147686&isShard=true&wt=javabin} hits=2 status=0 QTime=9 2> 215645 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-589] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11|collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-589&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147686&isShard=true&wt=javabin} hits=2 status=0 QTime=1 2> 215649 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-589] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11|collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-589&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147686&isShard=true&wt=javabin} hits=32 status=0 QTime=2 2> 215655 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-589] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11|collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection1,collection2,collection3&rid=null-589&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147686&isShard=true&wt=javabin} hits=2 status=0 QTime=1 2> 215668 INFO (qtp558875332-572) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-589] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11|collection2:15|collection3:14&shards.purpose=64&collection=collection1,collection2,collection3&rid=null-589&version=2&q=*:*&omitHeader=false&NOW=1705871147686&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&wt=javabin} status=0 QTime=3 2> 215675 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-589] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11|collection2:15|collection3:14&collection=collection1,collection2,collection3&wt=javabin&version=2} rid=null-589 hits=74 status=0 QTime=90 2> 215700 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-590&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147785&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 215708 INFO (qtp558875332-573) [n:127.0.0.1:32885_ 
c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-590&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147785&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 215713 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-590&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147785&isShard=true&wt=javabin} hits=2 status=0 QTime=1 2> 215713 INFO (searcherExecutor-432-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.SolrCore Registered new searcher autowarm time: 7 ms 2> 215716 INFO (qtp657049334-529) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection2,collection3&rid=null-590&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147785&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 215718 INFO (recoveryExecutor-434-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.p.LogUpdateProcessorFactory {add=[20000001 (1788735543177117696)]} 0 449 2> 215718 INFO (recoveryExecutor-434-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.UpdateLog Re-computing max version from index after log re-play. 
2> 215750 INFO (qtp657049334-530) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&shards.purpose=64&collection=collection2,collection3&rid=null-590&version=2&q=*:*&omitHeader=false&NOW=1705871147785&ids=20000001&isShard=true&wt=javabin} status=0 QTime=28 2> 215752 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&shards.purpose=64&collection=collection2,collection3&rid=null-590&version=2&q=*:*&omitHeader=false&NOW=1705871147785&ids=10000000,10000001&isShard=true&wt=javabin} status=0 QTime=2 2> 215754 INFO (qtp657049334-528) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&shards.purpose=64&collection=collection2,collection3&rid=null-590&version=2&q=*:*&omitHeader=false&NOW=1705871147785&ids=10000003,10000004&isShard=true&wt=javabin} status=0 QTime=1 2> 215757 INFO (qtp558875332-570) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-590] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection2:15|collection3:14&shards.purpose=64&collection=collection2,collection3&rid=null-590&version=2&q=*:*&omitHeader=false&NOW=1705871147785&ids=20000000,10000005&isShard=true&wt=javabin} status=0 QTime=2 2> 215761 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:null-590] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection2:15|collection3:14&collection=collection2,collection3&wt=javabin&version=2} rid=null-590 hits=7 status=0 QTime=74 2> 215782 INFO (qtp657049334-533) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-591] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection3&rid=null-591&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147867&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 215785 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-591] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection3:14&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=collection3&rid=null-591&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147867&isShard=true&wt=javabin} hits=2 status=0 QTime=1 2> 215794 INFO (qtp657049334-532) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:null-591] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection3:14&shards.purpose=64&collection=collection3&rid=null-591&version=2&q=*:*&omitHeader=false&NOW=1705871147867&ids=20000001&isShard=true&wt=javabin} status=0 QTime=2 2> 215795 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:null-591] o.a.s.c.S.Request webapp= path=/select 
params={df=text&distrib=false&_stateVer_=collection3:14&shards.purpose=64&collection=collection3&rid=null-591&version=2&q=*:*&omitHeader=false&NOW=1705871147867&ids=20000000,10000005&isShard=true&wt=javabin} status=0 QTime=1 2> 215799 INFO (qtp558875332-571) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:null-591] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection3:14&collection=collection3&wt=javabin&version=2} rid=null-591 hits=3 status=0 QTime=30 2> 215812 INFO (qtp558875332-569) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-592] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-592&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147903&isShard=true&wt=javabin} hits=35 status=0 QTime=1 2> 215814 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-592] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&_stateVer_=collection1:11&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-592&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871147903&isShard=true&wt=javabin} hits=32 status=0 QTime=1 2> 215828 INFO (qtp558875332-568) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:null-592] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&_stateVer_=collection1:11&omitHeader=false&shards.purpose=64&NOW=1705871147903&ids=2,3,103,104,5,105,17,6,7,9&isShard=true&rid=null-592&wt=javabin&version=2} status=0 QTime=3 2> 215837 INFO (qtp657049334-531) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:null-592] o.a.s.c.S.Request webapp= path=/select params={q=*:*&_stateVer_=collection1:11&wt=javabin&version=2} rid=null-592 hits=67 status=0 QTime=31 2> 215840 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractBasicDistributedZkTestBase ### STARTING testANewCollectionInOneInstance 2> 215763 WARN (recoveryExecutor-434-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.u.UpdateLog Log replay finished. recoveryInfo=RecoveryInfo{adds=1 deletes=0 deleteByQuery=0 errors=0 positionOfStart=0} 2> 215873 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 215873 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 
2> 215877 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=2108.0 2> 215878 INFO (recoveryExecutor-337-thread-1-processing-collection3_shard1_replica_n7 null-563 core_node8 127.0.0.1:43017_ collection3 shard1) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:null-563] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=2108.0 2> 215879 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection oneInstanceCollection 2> 215932 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 620] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 215932 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 620] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 215932 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 620] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 215932 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 620] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 215932 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 620] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 215932 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 620] for collection [collection3] has occurred - updating... (live nodes size: [5]) 2> 216022 WARN (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Specified number of replicas of 2 on collection oneInstanceCollection is higher than the number of Solr instances currently live or live and part of your createNodeSet(1). It's unusual to run two replica of the same slice on the same Solr-instance. 
2> 216076 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"oneInstanceCollection_shard2_replica_n1", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"oneInstanceCollection", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 216102 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"oneInstanceCollection_shard2_replica_n2", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"oneInstanceCollection", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 216128 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"oneInstanceCollection_shard1_replica_n4", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"oneInstanceCollection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 216150 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"oneInstanceCollection_shard1_replica_n6", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"oneInstanceCollection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 216270 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 635] for collection [oneInstanceCollection] has occurred - updating... 
(live nodes size: [5]) 2> 216284 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=oneInstanceCollection&version=2&replicaType=NRT&coreNodeName=core_node3&name=oneInstanceCollection_shard2_replica_n1&action=CREATE&numShards=2&shard=shard2&wt=javabin 2> 216285 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=oneInstanceCollection&version=2&replicaType=NRT&coreNodeName=core_node5&name=oneInstanceCollection_shard2_replica_n2&action=CREATE&numShards=2&shard=shard2&wt=javabin 2> 216288 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=oneInstanceCollection&version=2&replicaType=NRT&coreNodeName=core_node7&name=oneInstanceCollection_shard1_replica_n4&action=CREATE&numShards=2&shard=shard1&wt=javabin 2> 216293 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=oneInstanceCollection&version=2&replicaType=NRT&coreNodeName=core_node8&name=oneInstanceCollection_shard1_replica_n6&action=CREATE&numShards=2&shard=shard1&wt=javabin 2> 216362 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 216373 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 216374 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.s.IndexSchema Schema name=test 2> 216378 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 216379 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 216386 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.s.IndexSchema Schema name=test 2> 216390 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.s.IndexSchema Schema name=test 2> 216391 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.s.IndexSchema Schema name=test 2> 217391 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 217404 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 
r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 217421 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 217427 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 217632 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.c.CoreContainer Creating SolrCore 'oneInstanceCollection_shard1_replica_n4' using configuration from configset conf1, trusted=true 2> 217632 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.CoreContainer Creating SolrCore 'oneInstanceCollection_shard2_replica_n1' using configuration from configset conf1, trusted=true 2> 217650 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.c.CoreContainer Creating SolrCore 'oneInstanceCollection_shard2_replica_n2' using configuration from configset conf1, trusted=true 2> 217663 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard2_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard2_replica_n1/data/] 2> 217663 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard2_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard2_replica_n2/data/] 2> 217663 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard1_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard1_replica_n4/data/] 2> 217679 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.CoreContainer Creating SolrCore 'oneInstanceCollection_shard1_replica_n6' using configuration from configset conf1, trusted=true 2> 217683 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] 
o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard1_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection_shard1_replica_n6/data/] 2> 217704 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 217704 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 217705 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=19, maxMergedSegmentMB=10.273061752319336, floorSegmentMB=0.749781608581543, forceMergeDeletesPctAllowed=12.565230667147118, segmentsPerTier=12.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.13744209775263264, deletesPctAllowed=45.32596601683406 2> 217711 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=34, maxMergedSegmentMB=19.161690711975098, floorSegmentMB=0.6958456039428711, forceMergeDeletesPctAllowed=23.348007169778754, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=30.20373023338287 2> 217737 WARN (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 217750 WARN (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 217757 WARN (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = 
requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 217758 WARN (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 218092 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 218093 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 218096 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 218097 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 218101 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 218101 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 218104 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 218105 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 218117 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 218117 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 218133 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 218133 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 218136 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection 
s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 218136 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 218137 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 218137 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 218152 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=35, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0] 2> 218152 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=34, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.2861650666961587] 2> 218152 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 218152 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 218181 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 218184 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 218184 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.r.ManagedResourceStorage Loaded null at path 
_rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 218187 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 218188 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 218190 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 218190 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 218192 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735547286487040 2> 218193 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 218193 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 218193 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 218195 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735547289632768 2> 218198 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 218199 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 218199 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735547293827072 2> 218200 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735547294875648 2> 218245 INFO (searcherExecutor-459-thread-1-processing-oneInstanceCollection_shard2_replica_n2 null-595 core_node5 127.0.0.1:40707_ oneInstanceCollection shard2) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 
x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 218252 INFO (searcherExecutor-456-thread-1-processing-oneInstanceCollection_shard2_replica_n1 null-594 core_node3 127.0.0.1:40707_ oneInstanceCollection shard2) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 218259 INFO (searcherExecutor-457-thread-1-processing-oneInstanceCollection_shard1_replica_n4 null-596 core_node7 127.0.0.1:40707_ oneInstanceCollection shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 218262 INFO (searcherExecutor-461-thread-1-processing-oneInstanceCollection_shard1_replica_n6 null-597 core_node8 127.0.0.1:40707_ oneInstanceCollection shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 218279 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection/terms/shard1 to Terms{values={core_node8=0}, version=0} for registerTerm 2> 218280 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/oneInstanceCollection/leaders/shard1 2> 218300 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection/terms/shard2 to Terms{values={core_node3=0}, version=0} for registerTerm 2> 218301 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/oneInstanceCollection/leaders/shard2 2> 218304 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection/terms/shard2 to Terms{values={core_node3=0, core_node5=0}, version=1} for registerTerm 2> 218305 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-595] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/oneInstanceCollection/leaders/shard2 2> 218309 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection/terms/shard1 to Terms{values={core_node7=0, core_node8=0}, version=1} for registerTerm 2> 218310 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-596] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/oneInstanceCollection/leaders/shard1 2> 218345 INFO (qtp600427849-517) 
[n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 218345 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 218346 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n6/ 2> 218355 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 218355 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 218355 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n1/ 2> 218365 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.PeerSync PeerSync: core=oneInstanceCollection_shard2_replica_n1 url=http://127.0.0.1:40707 START replicas=[http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n2/] nUpdates=100 2> 218365 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.PeerSync PeerSync: core=oneInstanceCollection_shard1_replica_n6 url=http://127.0.0.1:40707 START replicas=[http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n4/] nUpdates=100 2> 218370 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.u.PeerSync PeerSync: core=oneInstanceCollection_shard2_replica_n1 url=http://127.0.0.1:40707 DONE. We have no versions. sync failed. 2> 218371 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.u.PeerSync PeerSync: core=oneInstanceCollection_shard1_replica_n6 url=http://127.0.0.1:40707 DONE. We have no versions. sync failed. 
2> 218378 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-597] o.a.s.c.S.Request webapp= path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 218378 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-594] o.a.s.c.S.Request webapp= path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 218382 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 218382 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 218383 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 218383 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 218383 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/oneInstanceCollection/leaders/shard1/leader after winning as /collections/oneInstanceCollection/leader_elect/shard1/election/72077803529764874-core_node8-n_0000000000 2> 218383 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/oneInstanceCollection/leaders/shard2/leader after winning as /collections/oneInstanceCollection/leader_elect/shard2/election/72077803529764874-core_node3-n_0000000000 2> 218408 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n6/ shard1 2> 218410 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n1/ shard2 2> 218526 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 680] for collection [oneInstanceCollection] has occurred - updating... 
(live nodes size: [5]) 2> 218542 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-597] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 218545 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-594] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 218575 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c: s: r: x: t:null-594] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&newCollection=true&name=oneInstanceCollection_shard2_replica_n1&action=CREATE&numShards=2&collection=oneInstanceCollection&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2293 2> 218577 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c: s: r: x: t:null-597] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&newCollection=true&name=oneInstanceCollection_shard1_replica_n6&action=CREATE&numShards=2&collection=oneInstanceCollection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2289 2> 218668 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 684] for collection [oneInstanceCollection] has occurred - updating... (live nodes size: [5]) 2> 218668 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 684] for collection [oneInstanceCollection] has occurred - updating... (live nodes size: [5]) 2> 219371 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c: s: r: x: t:null-595] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node5&collection.configName=conf1&newCollection=true&name=oneInstanceCollection_shard2_replica_n2&action=CREATE&numShards=2&collection=oneInstanceCollection&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3087 2> 219373 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c: s: r: x: t:null-596] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&newCollection=true&name=oneInstanceCollection_shard1_replica_n4&action=CREATE&numShards=2&collection=oneInstanceCollection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3088 2> 219388 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:oneInstanceCollection s: r: x: t:null-593] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 219479 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 691] for collection [oneInstanceCollection] has occurred - updating... (live nodes size: [5]) 2> 219479 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 691] for collection [oneInstanceCollection] has occurred - updating... 
(live nodes size: [5]) 2> 219479 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 691] for collection [oneInstanceCollection] has occurred - updating... (live nodes size: [5]) 2> 219493 INFO (qtp1128963290-592) [n:127.0.0.1:43017_ c:oneInstanceCollection s: r: x: t:null-593] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=oneInstanceCollection&nrtReplicas=2&action=CREATE&numShards=2&createNodeSet=127.0.0.1:40707_&wt=javabin&version=2} status=0 QTime=3649 2> 219529 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection:oneInstanceCollection failOnTimeout:true timeout:330SECONDS 2> 219534 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Recoveries finished - collection:oneInstanceCollection 2> 219634 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-598] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection/terms/shard1 to Terms{values={core_node7=1, core_node8=1}, version=2} for ensureHighestTermsAreNotZero 2> 219646 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-598] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n6/&wt=javabin&version=2}{add=[1 (1788735548759736320)]} 0 11 2> 219649 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-598] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n4/&wt=javabin&version=2}{add=[1 (1788735548759736320)]} 0 58 2> 219652 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-598] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[1]} 0 102 2> 219681 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-599] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection/terms/shard2 to Terms{values={core_node3=1, core_node5=1}, version=2} for ensureHighestTermsAreNotZero 2> 219705 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-599] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n1/&wt=javabin&version=2}{add=[2 (1788735548831039488)]} 0 23 2> 219708 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-599] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[2 (1788735548831039488)]} 0 46 2> 219748 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 
x:oneInstanceCollection_shard2_replica_n2 t:null-600] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n1/&wt=javabin&version=2}{add=[3 (1788735548893954048)]} 0 13 2> 219756 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-600] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n2/&wt=javabin&version=2}{add=[3 (1788735548893954048)]} 0 32 2> 219758 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-600] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={wt=javabin&version=2}{add=[3]} 0 41 2> 219897 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000032 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 219948 INFO (searcherExecutor-459-thread-1-processing-oneInstanceCollection_shard2_replica_n2 null-601 core_node5 127.0.0.1:40707_ oneInstanceCollection shard2) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-601] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 219951 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-601] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 148 2> 219954 INFO (searcherExecutor-456-thread-1-processing-oneInstanceCollection_shard2_replica_n1 null-601 core_node3 127.0.0.1:40707_ oneInstanceCollection shard2) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-601] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 219961 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-601] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 177 2> 220141 INFO (searcherExecutor-461-thread-1-processing-oneInstanceCollection_shard1_replica_n6 null-601 core_node8 127.0.0.1:40707_ oneInstanceCollection shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-601] o.a.s.c.SolrCore Registered new searcher autowarm time: 5 ms 2> 220163 INFO (searcherExecutor-457-thread-1-processing-oneInstanceCollection_shard1_replica_n4 null-601 core_node7 127.0.0.1:40707_ oneInstanceCollection shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-601] o.a.s.c.SolrCore Registered new searcher autowarm time: 5 ms 2> 220168 INFO (qtp600427849-520) 
[n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-601] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 193 2> 220171 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-601] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 404 2> 220179 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-602] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=1 status=0 QTime=2 2> 220187 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-603] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=1 status=0 QTime=2 2> 220194 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-604] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=2 status=0 QTime=2 2> 220202 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-605] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=false&wt=javabin&version=2} hits=2 status=0 QTime=1 2> 220224 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-606] o.a.s.c.S.Request webapp= path=/select params={distrib=false&df=text&_stateVer_=oneInstanceCollection:5&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=oneInstanceCollection&rid=null-606&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871152312&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 220227 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-606] o.a.s.c.S.Request webapp= path=/select params={distrib=false&df=text&_stateVer_=oneInstanceCollection:5&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&collection=oneInstanceCollection&rid=null-606&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871152312&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 220235 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-606] o.a.s.c.S.Request webapp= path=/select params={distrib=false&df=text&_stateVer_=oneInstanceCollection:5&shards.purpose=64&collection=oneInstanceCollection&rid=null-606&version=2&q=*:*&omitHeader=false&NOW=1705871152312&ids=2,3&isShard=true&wt=javabin} status=0 QTime=2 2> 220237 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-606] o.a.s.c.S.Request webapp= path=/select 
params={distrib=false&df=text&_stateVer_=oneInstanceCollection:5&shards.purpose=64&collection=oneInstanceCollection&rid=null-606&version=2&q=*:*&omitHeader=false&NOW=1705871152312&ids=1&isShard=true&wt=javabin} status=0 QTime=2 2> 220240 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-606] o.a.s.c.S.Request webapp= path=/select params={q=*:*&distrib=true&_stateVer_=oneInstanceCollection:5&collection=oneInstanceCollection&wt=javabin&version=2} rid=null-606 hits=3 status=0 QTime=26 2> 220247 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractBasicDistributedZkTestBase ### STARTING testSearchByCollectionName 2> 220268 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-607] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-607&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871152358&isShard=true&wt=javabin} hits=2 status=0 QTime=2 2> 220270 INFO (qtp600427849-517) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-607] o.a.s.c.S.Request webapp= path=/select params={df=text&distrib=false&fl=id&fl=score&shards.purpose=16388&start=0&fsv=true&rid=null-607&rows=10&version=2&q=*:*&omitHeader=false&NOW=1705871152358&isShard=true&wt=javabin} hits=1 status=0 QTime=2 2> 220279 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-607] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871152358&ids=2,3&isShard=true&rid=null-607&wt=javabin&version=2} status=0 QTime=2 2> 220281 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-607] o.a.s.c.S.Request webapp= path=/select params={q=*:*&df=text&distrib=false&omitHeader=false&shards.purpose=64&NOW=1705871152358&ids=1&isShard=true&rid=null-607&wt=javabin&version=2} status=0 QTime=2 2> 220284 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-607] o.a.s.c.S.Request webapp= path=/select params={q=*:*&wt=javabin&version=2} rid=null-607 hits=3 status=0 QTime=24 2> 220288 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractBasicDistributedZkTestBase ### STARTING testUpdateByCollectionName 2> 220339 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:null-608] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard2_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 5 2> 220342 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:null-608] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update 
params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 29 2> 220384 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:null-608] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:40707/oneInstanceCollection_shard1_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 18 2> 220387 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:null-608] o.a.s.u.p.LogUpdateProcessorFactory webapp= path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 86 2> 220391 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractBasicDistributedZkTestBase ### STARTING testANewCollectionInOneInstanceWithManualShardAssignement 2> 220434 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection oneInstanceCollection2 2> 220678 WARN (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (oneInstanceCollection2) without cores. 2> 220691 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:oneInstanceCollection2 s: r: x: t:null-609] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 220695 INFO (qtp1128963290-593) [n:127.0.0.1:43017_ c:oneInstanceCollection2 s: r: x: t:null-609] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=oneInstanceCollection2&nrtReplicas=2&action=CREATE&numShards=2&createNodeSet=&wt=javabin&version=2} status=0 QTime=299 2> 220734 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000034 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 220754 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:40707_ for creating new replica of shard shard1 for collection oneInstanceCollection2 2> 220772 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 220785 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"oneInstanceCollection2_shard1_replica_n1", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"oneInstanceCollection2", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 220906 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 715] for collection [oneInstanceCollection2] has occurred - updating... 
(live nodes size: [5]) 2> 220919 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&name=oneInstanceCollection2_shard1_replica_n1&action=CREATE&collection=oneInstanceCollection2&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 220979 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 220991 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.s.IndexSchema Schema name=test 2> 221217 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 221411 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.CoreContainer Creating SolrCore 'oneInstanceCollection2_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 221416 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection2_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection2_shard1_replica_n1/data/] 2> 221457 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 221475 WARN (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 221703 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 221704 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 221724 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 221725 
INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 221735 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 221752 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 221755 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 221760 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 221761 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735551028854784 2> 221798 INFO (searcherExecutor-479-thread-1-processing-oneInstanceCollection2_shard1_replica_n1 null-611 core_node2 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 221802 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection2/terms/shard1 to Terms{values={core_node2=0}, version=0} for registerTerm 2> 221803 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/oneInstanceCollection2/leaders/shard1 2> 221838 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 221839 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 221839 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40707/oneInstanceCollection2_shard1_replica_n1/ 2> 221841 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 221844 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.SyncStrategy http://127.0.0.1:40707/oneInstanceCollection2_shard1_replica_n1/ has no replicas 2> 221844 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/oneInstanceCollection2/leaders/shard1/leader after winning as /collections/oneInstanceCollection2/leader_elect/shard1/election/72077803529764874-core_node2-n_0000000000 2> 221866 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40707/oneInstanceCollection2_shard1_replica_n1/ shard1 2> 221980 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 733] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 221989 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-611] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 222024 INFO (qtp600427849-518) [n:127.0.0.1:40707_ c: s: r: x: t:null-611] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&name=oneInstanceCollection2_shard1_replica_n1&action=CREATE&collection=oneInstanceCollection2&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1107 2> 222043 INFO (qtp1128963290-590) [n:127.0.0.1:43017_ c:oneInstanceCollection2 s: r: x: t:null-610] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:40707_&action=ADDREPLICA&collection=oneInstanceCollection2&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1340 2> 222086 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000036 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 222107 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:40707_ for creating new replica of shard shard2 for collection oneInstanceCollection2 2> 222111 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 743] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 222111 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 743] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 222127 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 222141 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"oneInstanceCollection2_shard2_replica_n3", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"oneInstanceCollection2", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 222266 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 748] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 222266 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 748] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 222266 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 748] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 222281 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=oneInstanceCollection2_shard2_replica_n3&action=CREATE&collection=oneInstanceCollection2&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 222322 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 222334 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.s.IndexSchema Schema name=test 2> 222416 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 751] for collection [oneInstanceCollection2] has occurred - updating... 
(live nodes size: [5]) 2> 222416 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 751] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 222524 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 222718 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.CoreContainer Creating SolrCore 'oneInstanceCollection2_shard2_replica_n3' using configuration from configset conf1, trusted=true 2> 222723 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection2_shard2_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection2_shard2_replica_n3/data/] 2> 222755 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=26, maxMergedSegmentMB=49.734910011291504, floorSegmentMB=0.3930988311767578, forceMergeDeletesPctAllowed=19.244929011688583, segmentsPerTier=32.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.844874139736486, deletesPctAllowed=38.311155010312675 2> 222792 WARN (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 223095 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 223096 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 223115 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 223115 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 223127 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: 
[TieredMergePolicy: maxMergeAtOnce=25, maxMergedSegmentMB=98.51693820953369, floorSegmentMB=1.4527759552001953, forceMergeDeletesPctAllowed=2.9097891812865595, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.5977428682331861, deletesPctAllowed=37.29251706615804 2> 223141 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 223145 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 223151 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 223152 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735552487424000 2> 223189 INFO (searcherExecutor-485-thread-1-processing-oneInstanceCollection2_shard2_replica_n3 null-613 core_node4 127.0.0.1:40707_ oneInstanceCollection2 shard2) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 223191 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection2/terms/shard2 to Terms{values={core_node4=0}, version=0} for registerTerm 2> 223192 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/oneInstanceCollection2/leaders/shard2 2> 223223 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 223223 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 223223 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40707/oneInstanceCollection2_shard2_replica_n3/ 2> 223224 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 223224 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.SyncStrategy http://127.0.0.1:40707/oneInstanceCollection2_shard2_replica_n3/ has no replicas 2> 223225 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/oneInstanceCollection2/leaders/shard2/leader after winning as /collections/oneInstanceCollection2/leader_elect/shard2/election/72077803529764874-core_node4-n_0000000000 2> 223244 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40707/oneInstanceCollection2_shard2_replica_n3/ shard2 2> 223358 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 766] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223358 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 766] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223364 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:null-613] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 223378 INFO (qtp600427849-520) [n:127.0.0.1:40707_ c: s: r: x: t:null-613] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=oneInstanceCollection2_shard2_replica_n3&action=CREATE&collection=oneInstanceCollection2&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1098 2> 223400 INFO (qtp1128963290-589) [n:127.0.0.1:43017_ c:oneInstanceCollection2 s: r: x: t:null-612] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:40707_&action=ADDREPLICA&collection=oneInstanceCollection2&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1343 2> 223456 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000038 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 223481 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:40707_ for creating new replica of shard shard1 for collection oneInstanceCollection2 2> 223482 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 776] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223482 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 776] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223502 INFO (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 223516 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"oneInstanceCollection2_shard1_replica_n5", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "collection":"oneInstanceCollection2", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 223634 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 781] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223634 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 781] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223634 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 781] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223650 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=oneInstanceCollection2_shard1_replica_n5&action=CREATE&collection=oneInstanceCollection2&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 223691 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 223703 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.s.IndexSchema Schema name=test 2> 223779 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 784] for collection [oneInstanceCollection2] has occurred - updating... 
(live nodes size: [5]) 2> 223779 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 784] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 223886 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 238337 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.CoreContainer Creating SolrCore 'oneInstanceCollection2_shard1_replica_n5' using configuration from configset conf1, trusted=true 2> 238386 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection2_shard1_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001/shard-1-001/cores/oneInstanceCollection2_shard1_replica_n5/data/] 2> 238426 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=19, maxMergedSegmentMB=10.273061752319336, floorSegmentMB=0.749781608581543, forceMergeDeletesPctAllowed=12.565230667147118, segmentsPerTier=12.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.13744209775263264, deletesPctAllowed=45.32596601683406 2> 238478 WARN (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 238658 ERROR (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard1 r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:40707 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 11 more 2> 238708 ERROR (OverseerThreadFactory-250-thread-4) [n: c:oneInstanceCollection2 s:shard1 r: x: t:] o.a.s.c.a.c.OverseerCollectionMessageHandler Collection oneInstanceCollection2}, operation addreplica failed 2> => org.apache.solr.common.SolrException: ADDREPLICA failed to create replica 2> at org.apache.solr.cloud.api.collections.CollectionHandlingUtils$ShardRequestTracker.processResponses(CollectionHandlingUtils.java:677) 2> org.apache.solr.common.SolrException: ADDREPLICA failed to create replica 2> at org.apache.solr.cloud.api.collections.CollectionHandlingUtils$ShardRequestTracker.processResponses(CollectionHandlingUtils.java:677) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.CollectionHandlingUtils$ShardRequestTracker.processResponses(CollectionHandlingUtils.java:656) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.AddReplicaCmd.lambda$addReplica$1(AddReplicaCmd.java:200) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.AddReplicaCmd.addReplica(AddReplicaCmd.java:233) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.AddReplicaCmd.call(AddReplicaCmd.java:85) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.processMessage(OverseerCollectionMessageHandler.java:129) [main/:?] 2> at org.apache.solr.cloud.OverseerTaskProcessor$Runner.run(OverseerTaskProcessor.java:564) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Caused by: org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> ... 3 more 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> ... 
3 more 2> 238739 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 238739 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 238761 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 238762 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 238775 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=35, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0] 2> 238763 ERROR (qtp1128963290-588) [n:127.0.0.1:43017_ c:oneInstanceCollection2 s: r: x: t:null-614] o.a.s.h.RequestHandlerBase Server exception 2> => org.apache.solr.common.SolrException: ADDREPLICA failed to create replica 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) 2> org.apache.solr.common.SolrException: ADDREPLICA failed to create replica 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.admin.api.AdminAPIBase.submitRemoteMessageAndHandleResponse(AdminAPIBase.java:141) ~[main/:?] 2> at org.apache.solr.handler.admin.api.CreateReplicaAPI.createReplica(CreateReplicaAPI.java:105) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.lambda$static$22(CollectionsHandler.java:949) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.execute(CollectionsHandler.java:1265) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.invokeAction(CollectionsHandler.java:315) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.handleRequestBody(CollectionsHandler.java:293) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 
2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:822) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) 
[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 238786 INFO (qtp1128963290-588) [n:127.0.0.1:43017_ c:oneInstanceCollection2 s: r: x: t:null-614] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:40707_&action=ADDREPLICA&collection=oneInstanceCollection2&shard=shard1&type=NRT&wt=javabin&version=2} status=500 QTime=15361 2> 238801 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 238810 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 238791 ERROR (qtp1128963290-588) [n:127.0.0.1:43017_ c:oneInstanceCollection2 s: r: x: t:null-614] o.a.s.s.HttpSolrCall 500 Exception 2> => org.apache.solr.common.SolrException: ADDREPLICA failed to create replica 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) 2> org.apache.solr.common.SolrException: ADDREPLICA failed to create replica 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.admin.api.AdminAPIBase.submitRemoteMessageAndHandleResponse(AdminAPIBase.java:141) ~[main/:?] 2> at org.apache.solr.handler.admin.api.CreateReplicaAPI.createReplica(CreateReplicaAPI.java:105) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.lambda$static$22(CollectionsHandler.java:949) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.execute(CollectionsHandler.java:1265) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.invokeAction(CollectionsHandler.java:315) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.handleRequestBody(CollectionsHandler.java:293) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) ~[main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 
2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:822) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [jetty-io-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 238815 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 238818 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735568914415616 2> 238854 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Ending test 2> 238856 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.ZkShardTerms Successful update of terms at /collections/oneInstanceCollection2/terms/shard1 to Terms{values={core_node2=0, core_node6=0}, version=1} for registerTerm 2> 238857 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/oneInstanceCollection2/leaders/shard1 2> 238898 INFO (searcherExecutor-491-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 238906 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.ZkController Core needs to recover:oneInstanceCollection2_shard1_replica_n5 2> 238907 INFO (updateExecutor-283-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.DefaultSolrCoreState Running recovery 2> 238910 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 238910 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 238920 INFO (qtp600427849-516) [n:127.0.0.1:40707_ c: s: r: x: t:null-615] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=oneInstanceCollection2_shard1_replica_n5&action=CREATE&collection=oneInstanceCollection2&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=15273 2> 238968 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-616] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=35 2> 238968 INFO (qtp600427849-519) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:null-616] o.a.s.c.S.Request webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=37 2> 239017 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[oneInstanceCollection2_shard1_replica_n5] 2> 239085 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 239085 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Publishing state of core [oneInstanceCollection2_shard1_replica_n5] as recovering, leader is [http://127.0.0.1:40707/oneInstanceCollection2_shard1_replica_n1/] and I am [http://127.0.0.1:40707/oneInstanceCollection2_shard1_replica_n5/] 2> 239095 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:40707]; [WaitForState: action=PREPRECOVERY&core=oneInstanceCollection2_shard1_replica_n1&nodeName=127.0.0.1:40707_&coreNodeName=core_node6&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 239113 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 796] for collection [oneInstanceCollection2] has occurred - updating... 
(live nodes size: [5]) 2> 239113 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 796] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [5]) 2> 239142 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c: s: r: x:oneInstanceCollection2_shard1_replica_n1 t:null-617] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node6, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 239143 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c: s: r: x: t:null-617] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=oneInstanceCollection2, shard=shard1, thisCore=oneInstanceCollection2_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:40707_, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"oneInstanceCollection2_shard1_replica_n5", 2> "node_name":"127.0.0.1:40707_", 2> "base_url":"http://127.0.0.1:40707", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 239144 INFO (qtp600427849-515) [n:127.0.0.1:40707_ c: s: r: x: t:null-617] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:40707_&onlyIfLeaderActive=true&core=oneInstanceCollection2_shard1_replica_n1&coreNodeName=core_node6&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=4 2> 239345 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@86332bd{STOPPING}[10.0.19,sto=0] 2> 239351 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@45540c3e{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 239353 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@46d0a8c2{STOPPING}[10.0.19,sto=0] 2> 239354 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@761e3cbc{STOPPING}[10.0.19,sto=0] 2> 239354 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@582d86f1{STOPPING}[10.0.19,sto=0] 2> 239354 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@4609ed2a{STOPPING}[10.0.19,sto=0] 2> 239356 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@3237e41b{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 239360 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@510f304{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 239361 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@25705235{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 239363 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@74705c4d{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 239363 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@231656cb{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 239377 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped 
o.e.j.s.ServletContextHandler@467365d4{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 239380 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@11c811bb{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 239381 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@27f64de8{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 239389 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@58ad96b0{/,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 239406 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1819604714 2> 239406 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=604847438 2> 239406 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=842909930 2> 239406 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1726900532 2> 239407 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:43017_ 2> 239407 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:32885_ 2> 239407 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:37133_ 2> 239407 WARN (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.c.RecoveryStrategy Stopping recovery for core=[oneInstanceCollection2_shard1_replica_n5] coreNodeName=[core_node6] 2> 239409 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:40707_ 2> 239413 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 239414 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:43017_ as DOWN 2> 239416 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 239416 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:32885_ as DOWN 2> 239417 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 239417 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:40707_ as DOWN 2> 239419 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 239419 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:37133_ as DOWN 2> 239420 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 239420 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 239421 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 239421 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(5) -> (1) 2> 239421 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 239452 INFO (coreCloseExecutor-507-thread-1) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@4cdde0d7 collection1_shard1_replica_n1 2> 239452 INFO (coreCloseExecutor-507-thread-2) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@641fe203 collection2_shard1_replica_n7 2> 239453 INFO (coreCloseExecutor-509-thread-1) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7babf325 collection1_shard2_replica_n5 2> 239453 INFO (coreCloseExecutor-507-thread-1) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@4cdde0d7 2> 239453 INFO (coreCloseExecutor-508-thread-1) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@60b877ba collection1_shard2_replica_n2 2> 239453 INFO (coreCloseExecutor-509-thread-1) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard2.replica_n5 tag=SolrCore@7babf325 2> 239453 INFO (coreCloseExecutor-508-thread-1) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard2.replica_n2 tag=SolrCore@60b877ba 2> 239454 INFO (coreCloseExecutor-508-thread-2) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@6aa85742 collection2_shard2_replica_n1 2> 239454 INFO (coreCloseExecutor-508-thread-3) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2b2f1822 collection3_shard2_replica_n1 2> 239458 INFO (coreCloseExecutor-509-thread-3) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@62747a26 collection3_shard2_replica_n5 2> 239460 INFO (coreCloseExecutor-508-thread-4) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@352df53f oneInstanceCollection_shard2_replica_n2 2> 239455 INFO (coreCloseExecutor-509-thread-2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@6f3c7edf collection2_shard2_replica_n5 2> 239460 INFO (coreCloseExecutor-510-thread-1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5a464cb7 collection1_shard1_replica_n6 2> 239460 INFO (coreCloseExecutor-507-thread-3) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:] 
o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5308781c collection3_shard1_replica_n7 2> 239461 INFO (coreCloseExecutor-510-thread-1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n6 tag=SolrCore@5a464cb7 2> 239461 INFO (coreCloseExecutor-510-thread-2) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@60615ba3 collection2_shard1_replica_n3 2> 239462 INFO (coreCloseExecutor-508-thread-5) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@20cefd01 oneInstanceCollection_shard2_replica_n1 2> 239462 INFO (coreCloseExecutor-510-thread-3) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@3316dc3e collection3_shard1_replica_n3 2> 239462 INFO (coreCloseExecutor-508-thread-6) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@561a5a38 oneInstanceCollection_shard1_replica_n4 2> 239463 INFO (coreCloseExecutor-508-thread-7) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@309f6061 oneInstanceCollection_shard1_replica_n6 2> 239464 INFO (coreCloseExecutor-508-thread-8) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2f139be8 oneInstanceCollection2_shard1_replica_n1 2> 239464 INFO (coreCloseExecutor-508-thread-9) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@1b4f12a7 oneInstanceCollection2_shard2_replica_n3 2> 239582 INFO (zkCallback-340-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 807] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 239581 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 807] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 239582 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 807] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 239582 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 807] for collection [collection1] has occurred - updating... 
(live nodes size: [1]) 2> 239582 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 807] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 239582 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 807] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 239593 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 808] for collection [oneInstanceCollection] has occurred - updating... (live nodes size: [1]) 2> 239594 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection/state.json zxid: 808] for collection [oneInstanceCollection] has occurred - updating... (live nodes size: [1]) 2> 239602 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 809] for collection [collection2] has occurred - updating... (live nodes size: [1]) 2> 239602 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 809] for collection [collection2] has occurred - updating... (live nodes size: [1]) 2> 239602 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 809] for collection [collection2] has occurred - updating... (live nodes size: [1]) 2> 239602 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 809] for collection [collection2] has occurred - updating... (live nodes size: [1]) 2> 239602 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 809] for collection [collection2] has occurred - updating... (live nodes size: [1]) 2> 239602 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection2/state.json zxid: 809] for collection [collection2] has occurred - updating... (live nodes size: [1]) 2> 239612 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 810] for collection [collection3] has occurred - updating... 
(live nodes size: [1]) 2> 239612 INFO (zkCallback-299-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 810] for collection [collection3] has occurred - updating... (live nodes size: [1]) 2> 239612 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 810] for collection [collection3] has occurred - updating... (live nodes size: [1]) 2> 239612 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 810] for collection [collection3] has occurred - updating... (live nodes size: [1]) 2> 239612 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 810] for collection [collection3] has occurred - updating... (live nodes size: [1]) 2> 239612 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection3/state.json zxid: 810] for collection [collection3] has occurred - updating... (live nodes size: [1]) 2> 239624 INFO (zkCallback-288-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 811] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [1]) 2> 239624 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/oneInstanceCollection2/state.json zxid: 811] for collection [oneInstanceCollection2] has occurred - updating... (live nodes size: [1]) 2> 239648 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=205625851 2> 239649 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:45297_ 2> 239650 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:40707/oneInstanceCollection2_shard1_replica_n1/] - recoveringAfterStartup=[true] 2> 239654 WARN (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 239654 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 
2> 239654 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:45297_ as DOWN 2> 239655 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 239655 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy RecoveryStrategy has been closed 2> 239655 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[false] msTimeTaken=744.0 2> 239656 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=745.0 2> 239656 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@186bd751 oneInstanceCollection2_shard1_replica_n5 2> 239657 INFO (zkCallback-326-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 239657 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 239658 INFO (zkCallback-299-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 239658 INFO (zkCallback-340-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 239660 INFO (zkCallback-288-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 239681 INFO (coreCloseExecutor-513-thread-1) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5b9635c5 control_collection_shard1_replica_n1 2> 239683 INFO (coreCloseExecutor-513-thread-1) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@5b9635c5 2> 239794 INFO (zkCallback-244-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 815] for collection [control_collection] has occurred - updating... 
(live nodes size: [0]) 2> 239794 INFO (zkCallback-244-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 815] for collection [control_collection] has occurred - updating... (live nodes size: [0]) 2> 240099 INFO (coreCloseExecutor-507-thread-1) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@4cdde0d7 2> 240100 INFO (coreCloseExecutor-507-thread-2) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection2.shard1.replica_n7 tag=SolrCore@641fe203 2> 240103 INFO (coreCloseExecutor-510-thread-1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@5a464cb7 2> 240103 INFO (coreCloseExecutor-510-thread-2) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection2.shard1.replica_n3 tag=SolrCore@60615ba3 2> 240111 INFO (coreCloseExecutor-507-thread-1) [n:127.0.0.1:43017_ c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 240112 INFO (coreCloseExecutor-510-thread-1) [n:127.0.0.1:37133_ c:collection1 s:shard1 r:core_node8 x:collection1_shard1_replica_n6 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 240128 INFO (coreCloseExecutor-509-thread-3) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection3.shard2.replica_n5 tag=SolrCore@62747a26 2> 240129 INFO (coreCloseExecutor-508-thread-1) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard2.leader tag=SolrCore@60b877ba 2> 240130 INFO (coreCloseExecutor-508-thread-2) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection2.shard2.replica_n1 tag=SolrCore@6aa85742 2> 240139 INFO (coreCloseExecutor-508-thread-1) [n:127.0.0.1:40707_ c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 240302 INFO (OverseerCollectionConfigSetProcessor-72077803529764868-127.0.0.1:45297_-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000040 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 240394 INFO (coreCloseExecutor-513-thread-1) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@5b9635c5 2> 240406 INFO (coreCloseExecutor-513-thread-1) [n:127.0.0.1:45297_ c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 240456 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 240774 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 240866 INFO (coreCloseExecutor-508-thread-2) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection2.shard2.leader tag=SolrCore@6aa85742 2> 240867 INFO (coreCloseExecutor-508-thread-3) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection3.shard2.replica_n1 tag=SolrCore@2b2f1822 2> 240880 INFO (coreCloseExecutor-508-thread-2) [n:127.0.0.1:40707_ c:collection2 s:shard2 r:core_node2 x:collection2_shard2_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 240913 INFO (coreCloseExecutor-510-thread-2) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection2.shard1.leader tag=SolrCore@60615ba3 2> 240913 INFO (coreCloseExecutor-510-thread-3) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection3.shard1.replica_n3 tag=SolrCore@3316dc3e 2> 240915 INFO (coreCloseExecutor-509-thread-3) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection3.shard2.leader tag=SolrCore@62747a26 2> 240915 INFO (coreCloseExecutor-509-thread-2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection2.shard2.replica_n5 tag=SolrCore@6f3c7edf 2> 240924 INFO (coreCloseExecutor-510-thread-2) [n:127.0.0.1:37133_ c:collection2 s:shard1 r:core_node4 x:collection2_shard1_replica_n3 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 240936 INFO (coreCloseExecutor-509-thread-3) [n:127.0.0.1:32885_ c:collection3 s:shard2 r:core_node6 x:collection3_shard2_replica_n5 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 240946 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 240947 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 240956 INFO (closeThreadPool-514-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077803529764868-127.0.0.1:45297_-n_0000000000) closing 2> 240960 INFO (OverseerStateUpdate-72077803529764868-127.0.0.1:45297_-n_0000000000) [n:127.0.0.1:45297_ c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:45297_ 2> 240985 INFO (closeThreadPool-514-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077803529764868-127.0.0.1:45297_-n_0000000000) closing 2> 240995 INFO (coreCloseExecutor-507-thread-3) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection3.shard1.replica_n7 tag=SolrCore@5308781c 2> 241101 INFO (closeThreadPool-498-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077803529764868-127.0.0.1:45297_-n_0000000000) closing 2> 241538 INFO (coreCloseExecutor-510-thread-3) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection3.shard1.leader tag=SolrCore@3316dc3e 2> 241540 INFO (coreCloseExecutor-509-thread-2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection2.shard2.leader tag=SolrCore@6f3c7edf 2> 241540 INFO (coreCloseExecutor-509-thread-1) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard2.leader tag=SolrCore@7babf325 2> 241548 INFO (coreCloseExecutor-510-thread-3) [n:127.0.0.1:37133_ c:collection3 s:shard1 r:core_node4 x:collection3_shard1_replica_n3 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 241552 INFO (coreCloseExecutor-508-thread-3) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection3.shard2.leader tag=SolrCore@2b2f1822 2> 241554 INFO (coreCloseExecutor-508-thread-4) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.oneInstanceCollection.shard2.replica_n2 tag=SolrCore@352df53f 2> 241564 INFO (coreCloseExecutor-509-thread-2) [n:127.0.0.1:32885_ c:collection2 s:shard2 r:core_node6 x:collection2_shard2_replica_n5 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 241571 INFO (coreCloseExecutor-509-thread-1) [n:127.0.0.1:32885_ c:collection1 s:shard2 r:core_node7 x:collection1_shard2_replica_n5 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 
2> 241573 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 241585 INFO (coreCloseExecutor-508-thread-3) [n:127.0.0.1:40707_ c:collection3 s:shard2 r:core_node2 x:collection3_shard2_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 241603 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 241604 INFO (coreCloseExecutor-507-thread-3) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection3.shard1.leader tag=SolrCore@5308781c 2> 241604 INFO (coreCloseExecutor-507-thread-2) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection2.shard1.leader tag=SolrCore@641fe203 2> 241616 INFO (coreCloseExecutor-507-thread-3) [n:127.0.0.1:43017_ c:collection3 s:shard1 r:core_node8 x:collection3_shard1_replica_n7 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 241624 INFO (coreCloseExecutor-507-thread-2) [n:127.0.0.1:43017_ c:collection2 s:shard1 r:core_node8 x:collection2_shard1_replica_n7 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 241638 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 241817 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 241830 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 241910 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 241917 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 241917 INFO (closeThreadPool-498-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 241934 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 241935 INFO (closeThreadPool-498-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 242003 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 242004 INFO (closeThreadPool-498-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 242039 INFO (coreCloseExecutor-508-thread-4) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.oneInstanceCollection.shard2.leader tag=SolrCore@352df53f 2> 242039 INFO (coreCloseExecutor-508-thread-6) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for 
registry=solr.core.oneInstanceCollection.shard1.replica_n4 tag=SolrCore@561a5a38 2> 242051 INFO (coreCloseExecutor-508-thread-4) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node5 x:oneInstanceCollection_shard2_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 242310 INFO (coreCloseExecutor-508-thread-6) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.oneInstanceCollection.shard1.leader tag=SolrCore@561a5a38 2> 242311 INFO (coreCloseExecutor-508-thread-5) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.oneInstanceCollection.shard2.replica_n1 tag=SolrCore@20cefd01 2> 242321 INFO (coreCloseExecutor-508-thread-6) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node7 x:oneInstanceCollection_shard1_replica_n4 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 242586 INFO (coreCloseExecutor-508-thread-5) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.oneInstanceCollection.shard2.leader tag=SolrCore@20cefd01 2> 242586 INFO (coreCloseExecutor-508-thread-8) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.oneInstanceCollection2.shard1.replica_n1 tag=SolrCore@2f139be8 2> 242592 INFO (coreCloseExecutor-508-thread-5) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard2 r:core_node3 x:oneInstanceCollection_shard2_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 242886 INFO (coreCloseExecutor-508-thread-8) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.oneInstanceCollection2.shard1.leader tag=SolrCore@2f139be8 2> 242886 INFO (coreCloseExecutor-508-thread-7) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.oneInstanceCollection.shard1.replica_n6 tag=SolrCore@309f6061 2> 242896 INFO (coreCloseExecutor-508-thread-8) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node2 x:oneInstanceCollection2_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 243176 INFO (coreCloseExecutor-508-thread-7) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.oneInstanceCollection.shard1.leader tag=SolrCore@309f6061 2> 243177 INFO (coreCloseExecutor-508-thread-9) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.oneInstanceCollection2.shard2.replica_n3 tag=SolrCore@1b4f12a7 2> 243185 INFO (coreCloseExecutor-508-thread-7) [n:127.0.0.1:40707_ c:oneInstanceCollection s:shard1 r:core_node8 x:oneInstanceCollection_shard1_replica_n6 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 243463 INFO (coreCloseExecutor-508-thread-9) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.oneInstanceCollection2.shard2.leader tag=SolrCore@1b4f12a7 2> 243464 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.oneInstanceCollection2.shard1.replica_n5 tag=SolrCore@186bd751 2> 243471 INFO (coreCloseExecutor-508-thread-9) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard2 r:core_node4 x:oneInstanceCollection2_shard2_replica_n3 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 243771 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.oneInstanceCollection2.shard1.leader tag=SolrCore@186bd751 2> 243772 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 243778 WARN (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.c.RecoveryStrategy Stopping recovery for core=[oneInstanceCollection2_shard1_replica_n5] coreNodeName=[core_node6] 2> 243778 INFO (recoveryExecutor-285-thread-1-processing-oneInstanceCollection2_shard1_replica_n5 null-615 core_node6 127.0.0.1:40707_ oneInstanceCollection2 shard1) [n:127.0.0.1:40707_ c:oneInstanceCollection2 s:shard1 r:core_node6 x:oneInstanceCollection2_shard1_replica_n5 t:null-615] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 243815 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null
2> 243830 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null
2> 243831 INFO (closeThreadPool-498-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null
2> 243959 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-10 after mutting 0 log messages
2> 243961 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-12 for ERROR logs matching regex: ignore_exception
2> 243964 INFO (TEST-BasicDistributedZkTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Shutting down ZkTestServer.
2> 244191 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Watch limit violations:
2> Maximum concurrent create/delete watches above limit:
2>
2> 11 /solr/aliases.json
2> 11 /solr/clusterprops.json
2> 7 /solr/collections/collection2/terms/shard1
2> 7 /solr/collections/collection2/terms/shard2
2> 7 /solr/collections/collection3/terms/shard1
2> 7 /solr/collections/collection3/terms/shard2
2> 7 /solr/collections/collection1/terms/shard1
2> 7 /solr/collections/collection1/terms/shard2
2> 5 /solr/packages.json
2> 5 /solr/security.json
2> 5 /solr/configs/conf1
2> 4 /solr/collections/collection1/collectionprops.json
2> 4 /solr/collections/collection2/collectionprops.json
2> 4 /solr/collections/oneInstanceCollection/terms/shard1
2> 4 /solr/collections/oneInstanceCollection/terms/shard2
2> 4 /solr/collections/collection3/collectionprops.json
2> 3 /solr/collections/control_collection/terms/shard1
2> 3 /solr/collections/oneInstanceCollection2/terms/shard1
2> 2 /solr/collections/oneInstanceCollection2/terms/shard2
2>
2> Maximum concurrent data watches above limit:
2>
2> 69 /solr/collections/collection3/state.json
2> 69 /solr/collections/collection2/state.json
2> 50 /solr/collections/collection1/state.json
2> 27 /solr/collections/oneInstanceCollection2/state.json
2> 13 /solr/collections/oneInstanceCollection/state.json
2> 8 /solr/collections/control_collection/state.json
2> 3 /solr/overseer_elect/election/72077803529764874-127.0.0.1:40707_-n_0000000001
2>
2> Maximum concurrent children watches above limit:
2>
2> 156 /solr/overseer/queue
2> 74 /solr/overseer/collection-queue-work
2> 45 /solr/collections
2> 40 /solr/live_nodes
2> 10 /solr/collections/collection3/state.json
2> 10 /solr/collections/collection2/state.json
2> 9 /solr/collections/collection1/state.json
2> 5 /solr/collections/oneInstanceCollection2/state.json
2> 4 /solr/collections/oneInstanceCollection/state.json
2> 3 /solr/collections/control_collection/state.json
2> > org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:43017: ADDREPLICA failed to create replica > at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:AC38A77E6CABADD9]:0) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) > at app//org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:386) > at 
app//org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:352) > at app//org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1199) > at app//org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:898) > at app//org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:826) > at app//org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:234) > at app//org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:249) > at app//org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1340) > at app//org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) > at app//org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) > at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) > at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) > at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) > at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) > at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) > at 
app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base@11.0.22/java.lang.Thread.run(Thread.java:829) 2> NOTE: reproduce with: gradlew test --tests BasicDistributedZkTest.test -Dtests.seed=246C98A4C257C021 -Dtests.locale=os-RU -Dtests.timezone=America/El_Salvador -Dtests.asserts=true -Dtests.file.encoding=UTF-8 2> 244432 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-12 after mutting 0 log messages 2> 244433 INFO (SUITE-BasicDistributedZkTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-13 for ERROR logs matching regex: ignore_exception 2> NOTE: leaving temporary files on disk at: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.BasicDistributedZkTest_246C98A4C257C021-001 > java.lang.AssertionError: ObjectTracker found 2 object(s) that were not released!!! 
[InternalHttpClient, InternalHttpClient] > org.apache.http.impl.client.InternalHttpClient:org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.http.impl.client.InternalHttpClient > at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:52) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:378) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:387) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:313) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:298) > at org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:180) > at org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) > at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) > at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) > at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) > at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) > at 
com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) > at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) > at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) > at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) > at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base/java.lang.Thread.run(Thread.java:829) > > org.apache.http.impl.client.InternalHttpClient:org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.http.impl.client.InternalHttpClient > at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:52) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:378) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:387) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:313) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:298) > at org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:180) > at 
org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) > at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) > at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) > at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) > at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) > at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) > at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) > at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) > at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at 
org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base/java.lang.Thread.run(Thread.java:829) > > expected null, but was: org.apache.http.impl.client.InternalHttpClient:org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.http.impl.client.InternalHttpClient > at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:52) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:378) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:387) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:313) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:298) > at org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:180) > at org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) > at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) > at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) > at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) > at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) > at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) > at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) > at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) > at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at 
com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base/java.lang.Thread.run(Thread.java:829) > > org.apache.http.impl.client.InternalHttpClient:org.apache.solr.common.util.ObjectReleaseTracker$ObjectTrackerException: org.apache.http.impl.client.InternalHttpClient > at org.apache.solr.common.util.ObjectReleaseTracker.track(ObjectReleaseTracker.java:52) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:378) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:387) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:313) > at org.apache.solr.client.solrj.impl.HttpClientUtil.createClient(HttpClientUtil.java:298) > at org.apache.solr.client.solrj.impl.HttpSolrClient.(HttpSolrClient.java:180) > at org.apache.solr.client.solrj.impl.HttpSolrClient$Builder.build(HttpSolrClient.java:1055) > at org.apache.solr.SolrTestCaseJ4.getHttpSolrClient(SolrTestCaseJ4.java:2688) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.createNewSolrClient(AbstractBasicDistributedZkTestBase.java:1716) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.testANewCollectionInOneInstanceWithManualShardAssignement(AbstractBasicDistributedZkTestBase.java:1342) > at org.apache.solr.cloud.AbstractBasicDistributedZkTestBase.test(AbstractBasicDistributedZkTestBase.java:749) > at org.apache.solr.cloud.BasicDistributedZkTest.test(BasicDistributedZkTest.java:34) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) > at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) > at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) > at 
org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) > at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) > at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) > at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) > at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at 
org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base/java.lang.Thread.run(Thread.java:829) > > > > at __randomizedtesting.SeedInfo.seed([246C98A4C257C021]:0) > at org.junit.Assert.fail(Assert.java:89) > at org.junit.Assert.failNotNull(Assert.java:756) > at org.junit.Assert.assertNull(Assert.java:738) > at org.apache.solr.SolrTestCase$1.afterIfSuccessful(SolrTestCase.java:100) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:37) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base/java.lang.Thread.run(Thread.java:829) 2> NOTE: test params are: codec=Asserting(Lucene95), sim=Asserting(RandomSimilarity(queryNorm=true): {}), locale=os-RU, timezone=America/El_Salvador 2> NOTE: Linux 6.1.61-3-sophgo-08357-g369f7207fe69 riscv64/Oracle Corporation 11.0.22 (64-bit)/cpus=1,threads=4,free=236484528,total=403177472 2> NOTE: All tests run in this JVM: [SampleTest, TestJoin, TestLuceneMatchVersion, TestExportTool, BasicDistributedZkTest] 2> NOTE: reproduce 
with: gradlew test --tests BasicDistributedZkTest -Dtests.seed=246C98A4C257C021 -Dtests.locale=os-RU -Dtests.timezone=America/El_Salvador -Dtests.asserts=true -Dtests.file.encoding=UTF-8
> Task :solr:modules:jaegertracer-configurator:test
> Task :solr:core:test
org.apache.solr.cloud.ReplaceNodeTest > testGoodSpreadDuringAssignWithNoTarget FAILED
    java.lang.AssertionError: There should be no more replicas on the sourceNode after a replaceNode request. expected:<[]> but was:<[
    core_node10:{ "core":"replacenodetest_coll_shard1_replica_n4", "leader":"true", "node_name":"127.0.0.1:42593_solr", "base_url":"http://127.0.0.1:42593/solr", "state":"active", "type":"NRT", "force_set_state":"false"},
    core_node11:{ "core":"replacenodetest_coll_shard2_replica_n5", "node_name":"127.0.0.1:42593_solr", "base_url":"http://127.0.0.1:42593/solr", "state":"active", "type":"NRT", "force_set_state":"false"},
    core_node12:{ "core":"replacenodetest_coll_shard3_replica_n6", "leader":"true", "node_name":"127.0.0.1:42593_solr", "base_url":"http://127.0.0.1:42593/solr", "state":"active", "type":"NRT", "force_set_state":"false"}]>
    at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:A074C9EF7341155]:0) at org.junit.Assert.fail(Assert.java:89) at org.junit.Assert.failNotEquals(Assert.java:835) at org.junit.Assert.assertEquals(Assert.java:120) at org.apache.solr.cloud.ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget(ReplaceNodeTest.java:273) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at
com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base/java.lang.Thread.run(Thread.java:829) WARNING: Test org.apache.solr.TestRandomFaceting wrote 12,745,720 bytes of output. 
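As a hedged aside on the two failures above (an editorial reading, not output from the build): the BasicDistributedZkTest assertion is the Solr test framework's ObjectReleaseTracker check. HttpClientUtil.createClient() tracks the Apache HttpClient it builds for HttpSolrClient.Builder.build(), and the SolrTestCase suite rule asserts that every tracked object was released before teardown, so the failure means a SolrJ client was created but never closed. A minimal SolrJ sketch of the expected lifecycle (the URL and query are placeholders, not taken from the failing test):

    // Sketch only: SolrClient implements java.io.Closeable, and closing it is what
    // releases the InternalHttpClient that the tracker complained about above.
    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;

    public class ClosedClientSketch {
        public static void main(String[] args) throws Exception {
            String baseUrl = "http://127.0.0.1:8983/solr/techproducts"; // placeholder URL, not from the test
            try (HttpSolrClient client = new HttpSolrClient.Builder(baseUrl).build()) {
                client.query(new SolrQuery("*:*")); // any request; the point is the close below
            } // try-with-resources closes the client, which releases the tracked HttpClient
        }
    }

In the test suite itself that release normally happens in a teardown/close path; the "expected null, but was: ..." message above is what the suite rule prints when such a close never runs.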
org.apache.solr.cloud.ReplaceNodeTest > test suite's output saved to /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.ReplaceNodeTest.txt, copied below: 2> 364655 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/build/solr/src/solr-9.4.1/solr/server/solr/configsets/_default/conf' 2> 364656 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom 2> 364661 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-20 after mutting 0 log messages 2> 364661 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-21 for ERROR logs matching regex: ignore_exception 2> 364666 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Created dataDir: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/data-dir-11-001 2> 364669 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true 2> 364675 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0, clientAuth=0.0/0.0) 2> 364729 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Starting testGoodSpreadDuringAssignWithNoTarget 2> 364732 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.t.SimplePropagator Always-on trace id generation enabled. 2> 364732 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster Starting cluster of 4 servers in /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001 2> 364734 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER 2> 364735 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer client port: 0.0.0.0/0.0.0.0:0 2> 364736 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Starting server 2> 364746 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.z.s.ServerCnxnFactory maxCnxns is not configured, using default value 0. 2> 364835 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer start zk server on port: 41575 2> 364842 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 364851 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 364865 INFO (zkConnectionManagerCallback-2123-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 364865 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 364866 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 364874 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 364881 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 364889 INFO (zkConnectionManagerCallback-2125-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 364890 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 364890 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 364891 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 364901 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 364910 INFO (zkConnectionManagerCallback-2127-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 364910 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 364911 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365040 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 365040 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 365040 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 365040 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 365041 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 365041 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 365041 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 365041 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 365042 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 365042 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 365042 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 365042 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 365089 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 365089 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 365105 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@4e8c717{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 365105 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@33bdba24{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 365110 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 365113 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 365113 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@14de03a5{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:44549} 2> 365114 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@17bfdbda{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:35755} 2> 365115 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@671ee2a9{STARTING}[10.0.19,sto=0] @365298ms 2> 365115 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@64fcd87f{STARTING}[10.0.19,sto=0] @365298ms 2> 365118 INFO 
(jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@726e9ca{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 365119 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@742dff7{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 365120 ERROR (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 365120 ERROR (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 365121 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 365121 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 365121 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 365121 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 365122 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 365122 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 365122 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 365122 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 365122 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@39ffe81a{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:33263} 2> 365123 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:18.657107Z 2> 365123 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@43f0f3df{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:42593} 2> 365123 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:18.657086Z 2> 365123 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@4970bf1f{STARTING}[10.0.19,sto=0] @365306ms 2> 365123 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@338559d9{STARTING}[10.0.19,sto=0] @365306ms 2> 365125 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4 (source: servlet config: solr.solr.home) 2> 365125 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node1 (source: servlet config: solr.solr.home) 2> 365126 ERROR (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option 
solr.log.dir. Logging may be missing or incomplete. 2> 365126 ERROR (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 365126 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 365127 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 365127 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 365127 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 365127 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 365128 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 365128 WARN (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365128 WARN (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365128 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 365128 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 365128 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:18.662562Z 2> 365128 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:18.662661Z 2> 365130 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2 (source: servlet config: solr.solr.home) 2> 365130 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node3 (source: servlet config: solr.solr.home) 2> 365133 WARN (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365134 WARN (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365157 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 365159 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 365163 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 365165 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 365170 INFO (zkConnectionManagerCallback-2132-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365170 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365171 WARN (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365172 INFO (zkConnectionManagerCallback-2130-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365172 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365173 WARN (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365174 INFO (zkConnectionManagerCallback-2136-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365174 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 365174 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365175 WARN (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365175 INFO (zkConnectionManagerCallback-2134-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365176 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365176 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 365176 WARN (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 365178 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 365179 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 365380 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41575/solr 2> 365382 INFO (jetty-launcher-2128-thread-3) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
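The wall of INFO/WARN lines above is the normal startup chatter of the test framework's MiniSolrCloudCluster: a throwaway ZkTestServer, four embedded Jetty nodes on random ports, solr.xml loaded from ZooKeeper, and distributed cluster state updates enabled. As a hedged illustration of how such a cluster is typically constructed (the base directory, configset path and config name below are placeholders, not the ones ReplaceNodeTest uses):

    // Sketch of a 4-node test cluster like the one being started in the log above.
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import org.apache.solr.cloud.MiniSolrCloudCluster;

    public class MiniClusterSketch {
        public static void main(String[] args) throws Exception {
            Path baseDir = Files.createTempDirectory("mini-solr");         // counterpart of .../tempDir-001
            Path conf = Paths.get("server/solr/configsets/_default/conf"); // placeholder configset path
            MiniSolrCloudCluster cluster = new MiniSolrCloudCluster.Builder(4, baseDir)
                    .addConfig("conf", conf) // uploads the configset to the embedded ZooKeeper
                    .configure();            // starts a ZkTestServer plus four embedded Jetty nodes
            try {
                System.out.println("test ZK at " + cluster.getZkServer().getZkAddress());
            } finally {
                cluster.shutdown();          // stops the Jetty nodes and the test ZK server
            }
        }
    }

The repeated ZkCredentialsInjector / ZkACLProvider warnings are expected in this setup: with no credentials configured, the cluster creates znodes with OPEN_ACL_UNSAFE ACLs, which is acceptable for a throwaway ZkTestServer but not for a production ZooKeeper ensemble.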
2> 365383 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41575/solr 2> 365384 INFO (jetty-launcher-2128-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 365386 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41575/solr 2> 365387 INFO (jetty-launcher-2128-thread-4) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 365387 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41575/solr 2> 365388 INFO (jetty-launcher-2128-thread-2) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 365411 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 365413 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 365416 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 365419 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 365422 INFO (zkConnectionManagerCallback-2170-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365422 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365423 INFO (zkConnectionManagerCallback-2174-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365424 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365424 INFO (zkConnectionManagerCallback-2172-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365424 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365427 INFO (zkConnectionManagerCallback-2176-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 365427 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 365572 WARN (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 365573 WARN (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 365573 WARN (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] 
o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 365576 WARN (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 365598 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 365598 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 365599 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 365599 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 366195 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:33263_solr 2> 366199 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:42593_solr 2> 366200 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:35755_solr 2> 366200 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:44549_solr 2> 366202 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814123528199-127.0.0.1:33263_solr-n_0000000000) starting 2> 366211 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3) 2> 366212 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3) 2> 366213 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3) 2> 366215 INFO (zkCallback-2173-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3) 2> 366229 WARN (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 366230 WARN (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. 
Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 366230 WARN (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 366252 INFO (OverseerStateUpdate-72077814123528199-127.0.0.1:33263_solr-n_0000000000) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:33263_solr 2> 366255 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33263_solr as DOWN 2> 366258 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:33263_solr 2> 366265 INFO (OverseerStateUpdate-72077814123528199-127.0.0.1:33263_solr-n_0000000000) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 366271 INFO (zkCallback-2173-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 366271 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 366271 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 366287 WARN (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 366292 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 366292 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 366471 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 366472 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 366690 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. 
Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 366690 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 366691 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 366691 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 366898 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node3 2> 367084 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2 2> 367285 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4 2> 367317 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node1 2> 367332 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 368078 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 368187 INFO (jetty-launcher-2128-thread-3) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=33263, zkHost=127.0.0.1:41575/solr} 2> 368264 INFO (jetty-launcher-2128-thread-2) [n:127.0.0.1:42593_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=42593, zkHost=127.0.0.1:41575/solr} 2> 368272 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 368309 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 368450 INFO (jetty-launcher-2128-thread-4) [n:127.0.0.1:35755_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=35755, zkHost=127.0.0.1:41575/solr} 2> 368480 INFO (jetty-launcher-2128-thread-1) [n:127.0.0.1:44549_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=44549, zkHost=127.0.0.1:41575/solr} 2> 368489 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForAllNodes: numServers=4 2> 368489 INFO 
(TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:33263_solr 2> 368490 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 368501 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 368511 INFO (zkConnectionManagerCallback-2215-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 368511 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 368512 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 368518 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 368525 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:41575/solr ready 2> 368525 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:42593_solr 2> 368526 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:35755_solr 2> 368526 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:44549_solr 2> 368527 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 368539 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 45000ms for client to connect to ZooKeeper 2> 368547 INFO (zkConnectionManagerCallback-2217-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 368548 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 368548 WARN (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 368671 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplaceNodeTest total_jettys: 4 2> 368690 INFO (qtp926421133-3962) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Running Collection API locally for CREATE asyncId=null 2> 368710 INFO (DistributedCollectionApiCommandExecutor-2189-thread-1-processing-127.0.0.1:33263_solr null-1647 replacenodetest_coll) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.a.c.CreateCollectionCmd Create collection replacenodetest_coll 2> 368907 INFO (DistributedCollectionApiCommandExecutor-2189-thread-1-processing-127.0.0.1:33263_solr null-1647 replacenodetest_coll) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard1_replica_n1", 2> "node_name":"127.0.0.1:35755_solr", 2> "base_url":"http://127.0.0.1:35755/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 368917 INFO (DistributedCollectionApiCommandExecutor-2189-thread-1-processing-127.0.0.1:33263_solr null-1647 replacenodetest_coll) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard2_replica_n2", 2> "node_name":"127.0.0.1:35755_solr", 2> "base_url":"http://127.0.0.1:35755/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 368926 INFO (DistributedCollectionApiCommandExecutor-2189-thread-1-processing-127.0.0.1:33263_solr null-1647 replacenodetest_coll) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard3_replica_n3", 2> "node_name":"127.0.0.1:35755_solr", 2> "base_url":"http://127.0.0.1:35755/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 368935 INFO (DistributedCollectionApiCommandExecutor-2189-thread-1-processing-127.0.0.1:33263_solr null-1647 replacenodetest_coll) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard1_replica_n4", 2> "node_name":"127.0.0.1:42593_solr", 2> "base_url":"http://127.0.0.1:42593/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 368943 INFO (DistributedCollectionApiCommandExecutor-2189-thread-1-processing-127.0.0.1:33263_solr null-1647 replacenodetest_coll) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard2_replica_n5", 2> "node_name":"127.0.0.1:42593_solr", 2> "base_url":"http://127.0.0.1:42593/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 368953 INFO (DistributedCollectionApiCommandExecutor-2189-thread-1-processing-127.0.0.1:33263_solr null-1647 
replacenodetest_coll) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard3_replica_n6", 2> "node_name":"127.0.0.1:42593_solr", 2> "base_url":"http://127.0.0.1:42593/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 369001 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c: s: r: x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node7&name=replacenodetest_coll_shard1_replica_n1&action=CREATE&numShards=3&shard=shard1&wt=javabin 2> 369002 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c: s: r: x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node11&name=replacenodetest_coll_shard2_replica_n5&action=CREATE&numShards=3&shard=shard2&wt=javabin 2> 369001 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c: s: r: x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node10&name=replacenodetest_coll_shard1_replica_n4&action=CREATE&numShards=3&shard=shard1&wt=javabin 2> 369002 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c: s: r: x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node8&name=replacenodetest_coll_shard2_replica_n2&action=CREATE&numShards=3&shard=shard2&wt=javabin 2> 369004 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c: s: r: x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node9&name=replacenodetest_coll_shard3_replica_n3&action=CREATE&numShards=3&shard=shard3&wt=javabin 2> 369004 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c: s: r: x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node12&name=replacenodetest_coll_shard3_replica_n6&action=CREATE&numShards=3&shard=shard3&wt=javabin 2> 369090 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 369098 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 369098 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 369101 INFO 
(qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 369107 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 369119 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 369124 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 369124 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 369124 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 369128 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 369128 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 369128 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 369150 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 369151 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard3_replica_n6' using configuration from configset conf1, trusted=true 2> 369152 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 369152 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard3_replica_n3' using configuration from configset conf1, trusted=true 2> 369153 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 369154 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard2_replica_n5' using configuration from configset conf1, trusted=true 2> 369154 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 
x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2/replacenodetest_coll_shard3_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2/replacenodetest_coll_shard3_replica_n6/data/] 2> 369155 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 369155 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4/replacenodetest_coll_shard3_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4/replacenodetest_coll_shard3_replica_n3/data/] 2> 369155 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard1_replica_n4' using configuration from configset conf1, trusted=true 2> 369156 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2/replacenodetest_coll_shard2_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2/replacenodetest_coll_shard2_replica_n5/data/] 2> 369158 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 369158 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 369159 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 369160 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2/replacenodetest_coll_shard1_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node2/replacenodetest_coll_shard1_replica_n4/data/] 2> 369160 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.CoreContainer Creating 
SolrCore 'replacenodetest_coll_shard2_replica_n2' using configuration from configset conf1, trusted=true 2> 369163 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4/replacenodetest_coll_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4/replacenodetest_coll_shard1_replica_n1/data/] 2> 369166 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4/replacenodetest_coll_shard2_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node4/replacenodetest_coll_shard2_replica_n2/data/] 2> 370348 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 370350 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 370413 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 370413 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 370413 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 370413 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 370414 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 370413 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 370414 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 370413 INFO 
(qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 370415 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 370415 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 370417 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 370418 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 370439 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 370439 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 370440 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 370440 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 370440 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 370440 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 370440 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 370439 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 370440 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 370439 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 370441 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 
r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 370441 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 370473 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 370473 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 370476 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 370476 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 370477 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 370478 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 370479 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 370479 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 370479 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 370479 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 370482 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 370482 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 370484 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll 
s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 370485 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 370485 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 370485 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 370486 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735645569515520 2> 370486 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735645569515520 2> 370486 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735645569515520 2> 370486 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735645569515520 2> 370488 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 370489 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 370490 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735645572661248 2> 370491 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735645574758400 2> 370560 INFO (searcherExecutor-2219-thread-1-processing-replacenodetest_coll_shard3_replica_n6 null-1647 core_node12 127.0.0.1:42593_solr replacenodetest_coll shard3) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 370560 INFO (searcherExecutor-2227-thread-1-processing-replacenodetest_coll_shard1_replica_n1 null-1647 core_node7 127.0.0.1:35755_solr replacenodetest_coll shard1) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.c.SolrCore Registered new searcher 
autowarm time: 0 ms 2> 370561 INFO (searcherExecutor-2223-thread-1-processing-replacenodetest_coll_shard2_replica_n5 null-1647 core_node11 127.0.0.1:42593_solr replacenodetest_coll shard2) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 370560 INFO (searcherExecutor-2225-thread-1-processing-replacenodetest_coll_shard1_replica_n4 null-1647 core_node10 127.0.0.1:42593_solr replacenodetest_coll shard1) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 370561 INFO (searcherExecutor-2229-thread-1-processing-replacenodetest_coll_shard2_replica_n2 null-1647 core_node8 127.0.0.1:35755_solr replacenodetest_coll shard2) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 370562 INFO (searcherExecutor-2221-thread-1-processing-replacenodetest_coll_shard3_replica_n3 null-1647 core_node9 127.0.0.1:35755_solr replacenodetest_coll shard3) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 370602 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node8=0}, version=0} for registerTerm 2> 370602 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard3 to Terms{values={core_node12=0}, version=0} for registerTerm 2> 370603 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard3 2> 370603 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard2 2> 370626 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard3 to Terms{values={core_node12=0, core_node9=0}, version=1} for registerTerm 2> 370629 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard3 2> 370630 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard1 to Terms{values={core_node10=0}, version=0} for registerTerm 2> 370635 
INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard1 2> 370662 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard1 to Terms{values={core_node10=0, core_node7=0}, version=1} for registerTerm 2> 370662 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard1 2> 370664 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node8=0, core_node11=0}, version=1} for registerTerm 2> 370665 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard2 2> 370668 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ShardLeaderElectionContext Waiting until we see more replicas up for shard shard2: total=2 found=1 timeoute in=9989ms 2> 370679 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 370679 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 370680 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:42593/solr/replacenodetest_coll_shard3_replica_n6/ 2> 370686 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard3_replica_n6 url=http://127.0.0.1:42593/solr START replicas=[http://127.0.0.1:35755/solr/replacenodetest_coll_shard3_replica_n3/] nUpdates=100 2> 370694 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
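Everything from the createReplica() placements above down to the leader elections below is the fallout of a single Collections API CREATE call made by the test harness. A rough stand-alone equivalent, reconstructed only from the parameter values that appear in this log (collection name, configset, shard and replica counts, createNodeSet) and not part of the build itself, might look like the following Python sketch; the helper name and the use of the requests library are illustrative assumptions.

import requests

def create_collection(base_url="http://127.0.0.1:33263/solr"):
    # Parameters copied from the /admin/collections CREATE entry logged further down.
    params = {
        "action": "CREATE",
        "name": "replacenodetest_coll",
        "collection.configName": "conf1",
        "numShards": 3,
        "nrtReplicas": 2,
        "tlogReplicas": 0,
        "pullReplicas": 0,
        # The test restricts placement to two of the four cluster nodes.
        "createNodeSet": "127.0.0.1:42593_solr,127.0.0.1:35755_solr",
        "wt": "json",
    }
    resp = requests.get(f"{base_url}/admin/collections", params=params, timeout=120)
    resp.raise_for_status()
    return resp.json()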
2> 370694 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 370694 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/ 2> 370694 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard3_replica_n6 url=http://127.0.0.1:42593/solr DONE. We have no versions. sync failed. 2> 370700 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard1_replica_n4 url=http://127.0.0.1:42593/solr START replicas=[http://127.0.0.1:35755/solr/replacenodetest_coll_shard1_replica_n1/] nUpdates=100 2> 370701 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard1_replica_n4 url=http://127.0.0.1:42593/solr DONE. We have no versions. sync failed. 2> 370713 INFO (qtp427304581-3966) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1647] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=3 2> 370713 INFO (qtp427304581-3954) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1647] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=4 2> 370720 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 370720 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 370720 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 370720 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 370721 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard1/leader after winning as 
/collections/replacenodetest_coll/leader_elect/shard1/election/72077814123528202-core_node10-n_0000000000 2> 370721 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard3/leader after winning as /collections/replacenodetest_coll/leader_elect/shard3/election/72077814123528202-core_node12-n_0000000000 2> 370751 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/ shard1 2> 370761 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:42593/solr/replacenodetest_coll_shard3_replica_n6/ shard3 2> 370766 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1647] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 370774 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1647] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 370807 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c: s: r: x: t:null-1647] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node12&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard3_replica_n6&action=CREATE&numShards=3&collection=replacenodetest_coll&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1806 2> 370808 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c: s: r: x: t:null-1647] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node10&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard1_replica_n4&action=CREATE&numShards=3&collection=replacenodetest_coll&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1810 2> 371178 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
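Once the "I am the new leader" messages above have been logged for each shard, the outcome of the elections can be read back through the Collections API CLUSTERSTATUS action. The sketch below is again illustrative rather than part of the build: the node address and collection name come from the log, the helper name is invented, and the response layout assumed here (cluster → collections → shards → replicas, with leader="true" on the winning replica) is the long-standing format but may vary slightly between Solr versions.

import requests

def shard_leaders(base_url="http://127.0.0.1:33263/solr", collection="replacenodetest_coll"):
    resp = requests.get(
        f"{base_url}/admin/collections",
        params={"action": "CLUSTERSTATUS", "collection": collection, "wt": "json"},
        timeout=10,
    )
    resp.raise_for_status()
    shards = resp.json()["cluster"]["collections"][collection]["shards"]
    # Map each shard to the core that is currently marked as its leader.
    return {
        name: next(
            (r["core"] for r in shard["replicas"].values() if r.get("leader") == "true"),
            None,
        )
        for name, shard in shards.items()
    }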
2> 371178 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 371178 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/ 2> 371184 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard2_replica_n2 url=http://127.0.0.1:35755/solr START replicas=[http://127.0.0.1:42593/solr/replacenodetest_coll_shard2_replica_n5/] nUpdates=100 2> 371190 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard2_replica_n2 url=http://127.0.0.1:35755/solr DONE. We have no versions. sync failed. 2> 371204 INFO (qtp1295455241-3959) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1647] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 371207 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 371208 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 371208 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard2/leader after winning as /collections/replacenodetest_coll/leader_elect/shard2/election/72077814123528200-core_node8-n_0000000000 2> 371227 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/ shard2 2> 371227 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 551] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371238 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1647] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 371248 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 552] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [4]) 2> 371272 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c: s: r: x: t:null-1647] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard2_replica_n2&action=CREATE&numShards=3&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2274 2> 371672 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 553] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371672 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 553] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371684 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c: s: r: x: t:null-1647] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node9&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard3_replica_n3&action=CREATE&numShards=3&collection=replacenodetest_coll&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2683 2> 371692 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 554] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371692 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 554] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371700 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 555] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371700 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 555] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [4]) 2> 371704 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c: s: r: x: t:null-1647] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard1_replica_n1&action=CREATE&numShards=3&collection=replacenodetest_coll&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2705 2> 371710 INFO (qtp1295455241-3971) [n:127.0.0.1:42593_solr c: s: r: x: t:null-1647] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node11&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard2_replica_n5&action=CREATE&numShards=3&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2712 2> 371719 INFO (qtp926421133-3962) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 371725 INFO (qtp926421133-3962) [n:127.0.0.1:33263_solr c:replacenodetest_coll s: r: x: t:null-1647] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={pullReplicas=0&collection.configName=conf1&name=replacenodetest_coll&nrtReplicas=2&action=CREATE&numShards=3&tlogReplicas=0&createNodeSet=127.0.0.1:42593_solr,127.0.0.1:35755_solr&wt=javabin&version=2} status=0 QTime=3037 2> 371728 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForActiveCollection: replacenodetest_coll 2> 371733 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplaceNodeTest excluded_nodes : [127.0.0.1:44549_solr, 127.0.0.1:33263_solr] 2> 371748 INFO (qtp926421133-3968) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Running Collection API locally for REPLACENODE asyncId=000 2> 371774 INFO (qtp926421133-3968) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={async=000&sourceNode=127.0.0.1:42593_solr&parallel=true&action=REPLACENODE&wt=javabin&version=2} status=0 QTime=29 2> 371794 INFO (qtp926421133-3972) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1649] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=13 2> 371799 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.ReplicaMigrationUtils Going to create replica for collection=replacenodetest_coll shard=shard2 on node=127.0.0.1:44549_solr 2> 371812 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:44549_solr for creating new replica of shard shard2 for collection replacenodetest_coll 2> 371824 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 
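The REPLACENODE request above is submitted asynchronously (async=000, with no explicit targetNode, so Solr chooses the destination nodes itself) and is then polled with REQUESTSTATUS; that submit-and-poll pattern is exactly what this test exercises. A minimal sketch of it, using the sourceNode and request id visible in the log, under the usual assumption that REQUESTSTATUS reports a status.state of running, completed or failed, could be:

import time
import requests

BASE = "http://127.0.0.1:33263/solr/admin/collections"

def replace_node(source_node="127.0.0.1:42593_solr", request_id="000"):
    # Kick off the node replacement asynchronously, as the test does.
    requests.get(BASE, params={
        "action": "REPLACENODE",
        "sourceNode": source_node,
        "parallel": "true",
        "async": request_id,
        "wt": "json",
    }, timeout=30).raise_for_status()

    # Poll the async request until it reaches a terminal state.
    while True:
        status = requests.get(BASE, params={
            "action": "REQUESTSTATUS",
            "requestid": request_id,
            "wt": "json",
        }, timeout=30).json()
        state = status.get("status", {}).get("state")
        if state in ("completed", "failed"):
            return state
        time.sleep(1)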
2> 371827 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard2_replica_n13", 2> "node_name":"127.0.0.1:44549_solr", 2> "base_url":"http://127.0.0.1:44549/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 371841 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 565] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371841 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 565] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371865 INFO (qtp437907094-3957) [n:127.0.0.1:44549_solr c: s: r: x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={async=000308541111949012&qt=/admin/cores&coreNodeName=core_node14&collection.configName=conf1&name=replacenodetest_coll_shard2_replica_n13&action=CREATE&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4 2> 371866 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c: s: r: x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.h.a.CoreAdminOperation core create command async=000308541111949012&qt=/admin/cores&coreNodeName=core_node14&collection.configName=conf1&name=replacenodetest_coll_shard2_replica_n13&action=CREATE&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 371873 INFO (qtp437907094-3961) [n:127.0.0.1:44549_solr c: s: r: x: t:null-1648] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308541111949012 2> 371875 INFO (qtp437907094-3961) [n:127.0.0.1:44549_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308541111949012&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=4 2> 371890 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 566] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 371890 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 566] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [4]) 2> 371934 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 371955 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 371979 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 371980 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard2_replica_n13' using configuration from configset conf1, trusted=true 2> 371984 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node1/replacenodetest_coll_shard2_replica_n13], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node1/replacenodetest_coll_shard2_replica_n13/data/] 2> 372751 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 372811 INFO (qtp926421133-3974) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1650] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=12 2> 372822 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 372822 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null 
defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 372846 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 372846 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 372874 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 372878 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 372883 INFO (qtp437907094-3963) [n:127.0.0.1:44549_solr c: s: r: x: t:null-1648] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308541111949012 2> 372884 INFO (qtp437907094-3963) [n:127.0.0.1:44549_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308541111949012&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 372885 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 372888 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735648088195072 2> 372923 INFO (searcherExecutor-2255-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 372932 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to 
Terms{values={core_node14=0, core_node8=0, core_node11=0}, version=2} for registerTerm 2> 372933 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard2 2> 372956 INFO (parallelCoreAdminAPIBaseExecutor-2205-thread-1-processing-127.0.0.1:44549_solr replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 create) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.ZkController Core needs to recover:replacenodetest_coll_shard2_replica_n13 2> 372960 INFO (updateExecutor-2157-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.DefaultSolrCoreState Running recovery 2> 372964 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 372964 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 372987 INFO (qtp427304581-3966) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1651] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=5 2> 372988 INFO (qtp427304581-3966) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1651] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} status=0 QTime=6 2> 372989 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[replacenodetest_coll_shard2_replica_n13] 2> 372992 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null} 2> 372993 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Publishing state of core [replacenodetest_coll_shard2_replica_n13] as recovering, leader is [http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/] and I am [http://127.0.0.1:44549/solr/replacenodetest_coll_shard2_replica_n13/] 2> 373001 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 573] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373001 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 573] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373001 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 573] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373004 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:35755/solr]; [WaitForState: action=PREPRECOVERY&core=replacenodetest_coll_shard2_replica_n2&nodeName=127.0.0.1:44549_solr&coreNodeName=core_node14&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 373009 INFO (qtp427304581-3954) [n:127.0.0.1:35755_solr c: s: r: x:replacenodetest_coll_shard2_replica_n2 t:null-1652] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node14, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 373010 INFO (qtp427304581-3954) [n:127.0.0.1:35755_solr c: s: r: x: t:null-1652] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=replacenodetest_coll, shard=shard2, thisCore=replacenodetest_coll_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:44549_solr, coreNodeName=core_node14, onlyIfActiveCheckResult=false, nodeProps: core_node14:{ 2> "core":"replacenodetest_coll_shard2_replica_n13", 2> "node_name":"127.0.0.1:44549_solr", 2> "base_url":"http://127.0.0.1:44549/solr", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 373010 INFO (qtp427304581-3954) [n:127.0.0.1:35755_solr c: s: r: x: t:null-1652] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:44549_solr&onlyIfLeaderActive=true&core=replacenodetest_coll_shard2_replica_n2&coreNodeName=core_node14&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=3 2> 373512 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/] - recoveringAfterStartup=[true] 2> 373516 WARN (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 373516 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 373517 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 373517 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/]. 
2> 373623 INFO (qtp427304581-3952) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1653] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:42593/solr/replacenodetest_coll_shard3_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 9 2> 373628 INFO (qtp427304581-3966) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1653] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 12 2> 373629 INFO (qtp1295455241-3975) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1653] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 69 2> 373630 INFO (qtp1295455241-3953) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1653] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 69 2> 373644 INFO (qtp1295455241-3959) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1653] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 373659 INFO (qtp437907094-3955) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1653] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 373659 INFO (qtp437907094-3955) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1653] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 373661 INFO (qtp427304581-3960) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1653] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 140 2> 373673 INFO (qtp427304581-3954) 
[n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1654] o.a.s.c.S.Request webapp=/solr path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 373675 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.h.IndexFetcher Leader's generation: 1 2> 373675 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.h.IndexFetcher Leader's version: 0 2> 373675 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.h.IndexFetcher Follower's generation: 1 2> 373676 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.h.IndexFetcher Follower's version: 0 2> 373676 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy No replay needed. 2> 373680 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 373680 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 373688 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 574] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [4]) 2> 373688 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 373688 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 574] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373688 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 574] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373689 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735648928104448 2> 373698 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=733.0 2> 373699 INFO (recoveryExecutor-2166-thread-1-processing-replacenodetest_coll_shard2_replica_n13 null-1648 000308541111949012 core_node14 create 127.0.0.1:44549_solr replacenodetest_coll shard2) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1648] o.a.s.c.RecoveryStrategy Finished recovery process. 
recoveringAfterStartup=true msTimeTaken=734.0 2> 373828 INFO (qtp926421133-3976) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1655] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=11 2> 373891 INFO (qtp437907094-3957) [n:127.0.0.1:44549_solr c: s: r: x: t:null-1648] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308541111949012 2> 373891 INFO (qtp437907094-3957) [n:127.0.0.1:44549_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308541111949012&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 373899 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.ReplicaMigrationUtils Going to create replica for collection=replacenodetest_coll shard=shard1 on node=127.0.0.1:33263_solr 2> 373914 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:33263_solr for creating new replica of shard shard1 for collection replacenodetest_coll 2> 373927 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 373931 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard1_replica_n15", 2> "node_name":"127.0.0.1:33263_solr", 2> "base_url":"http://127.0.0.1:33263/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 373946 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 577] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373947 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 577] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373946 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 577] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [4]) 2> 373970 INFO (qtp926421133-3968) [n:127.0.0.1:33263_solr c: s: r: x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={async=000308543218837525&qt=/admin/cores&coreNodeName=core_node16&collection.configName=conf1&name=replacenodetest_coll_shard1_replica_n15&action=CREATE&collection=replacenodetest_coll&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4 2> 373971 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c: s: r: x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.h.a.CoreAdminOperation core create command async=000308543218837525&qt=/admin/cores&coreNodeName=core_node16&collection.configName=conf1&name=replacenodetest_coll_shard1_replica_n15&action=CREATE&collection=replacenodetest_coll&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 373977 INFO (qtp926421133-3972) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308543218837525 2> 373980 INFO (qtp926421133-3972) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308543218837525&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=4 2> 373995 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 578] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373995 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 578] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 373995 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 578] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [4]) 2> 374041 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 374070 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 374099 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 374099 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard1_replica_n15' using configuration from configset conf1, trusted=true 2> 374102 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node3/replacenodetest_coll_shard1_replica_n15], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node3/replacenodetest_coll_shard1_replica_n15/data/] 2> 374644 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 374702 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 374702 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 374720 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 
000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 374720 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 374740 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 374743 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 374748 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 374749 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735650039595008 2> 374777 INFO (searcherExecutor-2262-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 374787 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard1 to Terms{values={core_node10=0, core_node7=0, core_node16=0}, version=2} for registerTerm 2> 374788 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard1 2> 374808 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-1-processing-127.0.0.1:33263_solr replacenodetest_coll_shard1_replica_n15 null-1648 
000308543218837525 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.ZkController Core needs to recover:replacenodetest_coll_shard1_replica_n15 2> 374812 INFO (updateExecutor-2159-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.DefaultSolrCoreState Running recovery 2> 374815 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 374816 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 374832 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1656] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=3 2> 374832 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1656] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} status=0 QTime=4 2> 374837 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[replacenodetest_coll_shard1_replica_n15] 2> 374840 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null} 2> 374840 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Publishing state of core [replacenodetest_coll_shard1_replica_n15] as recovering, leader is [http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/] and I am [http://127.0.0.1:33263/solr/replacenodetest_coll_shard1_replica_n15/] 2> 374845 INFO (qtp926421133-3974) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1657] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=12 2> 374848 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 585] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 374849 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 585] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 374848 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 585] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 374849 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 585] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 374851 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:42593/solr]; [WaitForState: action=PREPRECOVERY&core=replacenodetest_coll_shard1_replica_n4&nodeName=127.0.0.1:33263_solr&coreNodeName=core_node16&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 374856 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c: s: r: x:replacenodetest_coll_shard1_replica_n4 t:null-1658] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node16, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 374857 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c: s: r: x: t:null-1658] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=replacenodetest_coll, shard=shard1, thisCore=replacenodetest_coll_shard1_replica_n4, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:33263_solr, coreNodeName=core_node16, onlyIfActiveCheckResult=false, nodeProps: core_node16:{ 2> "core":"replacenodetest_coll_shard1_replica_n15", 2> "node_name":"127.0.0.1:33263_solr", 2> "base_url":"http://127.0.0.1:33263/solr", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 374858 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c: s: r: x: t:null-1658] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:33263_solr&onlyIfLeaderActive=true&core=replacenodetest_coll_shard1_replica_n4&coreNodeName=core_node16&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=3 2> 374987 INFO (qtp926421133-3976) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308543218837525 2> 374988 INFO (qtp926421133-3976) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308543218837525&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 374992 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.ReplicaMigrationUtils Going to create replica for collection=replacenodetest_coll shard=shard3 on node=127.0.0.1:33263_solr 2> 375007 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:33263_solr for creating new replica of shard shard3 for collection replacenodetest_coll 2> 375019 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 375023 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard3_replica_n17", 2> "node_name":"127.0.0.1:33263_solr", 2> "base_url":"http://127.0.0.1:33263/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 375037 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 588] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 375037 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 588] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 375037 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 588] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [4]) 2> 375037 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 588] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 375052 INFO (qtp926421133-3956) [n:127.0.0.1:33263_solr c: s: r: x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={async=000308544307642260&qt=/admin/cores&coreNodeName=core_node18&collection.configName=conf1&name=replacenodetest_coll_shard3_replica_n17&action=CREATE&collection=replacenodetest_coll&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=6 2> 375052 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c: s: r: x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.h.a.CoreAdminOperation core create command async=000308544307642260&qt=/admin/cores&coreNodeName=core_node18&collection.configName=conf1&name=replacenodetest_coll_shard3_replica_n17&action=CREATE&collection=replacenodetest_coll&shard=shard3&wt=javabin&version=2&replicaType=NRT 2> 375058 INFO (qtp926421133-3962) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308544307642260 2> 375059 INFO (qtp926421133-3962) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308544307642260&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 388349 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/] - recoveringAfterStartup=[true] 2> 388448 WARN (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 388456 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 388456 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 
2> 388457 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/]. 2> 388467 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 589] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388470 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 589] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388468 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 589] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388467 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 589] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388592 INFO (qtp926421133-3968) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308544307642260 2> 388603 INFO (qtp926421133-3968) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308544307642260&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=38 2> 388605 INFO (TEST-ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Ending testGoodSpreadDuringAssignWithNoTarget 2> 388665 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 388703 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 388709 INFO (qtp437907094-3961) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 38 2> 388710 INFO (qtp1295455241-3959) [n:127.0.0.1:42593_solr 
c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:35755/solr/replacenodetest_coll_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 42 2> 388732 INFO (qtp427304581-3964) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 90 2> 388746 INFO (qtp427304581-3958) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:42593/solr/replacenodetest_coll_shard3_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 388749 INFO (qtp1295455241-3973) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 31 2> 388766 INFO (qtp427304581-3952) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 388787 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 388786 INFO (qtp926421133-3976) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1659] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 388788 INFO (qtp926421133-3976) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:42593/solr/replacenodetest_coll_shard1_replica_n4/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 13 2> 388788 INFO 
(parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard3_replica_n17' using configuration from configset conf1, trusted=true
2> 388790 INFO (qtp1295455241-3975) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1659] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 242
2> 388794 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node3/replacenodetest_coll_shard3_replica_n17], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-001/node3/replacenodetest_coll_shard3_replica_n17/data/]
> java.lang.AssertionError: There should be no more replicas on the sourceNode after a replaceNode request. expected:<[]> but was:<[core_node10:{
> "core":"replacenodetest_coll_shard1_replica_n4",
> "leader":"true",
> "node_name":"127.0.0.1:42593_solr",
> "base_url":"http://127.0.0.1:42593/solr",
> "state":"active",
> "type":"NRT",
> "force_set_state":"false"}, core_node11:{
> "core":"replacenodetest_coll_shard2_replica_n5",
> "node_name":"127.0.0.1:42593_solr",
> "base_url":"http://127.0.0.1:42593/solr",
> "state":"active",
> "type":"NRT",
> "force_set_state":"false"}, core_node12:{
> "core":"replacenodetest_coll_shard3_replica_n6",
> "leader":"true",
> "node_name":"127.0.0.1:42593_solr",
> "base_url":"http://127.0.0.1:42593/solr",
> "state":"active",
> "type":"NRT",
> "force_set_state":"false"}]>
> at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:A074C9EF7341155]:0)
> at org.junit.Assert.fail(Assert.java:89)
> at org.junit.Assert.failNotEquals(Assert.java:835)
> at org.junit.Assert.assertEquals(Assert.java:120)
> at org.apache.solr.cloud.ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget(ReplaceNodeTest.java:273)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.base/java.lang.reflect.Method.invoke(Method.java:566)
> at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758)
> at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946)
> at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982)
> at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996)
> at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48)
> at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45)
> at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60)
> at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390)
> at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843)
> at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490)
> at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955)
> at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840)
> at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891)
> at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902)
> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38)
> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
> at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
> at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44)
> at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60)
> at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390)
> at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850)
> at java.base/java.lang.Thread.run(Thread.java:829)
2> NOTE: reproduce with: gradlew test --tests ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget -Dtests.seed=246C98A4C257C021 -Dtests.locale=en-VG -Dtests.timezone=Indian/Comoro -Dtests.asserts=true -Dtests.file.encoding=UTF-8
2> 388829 INFO (qtp1295455241-3965) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:null-1660] o.a.s.c.S.Request webapp=/solr path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=4
2> 388831 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.h.IndexFetcher Leader's generation: 1
2> 388832 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.h.IndexFetcher Leader's version: 0
2> 388832 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.h.IndexFetcher Follower's generation: 1
2> 388832 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.h.IndexFetcher Follower's version: 0
2> 388832 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy No replay needed.
2> 388853 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Replication Recovery was successful.
2> 388853 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Registering as Active after recovery.
2> 388865 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 590] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388865 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 590] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388865 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 590] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388865 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 590] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [4]) 2> 388865 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 388870 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735664846536704 2> 388881 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=14065.0 2> 388882 INFO (recoveryExecutor-2165-thread-1-processing-replacenodetest_coll_shard1_replica_n15 null-1648 000308543218837525 core_node16 create 127.0.0.1:33263_solr replacenodetest_coll shard1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:null-1648] o.a.s.c.RecoveryStrategy Finished recovery process. 
recoveringAfterStartup=true msTimeTaken=14066.0 2> 389306 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Starting testFailOnSingleNode 2> 389423 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@338559d9{STOPPING}[10.0.19,sto=0] 2> 389423 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@4970bf1f{STOPPING}[10.0.19,sto=0] 2> 389425 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@64fcd87f{STOPPING}[10.0.19,sto=0] 2> 389425 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@671ee2a9{STOPPING}[10.0.19,sto=0] 2> 389435 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@39ffe81a{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 389434 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@14de03a5{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 389436 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@17bfdbda{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 389439 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@43f0f3df{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 389442 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@726e9ca{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 389442 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@742dff7{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 389442 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@4e8c717{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 389442 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@33bdba24{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 389475 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=577560943 2> 389475 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=2114618794 2> 389475 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=24240906 2> 389476 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:42593_solr 2> 389476 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:35755_solr 2> 389476 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:44549_solr 2> 389490 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 389490 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:42593_solr as DOWN 2> 389490 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 
2> 389490 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:35755_solr as DOWN 2> 389491 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 389491 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:44549_solr as DOWN 2> 389493 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (1) 2> 389493 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (1) 2> 389494 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (1) 2> 389494 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (1) 2> 389510 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 595] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389510 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 595] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389510 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 595] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389510 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 595] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389516 INFO (coreCloseExecutor-2276-thread-1) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2329a9e2 replacenodetest_coll_shard2_replica_n13 2> 389517 INFO (coreCloseExecutor-2276-thread-1) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n13 tag=SolrCore@2329a9e2 2> 389519 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 598] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389519 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 598] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 389519 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 598] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389519 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 598] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389525 INFO (coreCloseExecutor-2278-thread-1) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@14387166 replacenodetest_coll_shard1_replica_n4 2> 389526 INFO (coreCloseExecutor-2278-thread-2) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@554471ac replacenodetest_coll_shard2_replica_n5 2> 389526 INFO (coreCloseExecutor-2278-thread-1) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard1.replica_n4 tag=SolrCore@14387166 2> 389526 INFO (coreCloseExecutor-2278-thread-3) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7ff66817 replacenodetest_coll_shard3_replica_n6 2> 389531 INFO (zkCallback-2169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 600] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389531 INFO (zkCallback-2171-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 600] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389531 INFO (zkCallback-2175-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 600] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 389531 INFO (zkCallback-2173-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 600] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 389543 INFO (coreCloseExecutor-2280-thread-1) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@542378cf replacenodetest_coll_shard1_replica_n1 2> 389544 INFO (coreCloseExecutor-2280-thread-1) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard1.replica_n1 tag=SolrCore@542378cf 2> 389544 INFO (coreCloseExecutor-2280-thread-2) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5299b668 replacenodetest_coll_shard2_replica_n2 2> 389546 INFO (coreCloseExecutor-2280-thread-3) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@1c756b3e replacenodetest_coll_shard3_replica_n3 2> 389636 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 389636 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 389661 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 389661 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 389693 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 389701 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 389707 INFO 
(parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 389709 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735665726291968 2> 389788 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard3 to Terms{values={core_node18=0, core_node12=0, core_node9=0}, version=2} for registerTerm 2> 389796 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard3 2> 389797 INFO (searcherExecutor-2269-thread-1-processing-replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 core_node18 create 127.0.0.1:33263_solr replacenodetest_coll shard3) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 389768 ERROR (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:33263/solr 2> => org.apache.solr.client.solrj.SolrServerException: Connection refused 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Connection refused 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.notifyFailureComplete(HttpExchange.java:284) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:256) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpDestination.abort(HttpDestination.java:530) ~[jetty-client-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.client.HttpDestination.failed(HttpDestination.java:294) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.AbstractConnectionPool$FutureConnection.failed(AbstractConnectionPool.java:574) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.Promise$Wrapper.failed(Promise.java:169) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1$1.failed(HttpClient.java:578) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HTTPSessionListenerPromise.failConnectionPromise(HTTPSessionListenerPromise.java:136) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HTTPSessionListenerPromise.failed(HTTPSessionListenerPromise.java:51) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.Promise$1.failed(Promise.java:94) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.ClientConnector.connectFailed(ClientConnector.java:540) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:456) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> ... 
11 more 2> 389809 ERROR (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.CollectionHandlingUtils Error from shard 127.0.0.1:33263_solr: {STATUS=failed} 2> 389810 WARN (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.ReplicaMigrationUtils Failed to create replica for collection=replacenodetest_coll shard=shard3 on node=127.0.0.1:33263_solr 2> 390024 INFO (coreCloseExecutor-2278-thread-2) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n5 tag=SolrCore@554471ac 2> 390025 INFO (coreCloseExecutor-2276-thread-1) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard2.leader tag=SolrCore@2329a9e2 2> 390034 INFO (coreCloseExecutor-2276-thread-1) [n:127.0.0.1:44549_solr c:replacenodetest_coll s:shard2 r:core_node14 x:replacenodetest_coll_shard2_replica_n13 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 390054 INFO (coreCloseExecutor-2280-thread-1) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard1.leader tag=SolrCore@542378cf 2> 390054 INFO (coreCloseExecutor-2280-thread-2) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n2 tag=SolrCore@5299b668 2> 390062 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 390062 INFO (coreCloseExecutor-2280-thread-1) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard1 r:core_node7 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 390290 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 390417 INFO (coreCloseExecutor-2278-thread-2) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard2.leader tag=SolrCore@554471ac 2> 390417 INFO (coreCloseExecutor-2278-thread-3) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard3.replica_n6 tag=SolrCore@7ff66817 2> 390421 INFO (coreCloseExecutor-2280-thread-2) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard2.leader tag=SolrCore@5299b668 2> 390422 INFO (coreCloseExecutor-2280-thread-3) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard3.replica_n3 tag=SolrCore@1c756b3e 2> 390428 INFO (coreCloseExecutor-2278-thread-2) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard2 r:core_node11 x:replacenodetest_coll_shard2_replica_n5 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 390439 INFO (coreCloseExecutor-2280-thread-2) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard2 r:core_node8 x:replacenodetest_coll_shard2_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 390520 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 390521 INFO (jetty-closer-2271-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 390758 INFO (coreCloseExecutor-2280-thread-3) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard3.leader tag=SolrCore@1c756b3e 2> 390766 INFO (coreCloseExecutor-2280-thread-3) [n:127.0.0.1:35755_solr c:replacenodetest_coll s:shard3 r:core_node9 x:replacenodetest_coll_shard3_replica_n3 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 390780 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 390784 INFO (coreCloseExecutor-2278-thread-3) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard3.leader tag=SolrCore@7ff66817 2> 390784 INFO (coreCloseExecutor-2278-thread-1) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard1.leader tag=SolrCore@14387166 2> 390791 INFO (coreCloseExecutor-2278-thread-3) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard3 r:core_node12 x:replacenodetest_coll_shard3_replica_n6 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 390798 INFO (coreCloseExecutor-2278-thread-1) [n:127.0.0.1:42593_solr c:replacenodetest_coll s:shard1 r:core_node10 x:replacenodetest_coll_shard1_replica_n4 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 390814 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 390990 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 391005 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 391072 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 391073 INFO (jetty-closer-2271-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 391090 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 391091 INFO (jetty-closer-2271-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 391132 INFO (zkCallback-2169-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 610] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 391135 INFO (zkCallback-2169-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 391140 INFO (zkCallback-2169-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 612] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 391143 INFO (zkCallback-2169-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 393635 INFO (zkCallback-2169-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:33263/solr/replacenodetest_coll_shard3_replica_n17/ 2> 393636 INFO (zkCallback-2169-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 393636 INFO (zkCallback-2169-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.c.SyncStrategy http://127.0.0.1:33263/solr/replacenodetest_coll_shard3_replica_n17/ has no replicas 2> 393637 INFO (zkCallback-2169-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard3/leader after winning as /collections/replacenodetest_coll/leader_elect/shard3/election/72077814123528199-core_node18-n_0000000002 2> 393643 INFO (zkCallback-2169-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:33263/solr/replacenodetest_coll_shard1_replica_n15/ 2> 393644 INFO (zkCallback-2169-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 393644 INFO (zkCallback-2169-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.c.SyncStrategy http://127.0.0.1:33263/solr/replacenodetest_coll_shard1_replica_n15/ has no replicas 2> 393644 INFO (zkCallback-2169-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard1/leader after winning as /collections/replacenodetest_coll/leader_elect/shard1/election/72077814123528199-core_node16-n_0000000002 2> 393656 INFO (zkCallback-2169-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 615] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 393656 INFO (zkCallback-2169-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:33263/solr/replacenodetest_coll_shard3_replica_n17/ shard3 2> 393663 INFO (zkCallback-2169-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 617] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 393666 INFO (zkCallback-2169-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:33263/solr/replacenodetest_coll_shard1_replica_n15/ shard1 2> 394664 INFO (parallelCoreAdminAPIBaseExecutor-2193-thread-2-processing-127.0.0.1:33263_solr replacenodetest_coll_shard3_replica_n17 null-1648 000308544307642260 create) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:null-1648] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 394672 INFO (zkCallback-2169-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 618] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 394678 INFO (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.ReplicaMigrationUtils Failed to create some replicas. Cleaning up all newly created replicas. 2> 394681 WARN (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.ReplicaMigrationUtils Error deleting replica 2> => org.apache.solr.common.SolrException: collection,shard,replica are required params 2> at org.apache.solr.cloud.api.collections.CollectionHandlingUtils.checkRequired(CollectionHandlingUtils.java:196) 2> org.apache.solr.common.SolrException: collection,shard,replica are required params 2> at org.apache.solr.cloud.api.collections.CollectionHandlingUtils.checkRequired(CollectionHandlingUtils.java:196) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.DeleteReplicaCmd.deleteReplica(DeleteReplicaCmd.java:86) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.ReplicaMigrationUtils.migrateReplicas(ReplicaMigrationUtils.java:217) [main/:?] 2> at org.apache.solr.cloud.api.collections.ReplaceNodeCmd.call(ReplaceNodeCmd.java:114) [main/:?] 2> at org.apache.solr.cloud.api.collections.DistributedCollectionConfigSetCommandRunner$CollectionCommandRunner.call(DistributedCollectionConfigSetCommandRunner.java:430) [main/:?] 2> at org.apache.solr.cloud.api.collections.DistributedCollectionConfigSetCommandRunner$CollectionCommandRunner.call(DistributedCollectionConfigSetCommandRunner.java:363) [main/:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> 394685 ERROR (DistributedCollectionApiCommandExecutor-2189-thread-2-processing-127.0.0.1:33263_solr null-1648) [n:127.0.0.1:33263_solr c: s: r: x: t:null-1648] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Operation REPLACENODE failed 2> => org.apache.solr.common.SolrException: collection,shard,replica are required params 2> at org.apache.solr.cloud.api.collections.CollectionHandlingUtils.checkRequired(CollectionHandlingUtils.java:196) 2> org.apache.solr.common.SolrException: collection,shard,replica are required params 2> at org.apache.solr.cloud.api.collections.CollectionHandlingUtils.checkRequired(CollectionHandlingUtils.java:196) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.DeleteReplicaCmd.deleteReplica(DeleteReplicaCmd.java:86) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.ReplicaMigrationUtils.migrateReplicas(ReplicaMigrationUtils.java:217) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.ReplaceNodeCmd.call(ReplaceNodeCmd.java:114) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.DistributedCollectionConfigSetCommandRunner$CollectionCommandRunner.call(DistributedCollectionConfigSetCommandRunner.java:430) [main/:?] 2> at org.apache.solr.cloud.api.collections.DistributedCollectionConfigSetCommandRunner$CollectionCommandRunner.call(DistributedCollectionConfigSetCommandRunner.java:363) [main/:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 394707 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=790720396 2> 394708 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:33263_solr 2> 394711 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 394711 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33263_solr as DOWN 2> 394713 INFO (zkCallback-2169-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 394721 INFO (zkCallback-2169-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 626] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [0]) 2> 394729 INFO (coreCloseExecutor-2290-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@20300d37 replacenodetest_coll_shard1_replica_n15 2> 394730 INFO (coreCloseExecutor-2290-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@56d6a81b replacenodetest_coll_shard3_replica_n17 2> 394730 INFO (coreCloseExecutor-2290-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard1.replica_n15 tag=SolrCore@20300d37 2> 394949 INFO (coreCloseExecutor-2290-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard1.leader tag=SolrCore@20300d37 2> 394949 INFO (coreCloseExecutor-2290-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard3.replica_n17 tag=SolrCore@56d6a81b 2> 394957 INFO (coreCloseExecutor-2290-thread-1) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard1 r:core_node16 x:replacenodetest_coll_shard1_replica_n15 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 395182 INFO (coreCloseExecutor-2290-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard3.leader tag=SolrCore@56d6a81b 2> 395189 INFO (coreCloseExecutor-2290-thread-2) [n:127.0.0.1:33263_solr c:replacenodetest_coll s:shard3 r:core_node18 x:replacenodetest_coll_shard3_replica_n17 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 395198 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null
2> 395306 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null
2> 395362 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null
2> 395363 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null
2> 395369 INFO (closeThreadPool-2291-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814123528199-127.0.0.1:33263_solr-n_0000000000) closing
2> 395370 INFO (OverseerStateUpdate-72077814123528199-127.0.0.1:33263_solr-n_0000000000) [n:127.0.0.1:33263_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:33263_solr
2> 395374 INFO (closeThreadPool-2291-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814123528199-127.0.0.1:33263_solr-n_0000000000) closing
2> 395486 INFO (jetty-closer-2271-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814123528199-127.0.0.1:33263_solr-n_0000000000) closing
2> 395498 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Shutting down ZkTestServer.
2> 395723 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Watch limit violations:
2> Maximum concurrent create/delete watches above limit:
2>
2> 9 /solr/collections/replacenodetest_coll/terms/shard2
2> 9 /solr/collections/replacenodetest_coll/terms/shard1
2> 7 /solr/collections/replacenodetest_coll/terms/shard3
2> 5 /solr/aliases.json
2> 5 /solr/clusterprops.json
2> 4 /solr/packages.json
2> 4 /solr/security.json
2> 4 /solr/configs/conf1
2> 4 /solr/collections/replacenodetest_coll/collectionprops.json
2>
2> Maximum concurrent data watches above limit:
2>
2> 62 /solr/collections/replacenodetest_coll/state.json
2> 2 /solr/collections/replacenodetest_coll/leader_elect/shard3/election/72077814123528202-core_node12-n_0000000000
2> 2 /solr/collections/replacenodetest_coll/leader_elect/shard1/election/72077814123528202-core_node10-n_0000000000
2>
2> Maximum concurrent children watches above limit:
2>
2> 18 /solr/live_nodes
2> 10 /solr/collections
2> 4 /solr/collections/replacenodetest_coll/state.json
2>
2> 395795 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster Starting cluster of 1 servers in /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002
2> 395801 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
2> 395803 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer client port: 0.0.0.0/0.0.0.0:0
2> 395803 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Starting server
2> 395816 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.z.s.ServerCnxnFactory maxCnxns is not configured, using default value 0.
2> 395903 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer start zk server on port: 42237
2> 395912 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector.
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 395934 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 395972 INFO (zkConnectionManagerCallback-2293-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 395973 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 395973 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 395982 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 395991 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 395999 INFO (zkConnectionManagerCallback-2295-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 396000 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 396000 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 396001 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 396014 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 396026 INFO (zkConnectionManagerCallback-2297-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 396027 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 396027 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 396163 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 396163 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 396164 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 396181 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 396190 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@2853e959{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 396194 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@16bfca54{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:43527} 2> 396195 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@661ff236{STARTING}[10.0.19,sto=0] @396377ms 2> 396197 ERROR (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 396197 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 396197 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 396198 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 396198 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 396198 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:49.732677Z 2> 396200 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1 (source: servlet config: solr.solr.home) 2> 396202 WARN (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 396214 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 396223 INFO (zkConnectionManagerCallback-2300-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 396224 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 396224 WARN (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 396227 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 396455 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42237/solr 2> 396456 INFO (jetty-launcher-2298-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
2> 396469 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 396479 INFO (zkConnectionManagerCallback-2310-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 396480 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 396635 WARN (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 396658 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 396676 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:43527_solr 2> 396683 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816160387076-127.0.0.1:43527_solr-n_0000000000) starting 2> 396729 INFO (OverseerStateUpdate-72077816160387076-127.0.0.1:43527_solr-n_0000000000) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:43527_solr 2> 396729 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:43527_solr 2> 396739 INFO (OverseerStateUpdate-72077816160387076-127.0.0.1:43527_solr-n_0000000000) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 396758 WARN (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 397218 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 397219 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
2> 397855 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1 2> 398752 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 399066 INFO (jetty-launcher-2298-thread-1) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=43527, zkHost=127.0.0.1:42237/solr} 2> 399097 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForAllNodes: numServers=1 2> 399097 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:43527_solr 2> 399099 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 399130 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 399142 INFO (zkConnectionManagerCallback-2325-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 399142 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 399143 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 399151 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 399159 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42237/solr ready 2> 399160 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 399175 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 45000ms for client to connect to ZooKeeper 2> 399185 INFO (zkConnectionManagerCallback-2327-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 399186 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 399186 WARN (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 399312 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplaceNodeTest total_jettys: 1 2> 399335 INFO (qtp410531627-4210) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Running Collection API locally for CREATE asyncId=null 2> 399358 INFO (DistributedCollectionApiCommandExecutor-2317-thread-1-processing-127.0.0.1:43527_solr null-1661 replacesinglenodetest_coll) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.c.a.c.CreateCollectionCmd Create collection replacesinglenodetest_coll 2> 399569 INFO (DistributedCollectionApiCommandExecutor-2317-thread-1-processing-127.0.0.1:43527_solr null-1661 replacesinglenodetest_coll) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacesinglenodetest_coll_shard5_replica_n1", 2> "node_name":"127.0.0.1:43527_solr", 2> "base_url":"http://127.0.0.1:43527/solr", 2> "collection":"replacesinglenodetest_coll", 2> "shard":"shard5", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 399579 INFO (DistributedCollectionApiCommandExecutor-2317-thread-1-processing-127.0.0.1:43527_solr null-1661 replacesinglenodetest_coll) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacesinglenodetest_coll_shard3_replica_n2", 2> "node_name":"127.0.0.1:43527_solr", 2> "base_url":"http://127.0.0.1:43527/solr", 2> "collection":"replacesinglenodetest_coll", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 399589 INFO (DistributedCollectionApiCommandExecutor-2317-thread-1-processing-127.0.0.1:43527_solr null-1661 replacesinglenodetest_coll) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacesinglenodetest_coll_shard2_replica_n3", 2> "node_name":"127.0.0.1:43527_solr", 2> "base_url":"http://127.0.0.1:43527/solr", 2> "collection":"replacesinglenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 399600 INFO (DistributedCollectionApiCommandExecutor-2317-thread-1-processing-127.0.0.1:43527_solr null-1661 replacesinglenodetest_coll) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacesinglenodetest_coll_shard4_replica_n4", 2> "node_name":"127.0.0.1:43527_solr", 2> "base_url":"http://127.0.0.1:43527/solr", 2> "collection":"replacesinglenodetest_coll", 2> "shard":"shard4", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 399610 INFO (DistributedCollectionApiCommandExecutor-2317-thread-1-processing-127.0.0.1:43527_solr null-1661 replacesinglenodetest_coll) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacesinglenodetest_coll_shard1_replica_n5", 2> "node_name":"127.0.0.1:43527_solr", 2> "base_url":"http://127.0.0.1:43527/solr", 2> "collection":"replacesinglenodetest_coll", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 
2> 399652 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c: s: r: x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacesinglenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node7&name=replacesinglenodetest_coll_shard3_replica_n2&action=CREATE&numShards=5&shard=shard3&wt=javabin 2> 399652 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c: s: r: x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacesinglenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node6&name=replacesinglenodetest_coll_shard5_replica_n1&action=CREATE&numShards=5&shard=shard5&wt=javabin 2> 399653 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c: s: r: x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacesinglenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node8&name=replacesinglenodetest_coll_shard2_replica_n3&action=CREATE&numShards=5&shard=shard2&wt=javabin 2> 399654 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c: s: r: x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacesinglenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node9&name=replacesinglenodetest_coll_shard4_replica_n4&action=CREATE&numShards=5&shard=shard4&wt=javabin 2> 399654 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c: s: r: x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacesinglenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node10&name=replacesinglenodetest_coll_shard1_replica_n5&action=CREATE&numShards=5&shard=shard1&wt=javabin 2> 399773 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 399773 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 399774 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 399774 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 399775 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 399801 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 399801 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr 
c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 399801 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 399801 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 399801 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 399827 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 399828 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.CoreContainer Creating SolrCore 'replacesinglenodetest_coll_shard3_replica_n2' using configuration from configset conf1, trusted=true 2> 399829 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 399830 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.CoreContainer Creating SolrCore 'replacesinglenodetest_coll_shard2_replica_n3' using configuration from configset conf1, trusted=true 2> 399831 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard3_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard3_replica_n2/data/] 2> 399831 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 399832 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.CoreContainer Creating SolrCore 'replacesinglenodetest_coll_shard5_replica_n1' using configuration from configset conf1, trusted=true 2> 399833 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 399834 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.CoreContainer Creating SolrCore 'replacesinglenodetest_coll_shard4_replica_n4' using 
configuration from configset conf1, trusted=true 2> 399834 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard2_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard2_replica_n3/data/] 2> 399835 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard5_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard5_replica_n1/data/] 2> 399835 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 399836 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.CoreContainer Creating SolrCore 'replacesinglenodetest_coll_shard1_replica_n5' using configuration from configset conf1, trusted=true 2> 399837 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard4_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard4_replica_n4/data/] 2> 399839 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard1_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-002/node1/replacesinglenodetest_coll_shard1_replica_n5/data/] 2> 400506 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 400579 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 400579 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.u.UpdateLog 
Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 400580 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 400580 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 400580 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 400581 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 400581 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 400581 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 400581 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 400582 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 400602 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 400602 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 400602 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 400602 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 400602 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 400602 INFO (qtp410531627-4211) 
[n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 400603 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 400603 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 400605 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 400606 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 400630 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 400633 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 400633 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 400636 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 400637 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 400639 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 400640 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 400640 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 400641 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735677189324800 2> 400643 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr 
c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 400644 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 400644 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 400645 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735677193519104 2> 400646 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 400647 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 400648 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735677196664832 2> 400649 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 400651 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735677199810560 2> 400651 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 400652 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735677200859136 2> 401476 INFO (searcherExecutor-2329-thread-1-processing-replacesinglenodetest_coll_shard3_replica_n2 null-1661 core_node7 127.0.0.1:43527_solr replacesinglenodetest_coll shard3) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 401477 INFO (searcherExecutor-2337-thread-1-processing-replacesinglenodetest_coll_shard1_replica_n5 null-1661 core_node10 127.0.0.1:43527_solr replacesinglenodetest_coll shard1) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.SolrCore Registered 
new searcher autowarm time: 0 ms 2> 401488 INFO (searcherExecutor-2331-thread-1-processing-replacesinglenodetest_coll_shard2_replica_n3 null-1661 core_node8 127.0.0.1:43527_solr replacesinglenodetest_coll shard2) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 401490 INFO (searcherExecutor-2333-thread-1-processing-replacesinglenodetest_coll_shard5_replica_n1 null-1661 core_node6 127.0.0.1:43527_solr replacesinglenodetest_coll shard5) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 401498 INFO (searcherExecutor-2335-thread-1-processing-replacesinglenodetest_coll_shard4_replica_n4 null-1661 core_node9 127.0.0.1:43527_solr replacesinglenodetest_coll shard4) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 401501 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacesinglenodetest_coll/terms/shard3 to Terms{values={core_node7=0}, version=0} for registerTerm 2> 401502 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacesinglenodetest_coll/leaders/shard3 2> 401523 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacesinglenodetest_coll/terms/shard5 to Terms{values={core_node6=0}, version=0} for registerTerm 2> 401525 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacesinglenodetest_coll/leaders/shard5 2> 401545 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 401550 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacesinglenodetest_coll/terms/shard2 to Terms{values={core_node8=0}, version=0} for registerTerm 2> 401551 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacesinglenodetest_coll/leaders/shard2 2> 401566 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 401572 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacesinglenodetest_coll/terms/shard4 to Terms{values={core_node9=0}, version=0} for registerTerm 2> 401573 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacesinglenodetest_coll/leaders/shard4 2> 401591 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 401597 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacesinglenodetest_coll/terms/shard1 to Terms{values={core_node10=0}, version=0} for registerTerm 2> 401597 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 401597 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 401597 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 401597 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacesinglenodetest_coll/leaders/shard1 2> 401598 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard5_replica_n1/ 2> 401598 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard3_replica_n2/ 2> 401598 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard2_replica_n3/ 2> 401602 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 401605 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 401609 INFO 
(qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 401612 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.SyncStrategy http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard5_replica_n1/ has no replicas 2> 401612 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacesinglenodetest_coll/leaders/shard5/leader after winning as /collections/replacesinglenodetest_coll/leader_elect/shard5/election/72077816160387076-core_node6-n_0000000000 2> 401619 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.SyncStrategy http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard2_replica_n3/ has no replicas 2> 401619 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacesinglenodetest_coll/leaders/shard2/leader after winning as /collections/replacesinglenodetest_coll/leader_elect/shard2/election/72077816160387076-core_node8-n_0000000000 2> 401630 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 401630 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 401631 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.SyncStrategy http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard3_replica_n2/ has no replicas 2> 401631 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard4_replica_n4/ 2> 401631 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacesinglenodetest_coll/leaders/shard3/leader after winning as /collections/replacesinglenodetest_coll/leader_elect/shard3/election/72077816160387076-core_node7-n_0000000000 2> 401643 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard5_replica_n1/ shard5 2> 401653 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 401654 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 401654 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 401654 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard1_replica_n5/ 2> 401665 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard3_replica_n2/ shard3 2> 401668 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:null-1661] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 401670 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard2_replica_n3/ shard2 2> 401674 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 401679 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.SyncStrategy http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard4_replica_n4/ has no replicas 2> 401680 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacesinglenodetest_coll/leaders/shard4/leader after winning as /collections/replacesinglenodetest_coll/leader_elect/shard4/election/72077816160387076-core_node9-n_0000000000 2> 401682 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:null-1661] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 401685 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.SyncStrategy http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard1_replica_n5/ has no replicas 2> 401686 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacesinglenodetest_coll/leaders/shard1/leader after winning as /collections/replacesinglenodetest_coll/leader_elect/shard1/election/72077816160387076-core_node10-n_0000000000 2> 401720 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr 
c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard1_replica_n5/ shard1 2> 401725 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:null-1661] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 401727 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:43527/solr/replacesinglenodetest_coll_shard4_replica_n4/ shard4 2> 401739 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:null-1661] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 401746 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:null-1661] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 401754 INFO (zkCallback-2309-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacesinglenodetest_coll/state.json zxid: 223] for collection [replacesinglenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 401757 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1661] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&newCollection=true&name=replacesinglenodetest_coll_shard5_replica_n1&action=CREATE&numShards=5&collection=replacesinglenodetest_coll&shard=shard5&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2108 2> 401757 INFO (qtp410531627-4214) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1661] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&newCollection=true&name=replacesinglenodetest_coll_shard3_replica_n2&action=CREATE&numShards=5&collection=replacesinglenodetest_coll&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2107 2> 401765 INFO (zkCallback-2309-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacesinglenodetest_coll/state.json zxid: 224] for collection [replacesinglenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 401766 INFO (qtp410531627-4211) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1661] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node9&collection.configName=conf1&newCollection=true&name=replacesinglenodetest_coll_shard4_replica_n4&action=CREATE&numShards=5&collection=replacesinglenodetest_coll&shard=shard4&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2115 2> 401767 INFO (qtp410531627-4209) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1661] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&newCollection=true&name=replacesinglenodetest_coll_shard2_replica_n3&action=CREATE&numShards=5&collection=replacesinglenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2116 2> 401774 INFO (qtp410531627-4212) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1661] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node10&collection.configName=conf1&newCollection=true&name=replacesinglenodetest_coll_shard1_replica_n5&action=CREATE&numShards=5&collection=replacesinglenodetest_coll&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2122 2> 401780 INFO (qtp410531627-4210) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 401781 INFO (qtp410531627-4210) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s: r: x: t:null-1661] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={pullReplicas=0&collection.configName=conf1&name=replacesinglenodetest_coll&nrtReplicas=1&action=CREATE&numShards=5&tlogReplicas=0&wt=javabin&version=2} status=0 QTime=2449 2> 401783 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForActiveCollection: replacesinglenodetest_coll 2> 401802 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1662] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Running Collection API locally for REPLACENODE asyncId=null 2> 401807 ERROR (DistributedCollectionApiCommandExecutor-2317-thread-2-processing-127.0.0.1:43527_solr null-1662) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1662] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Operation REPLACENODE failed 2> => org.apache.solr.common.SolrException: No nodes other than the source node: 127.0.0.1:43527_solr are live, therefore replicas cannot be moved 2> at org.apache.solr.cloud.api.collections.ReplaceNodeCmd.call(ReplaceNodeCmd.java:70) 2> org.apache.solr.common.SolrException: No nodes other than the source node: 127.0.0.1:43527_solr are live, therefore replicas cannot be moved 2> at org.apache.solr.cloud.api.collections.ReplaceNodeCmd.call(ReplaceNodeCmd.java:70) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.DistributedCollectionConfigSetCommandRunner$CollectionCommandRunner.call(DistributedCollectionConfigSetCommandRunner.java:430) [main/:?] 2> at org.apache.solr.cloud.api.collections.DistributedCollectionConfigSetCommandRunner$CollectionCommandRunner.call(DistributedCollectionConfigSetCommandRunner.java:363) [main/:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?] 
2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 401812 ERROR (qtp410531627-4213) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1662] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: No nodes other than the source node: 127.0.0.1:43527_solr are live, therefore replicas cannot be moved 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) 2> org.apache.solr.common.SolrException: No nodes other than the source node: 127.0.0.1:43527_solr are live, therefore replicas cannot be moved 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.admin.api.ReplaceNode.replaceNode(ReplaceNode.java:74) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.lambda$static$40(CollectionsHandler.java:1185) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.execute(CollectionsHandler.java:1265) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.invokeAction(CollectionsHandler.java:315) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.handleRequestBody(CollectionsHandler.java:293) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:822) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [jetty-io-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 401828 INFO (qtp410531627-4213) [n:127.0.0.1:43527_solr c: s: r: x: t:null-1662] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={sourceNode=127.0.0.1:43527_solr&action=REPLACENODE&wt=javabin&version=2} status=400 QTime=27 2> 401837 INFO (TEST-ReplaceNodeTest.testFailOnSingleNode-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Ending testFailOnSingleNode 2> 401983 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Starting test 2> 402094 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@661ff236{STOPPING}[10.0.19,sto=0] 2> 402097 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@16bfca54{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 402098 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@2853e959{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 402198 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=502179402 2> 402199 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:43527_solr 2> 402204 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 402204 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:43527_solr as DOWN 2> 402206 INFO (zkCallback-2309-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 402213 INFO (zkCallback-2309-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacesinglenodetest_coll/state.json zxid: 228] for collection [replacesinglenodetest_coll] has occurred - updating... 
(live nodes size: [0]) 2> 402220 INFO (coreCloseExecutor-2361-thread-1) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@1cdf4906 replacesinglenodetest_coll_shard3_replica_n2 2> 402220 INFO (coreCloseExecutor-2361-thread-2) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@426c0815 replacesinglenodetest_coll_shard1_replica_n5 2> 402220 INFO (coreCloseExecutor-2361-thread-1) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacesinglenodetest_coll.shard3.replica_n2 tag=SolrCore@1cdf4906 2> 402221 INFO (coreCloseExecutor-2361-thread-3) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2d3570eb replacesinglenodetest_coll_shard2_replica_n3 2> 402221 INFO (coreCloseExecutor-2361-thread-4) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@231826d7 replacesinglenodetest_coll_shard5_replica_n1 2> 402222 INFO (coreCloseExecutor-2361-thread-5) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@3abdfbae replacesinglenodetest_coll_shard4_replica_n4 2> 402486 INFO (coreCloseExecutor-2361-thread-1) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacesinglenodetest_coll.shard3.leader tag=SolrCore@1cdf4906 2> 402487 INFO (coreCloseExecutor-2361-thread-2) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacesinglenodetest_coll.shard1.replica_n5 tag=SolrCore@426c0815 2> 402495 INFO (coreCloseExecutor-2361-thread-1) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard3 r:core_node7 x:replacesinglenodetest_coll_shard3_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 402797 INFO (coreCloseExecutor-2361-thread-2) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacesinglenodetest_coll.shard1.leader tag=SolrCore@426c0815 2> 402797 INFO (coreCloseExecutor-2361-thread-3) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacesinglenodetest_coll.shard2.replica_n3 tag=SolrCore@2d3570eb 2> 402807 INFO (coreCloseExecutor-2361-thread-2) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard1 r:core_node10 x:replacesinglenodetest_coll_shard1_replica_n5 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 403117 INFO (coreCloseExecutor-2361-thread-3) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacesinglenodetest_coll.shard2.leader tag=SolrCore@2d3570eb 2> 403118 INFO (coreCloseExecutor-2361-thread-4) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacesinglenodetest_coll.shard5.replica_n1 tag=SolrCore@231826d7 2> 403126 INFO (coreCloseExecutor-2361-thread-3) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard2 r:core_node8 x:replacesinglenodetest_coll_shard2_replica_n3 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 403425 INFO (coreCloseExecutor-2361-thread-4) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacesinglenodetest_coll.shard5.leader tag=SolrCore@231826d7 2> 403425 INFO (coreCloseExecutor-2361-thread-5) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacesinglenodetest_coll.shard4.replica_n4 tag=SolrCore@3abdfbae 2> 403435 INFO (coreCloseExecutor-2361-thread-4) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard5 r:core_node6 x:replacesinglenodetest_coll_shard5_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 403746 INFO (coreCloseExecutor-2361-thread-5) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacesinglenodetest_coll.shard4.leader tag=SolrCore@3abdfbae 2> 403755 INFO (coreCloseExecutor-2361-thread-5) [n:127.0.0.1:43527_solr c:replacesinglenodetest_coll s:shard4 r:core_node9 x:replacesinglenodetest_coll_shard4_replica_n4 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 403766 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 403905 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 403967 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 403968 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 403973 INFO (closeThreadPool-2362-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816160387076-127.0.0.1:43527_solr-n_0000000000) closing 2> 403974 INFO (OverseerStateUpdate-72077816160387076-127.0.0.1:43527_solr-n_0000000000) [n:127.0.0.1:43527_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:43527_solr 2> 403976 INFO (closeThreadPool-2362-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816160387076-127.0.0.1:43527_solr-n_0000000000) closing 2> 404094 INFO (jetty-closer-2358-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816160387076-127.0.0.1:43527_solr-n_0000000000) closing 2> 404099 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Shutting down ZkTestServer. 2> 404319 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Watch limit violations: 2> Maximum concurrent create/delete watches above limit: 2> 2> 2 /solr/aliases.json 2> 2 /solr/collections/replacesinglenodetest_coll/terms/shard1 2> 2 /solr/collections/replacesinglenodetest_coll/terms/shard3 2> 2 /solr/collections/replacesinglenodetest_coll/terms/shard2 2> 2 /solr/collections/replacesinglenodetest_coll/terms/shard5 2> 2 /solr/collections/replacesinglenodetest_coll/terms/shard4 2> 2 /solr/clusterprops.json 2> 2> Maximum concurrent data watches above limit: 2> 2> 4 /solr/collections/replacesinglenodetest_coll/state.json 2> 2> Maximum concurrent children watches above limit: 2> 2> 4 /solr/live_nodes 2> 4 /solr/collections 2> 2> 404345 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster Starting cluster of 6 servers in /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003 2> 404347 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER 2> 404348 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer client port: 0.0.0.0/0.0.0.0:0 2> 404349 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Starting server 2> 404361 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.z.s.ServerCnxnFactory maxCnxns is not configured, using default value 0. 2> 404448 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer start zk server on port: 35103 2> 404456 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404465 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404479 INFO (zkConnectionManagerCallback-2364-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404479 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404480 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404489 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404497 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404508 INFO (zkConnectionManagerCallback-2366-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404509 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404509 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404510 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404520 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 404534 INFO (zkConnectionManagerCallback-2368-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404534 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404535 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404673 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 404673 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 404673 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 404674 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 404674 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 404674 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 404676 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 404676 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 404676 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 404676 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 404677 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 404677 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 404677 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 404678 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 404678 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 404678 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 404678 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 404680 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 404741 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 404746 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 404752 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 404754 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@15c86ec1{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 404755 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 404756 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3daedd6b{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 404760 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@489f601f{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:36563} 2> 404761 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@58bd38ad{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 404761 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@386126fc{STARTING}[10.0.19,sto=0] @404944ms 2> 404762 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@3e14ddf9{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:34953} 2> 404762 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7d3a376b{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 404763 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@4e9c6e2a{STARTING}[10.0.19,sto=0] @404945ms 2> 404767 ERROR (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 404767 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 404767 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@2c020370{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:37983} 2> 404767 ERROR (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
2> 404768 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 404768 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@5c98a479{STARTING}[10.0.19,sto=0] @404950ms 2> 404767 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@5c41120b{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:34105} 2> 404769 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@2c2b064e{STARTING}[10.0.19,sto=0] @404951ms 2> 404768 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 404769 ERROR (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 404768 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 404770 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 404771 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 404771 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 404771 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 404771 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 404772 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 404772 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 404772 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 404772 ERROR (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
2> 404773 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 404773 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 404774 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 404771 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:58.305876Z 2> 404772 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:58.306868Z 2> 404772 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:58.306630Z 2> 404774 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 404775 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 404775 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:58.309097Z 2> 404777 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node5 (source: servlet config: solr.solr.home) 2> 404777 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node3 (source: servlet config: solr.solr.home) 2> 404778 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 404778 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node6 (source: servlet config: solr.solr.home) 2> 404778 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node2 (source: servlet config: solr.solr.home) 2> 404782 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@57900a9e{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 404784 WARN (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404784 WARN (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404785 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@6618269a{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 404785 WARN (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404786 WARN (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404790 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@1b63c09{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:37265} 2> 404791 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@7101129e{STARTING}[10.0.19,sto=0] @404974ms 2> 404791 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@111b938c{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:33013} 2> 404792 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@613c8803{STARTING}[10.0.19,sto=0] @404975ms 2> 404804 ERROR (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 404804 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 404804 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 404805 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 404806 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 404806 ERROR (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
2> 404806 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:58.340556Z 2> 404806 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 404807 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 404808 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 404808 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 404808 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1 (source: servlet config: solr.solr.home) 2> 404809 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:58.343024Z 2> 404811 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node4 (source: servlet config: solr.solr.home) 2> 404813 WARN (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404818 WARN (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404820 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404830 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404832 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404835 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404839 INFO (zkConnectionManagerCallback-2373-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404840 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404841 WARN (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404843 INFO (zkConnectionManagerCallback-2377-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404844 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404844 WARN (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404845 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 404846 INFO (zkConnectionManagerCallback-2371-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404846 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404847 WARN (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404847 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404848 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 404849 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 404850 INFO (zkConnectionManagerCallback-2375-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404851 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404851 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 404852 WARN (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404856 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 404859 INFO (zkConnectionManagerCallback-2379-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404859 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404860 WARN (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404862 INFO (zkConnectionManagerCallback-2381-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 404863 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 404864 WARN (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 404864 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 404868 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 405063 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35103/solr 2> 405064 INFO (jetty-launcher-2369-thread-5) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
2> 405072 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35103/solr 2> 405073 INFO (jetty-launcher-2369-thread-6) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405076 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35103/solr 2> 405077 INFO (jetty-launcher-2369-thread-3) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405078 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35103/solr 2> 405080 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35103/solr 2> 405080 INFO (jetty-launcher-2369-thread-4) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405081 INFO (jetty-launcher-2369-thread-2) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405081 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 405081 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35103/solr 2> 405083 INFO (jetty-launcher-2369-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
2> 405096 INFO (zkConnectionManagerCallback-2426-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 405096 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 405116 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 405118 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 405120 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 405122 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 405124 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 405127 INFO (zkConnectionManagerCallback-2431-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 405128 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 405129 INFO (zkConnectionManagerCallback-2435-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 405129 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 405131 INFO (zkConnectionManagerCallback-2439-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 405131 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 405133 INFO (zkConnectionManagerCallback-2437-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 405133 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 405137 INFO (zkConnectionManagerCallback-2441-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 405137 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 405264 WARN (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 405265 WARN (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 405272 WARN (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 405272 WARN (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are 
world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 405273 WARN (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 405273 WARN (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 405303 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405303 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405303 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405304 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405304 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405305 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
2> 405325 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:33013_solr 2> 405331 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:34953_solr 2> 405332 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816720195597-127.0.0.1:33013_solr-n_0000000000) starting 2> 405333 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:37265_solr 2> 405333 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:36563_solr 2> 405333 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:37983_solr 2> 405334 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:34105_solr 2> 405351 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 405352 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 405352 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 405353 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 405353 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 405354 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 405375 WARN (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 405375 WARN (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 405375 WARN (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 405376 WARN (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 405379 WARN (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 405393 INFO (OverseerStateUpdate-72077816720195597-127.0.0.1:33013_solr-n_0000000000) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:33013_solr 2> 405393 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33013_solr as DOWN 2> 405397 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:33013_solr 2> 405408 INFO (OverseerStateUpdate-72077816720195597-127.0.0.1:33013_solr-n_0000000000) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (6) 2> 405409 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (6) 2> 405410 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (6) 2> 405410 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (6) 2> 405410 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (6) 2> 405411 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (6) 2> 405431 WARN (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 405436 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 405437 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 405996 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 405997 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 406243 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. 
Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 406244 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 406249 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 406249 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 406257 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 406257 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 406259 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Creating DistributedCollectionConfigSetCommandRunner. Collection and ConfigSet APIs are running distributed (not Overseer based) 2> 406260 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
2> 406589 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node4 2> 406880 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1 2> 407429 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 407508 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node2 2> 407517 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node6 2> 407518 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node3 2> 407595 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node5 2> 407791 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 407833 INFO (jetty-launcher-2369-thread-4) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=33013, zkHost=127.0.0.1:35103/solr} 2> 408003 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 408020 INFO (jetty-launcher-2369-thread-1) [n:127.0.0.1:37265_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=37265, zkHost=127.0.0.1:35103/solr} 2> 408024 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 408035 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 408115 INFO (jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 408545 INFO (jetty-launcher-2369-thread-2) [n:127.0.0.1:36563_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=36563, zkHost=127.0.0.1:35103/solr} 2> 408603 INFO (jetty-launcher-2369-thread-6) [n:127.0.0.1:37983_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=37983, zkHost=127.0.0.1:35103/solr} 2> 408604 INFO (jetty-launcher-2369-thread-3) [n:127.0.0.1:34953_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=34953, zkHost=127.0.0.1:35103/solr} 2> 408672 INFO 
(jetty-launcher-2369-thread-5) [n:127.0.0.1:34105_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=34105, zkHost=127.0.0.1:35103/solr} 2> 408688 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForAllNodes: numServers=6 2> 408689 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:33013_solr 2> 408692 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 408704 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 408717 INFO (zkConnectionManagerCallback-2496-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 408718 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 408718 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 408727 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (6) 2> 408737 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:35103/solr ready 2> 408738 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:37265_solr 2> 408738 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:36563_solr 2> 408739 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:37983_solr 2> 408739 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:34953_solr 2> 408739 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:34105_solr 2> 408740 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 408751 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 45000ms for client to connect to ZooKeeper 2> 408762 INFO (zkConnectionManagerCallback-2498-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 408763 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 408764 WARN (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 408896 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplaceNodeTest total_jettys: 6 2> 408918 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Running Collection API locally for CREATE asyncId=null 2> 408944 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.a.c.CreateCollectionCmd Create collection replacenodetest_coll 2> 409226 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard1_replica_n1", 2> "node_name":"127.0.0.1:34105_solr", 2> "base_url":"http://127.0.0.1:34105/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409238 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard1_replica_n2", 2> "node_name":"127.0.0.1:34953_solr", 2> "base_url":"http://127.0.0.1:34953/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409251 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard2_replica_n3", 2> "node_name":"127.0.0.1:33013_solr", 2> "base_url":"http://127.0.0.1:33013/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409262 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard3_replica_n4", 2> "node_name":"127.0.0.1:37983_solr", 2> "base_url":"http://127.0.0.1:37983/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409273 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard5_replica_n5", 2> "node_name":"127.0.0.1:37265_solr", 2> "base_url":"http://127.0.0.1:37265/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard5", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409285 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr 
c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard5_replica_n6", 2> "node_name":"127.0.0.1:37983_solr", 2> "base_url":"http://127.0.0.1:37983/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard5", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409297 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard4_replica_n7", 2> "node_name":"127.0.0.1:34953_solr", 2> "base_url":"http://127.0.0.1:34953/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard4", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409310 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard4_replica_n8", 2> "node_name":"127.0.0.1:33013_solr", 2> "base_url":"http://127.0.0.1:33013/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard4", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409324 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard2_replica_n9", 2> "node_name":"127.0.0.1:37265_solr", 2> "base_url":"http://127.0.0.1:37265/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409337 INFO (DistributedCollectionApiCommandExecutor-2458-thread-1-processing-127.0.0.1:33013_solr null-1663 replacenodetest_coll) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard3_replica_n10", 2> "node_name":"127.0.0.1:34105_solr", 2> "base_url":"http://127.0.0.1:34105/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 409396 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c: s: r: x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node17&name=replacenodetest_coll_shard4_replica_n7&action=CREATE&numShards=5&shard=shard4&wt=javabin 2> 409396 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c: s: r: x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node11&name=replacenodetest_coll_shard1_replica_n1&action=CREATE&numShards=5&shard=shard1&wt=javabin 2> 409397 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c: s: r: x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command 
qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node13&name=replacenodetest_coll_shard2_replica_n3&action=CREATE&numShards=5&shard=shard2&wt=javabin 2> 409397 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c: s: r: x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node20&name=replacenodetest_coll_shard3_replica_n10&action=CREATE&numShards=5&shard=shard3&wt=javabin 2> 409397 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c: s: r: x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node12&name=replacenodetest_coll_shard1_replica_n2&action=CREATE&numShards=5&shard=shard1&wt=javabin 2> 409409 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c: s: r: x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node16&name=replacenodetest_coll_shard5_replica_n6&action=CREATE&numShards=5&shard=shard5&wt=javabin 2> 409410 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c: s: r: x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node14&name=replacenodetest_coll_shard3_replica_n4&action=CREATE&numShards=5&shard=shard3&wt=javabin 2> 409411 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c: s: r: x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node18&name=replacenodetest_coll_shard4_replica_n8&action=CREATE&numShards=5&shard=shard4&wt=javabin 2> 409411 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node19&name=replacenodetest_coll_shard2_replica_n9&action=CREATE&numShards=5&shard=shard2&wt=javabin 2> 409416 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=replacenodetest_coll&version=2&replicaType=NRT&coreNodeName=core_node15&name=replacenodetest_coll_shard5_replica_n5&action=CREATE&numShards=5&shard=shard5&wt=javabin 2> 409588 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409588 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409596 
INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409613 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409623 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409626 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409626 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409626 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409627 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409629 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409629 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409636 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409642 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 409644 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409647 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409648 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409675 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409675 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409677 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409677 INFO 
(qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 409713 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409714 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard3_replica_n10' using configuration from configset conf1, trusted=true 2> 409714 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409714 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard3_replica_n4' using configuration from configset conf1, trusted=true 2> 409716 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409716 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard2_replica_n9' using configuration from configset conf1, trusted=true 2> 409718 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node6/replacenodetest_coll_shard3_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node6/replacenodetest_coll_shard3_replica_n4/data/] 2> 409718 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409718 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard2_replica_n3' using configuration from configset conf1, trusted=true 2> 409719 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node5/replacenodetest_coll_shard3_replica_n10], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node5/replacenodetest_coll_shard3_replica_n10/data/] 2> 409720 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 
t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409720 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard2_replica_n9], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard2_replica_n9/data/] 2> 409720 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard1_replica_n2' using configuration from configset conf1, trusted=true 2> 409722 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node4/replacenodetest_coll_shard2_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node4/replacenodetest_coll_shard2_replica_n3/data/] 2> 409722 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409723 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard4_replica_n8' using configuration from configset conf1, trusted=true 2> 409724 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409724 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard5_replica_n5' using configuration from configset conf1, trusted=true 2> 409725 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node3/replacenodetest_coll_shard1_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node3/replacenodetest_coll_shard1_replica_n2/data/] 2> 409725 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409726 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard5_replica_n6' using 
configuration from configset conf1, trusted=true 2> 409727 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node4/replacenodetest_coll_shard4_replica_n8], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node4/replacenodetest_coll_shard4_replica_n8/data/] 2> 409728 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409728 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard5_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard5_replica_n5/data/] 2> 409728 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 409734 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node6/replacenodetest_coll_shard5_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node6/replacenodetest_coll_shard5_replica_n6/data/] 2> 409736 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node5/replacenodetest_coll_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node5/replacenodetest_coll_shard1_replica_n1/data/] 2> 409736 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 409737 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard4_replica_n7' using configuration from configset conf1, trusted=true 2> 409742 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.c.SolrCore Opening new SolrCore at 
[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node3/replacenodetest_coll_shard4_replica_n7], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node3/replacenodetest_coll_shard4_replica_n7/data/] 2> 411195 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 411210 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 411231 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 411257 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411257 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411257 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411257 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411276 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 411277 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 411286 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411286 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411287 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411287 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411289 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 
t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411289 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411289 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411290 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411309 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411310 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411317 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411317 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411320 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411321 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411322 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411322 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411323 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411326 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411329 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 
t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411333 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411333 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411334 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688401747968 2> 411335 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411335 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411341 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411354 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688422719488 2> 411357 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411360 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411361 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411361 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411362 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411375 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411379 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411380 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll 
s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688449982464 2> 411383 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411390 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688460468224 2> 411392 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411392 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411395 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411396 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411406 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411406 INFO (searcherExecutor-2506-thread-1-processing-replacenodetest_coll_shard2_replica_n3 null-1663 core_node13 127.0.0.1:33013_solr replacenodetest_coll shard2) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411406 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411408 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 411409 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 411413 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411414 INFO 
(searcherExecutor-2510-thread-1-processing-replacenodetest_coll_shard4_replica_n8 null-1663 core_node18 127.0.0.1:33013_solr replacenodetest_coll shard4) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411419 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411420 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411426 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411431 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411431 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411434 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411437 INFO (searcherExecutor-2501-thread-1-processing-replacenodetest_coll_shard3_replica_n4 null-1663 core_node14 127.0.0.1:37983_solr replacenodetest_coll shard3) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411436 INFO (searcherExecutor-2514-thread-1-processing-replacenodetest_coll_shard5_replica_n6 null-1663 core_node16 127.0.0.1:37983_solr replacenodetest_coll shard5) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411437 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411438 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688510799872 2> 411438 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411438 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard4 to Terms{values={core_node18=0}, version=0} for registerTerm 2> 411438 INFO (qtp259113879-4343) 
[n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688510799872 2> 411438 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411439 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard4 2> 411440 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411441 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411445 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 411445 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 411450 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard5 to Terms{values={core_node16=0}, version=0} for registerTerm 2> 411451 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard5 2> 411477 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node13=0}, version=0} for registerTerm 2> 411481 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard2 2> 411482 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411485 INFO (searcherExecutor-2512-thread-1-processing-replacenodetest_coll_shard5_replica_n5 null-1663 core_node15 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411488 INFO (searcherExecutor-2504-thread-1-processing-replacenodetest_coll_shard2_replica_n9 null-1663 core_node19 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr 
c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411488 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411488 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411491 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411494 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411494 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 411495 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411500 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411500 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard3 to Terms{values={core_node14=0}, version=0} for registerTerm 2> 411500 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 411505 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard3 2> 411506 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688582103040 2> 411506 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411508 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent 
updates, using new clock 1788735688584200192 2> 411509 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411510 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688586297344 2> 411512 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 411513 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735688589443072 2> 411514 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard5 to Terms{values={core_node16=0, core_node15=0}, version=1} for registerTerm 2> 411515 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard5 2> 411538 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ShardLeaderElectionContext Waiting until we see more replicas up for shard shard4: total=2 found=1 timeoute in=9980ms 2> 411562 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 411563 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 411563 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/ 2> 411569 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node13=0, core_node19=0}, version=1} for registerTerm 2> 411573 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard2 2> 411577 INFO (searcherExecutor-2508-thread-1-processing-replacenodetest_coll_shard1_replica_n2 null-1663 core_node12 127.0.0.1:34953_solr replacenodetest_coll shard1) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411578 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard5_replica_n6 url=http://127.0.0.1:37983/solr START replicas=[http://127.0.0.1:37265/solr/replacenodetest_coll_shard5_replica_n5/] nUpdates=100 2> 411579 INFO (searcherExecutor-2502-thread-1-processing-replacenodetest_coll_shard3_replica_n10 null-1663 core_node20 127.0.0.1:34105_solr replacenodetest_coll shard3) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411580 INFO (searcherExecutor-2516-thread-1-processing-replacenodetest_coll_shard1_replica_n1 null-1663 core_node11 127.0.0.1:34105_solr replacenodetest_coll shard1) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411582 INFO (searcherExecutor-2518-thread-1-processing-replacenodetest_coll_shard4_replica_n7 null-1663 core_node17 127.0.0.1:34953_solr replacenodetest_coll shard4) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 411584 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard5_replica_n6 url=http://127.0.0.1:37983/solr DONE. We have no versions. sync failed. 2> 411591 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 411591 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 411592 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/ 2> 411598 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard2_replica_n3 url=http://127.0.0.1:33013/solr START replicas=[http://127.0.0.1:37265/solr/replacenodetest_coll_shard2_replica_n9/] nUpdates=100 2> 411599 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard1 to Terms{values={core_node12=0}, version=0} for registerTerm 2> 411599 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard1 2> 411603 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 411603 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard2_replica_n3 url=http://127.0.0.1:33013/solr DONE. We have no versions. sync failed. 
2> 411607 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ShardLeaderElectionContext Waiting until we see more replicas up for shard shard3: total=2 found=1 timeoute in=9988ms 2> 411608 INFO (qtp259113879-4333) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 x:replacenodetest_coll_shard2_replica_n9 t:null-1663] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 411608 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1663] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=8 2> 411610 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard1 to Terms{values={core_node12=0, core_node11=0}, version=1} for registerTerm 2> 411611 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard1 2> 411612 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 411612 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 411613 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 411613 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 411613 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard5/leader after winning as /collections/replacenodetest_coll/leader_elect/shard5/election/72077816720195594-core_node16-n_0000000000 2> 411613 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard2/leader after winning as /collections/replacenodetest_coll/leader_elect/shard2/election/72077816720195597-core_node13-n_0000000000 2> 411627 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 
r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard4 to Terms{values={core_node18=0, core_node17=0}, version=1} for registerTerm 2> 411634 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard4 2> 411641 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/ shard5 2> 411646 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard3 to Terms{values={core_node14=0, core_node20=0}, version=1} for registerTerm 2> 411650 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard3 2> 411655 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/ shard2 2> 411660 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1663] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 411669 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1663] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 411671 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 411672 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 411672 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:34953/solr/replacenodetest_coll_shard1_replica_n2/ 2> 411679 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard1_replica_n2 url=http://127.0.0.1:34953/solr START replicas=[http://127.0.0.1:34105/solr/replacenodetest_coll_shard1_replica_n1/] nUpdates=100 2> 411684 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard1_replica_n2 url=http://127.0.0.1:34953/solr DONE. We have no versions. sync failed. 
2> 411702 INFO (qtp1384576555-4320) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1663] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 411703 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node16&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard5_replica_n6&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard5&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2307 2> 411709 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 411709 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 411710 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard1/leader after winning as /collections/replacenodetest_coll/leader_elect/shard1/election/72077816720195595-core_node12-n_0000000000 2> 411715 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node13&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard2_replica_n3&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2322 2> 411736 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:34953/solr/replacenodetest_coll_shard1_replica_n2/ shard1 2> 411736 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 860] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 411737 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 860] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 411749 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1663] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 411762 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 861] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 411762 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 861] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 411790 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node12&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard1_replica_n2&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2399 2> 412042 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 412042 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 412042 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:33013/solr/replacenodetest_coll_shard4_replica_n8/ 2> 412046 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard4_replica_n8 url=http://127.0.0.1:33013/solr START replicas=[http://127.0.0.1:34953/solr/replacenodetest_coll_shard4_replica_n7/] nUpdates=100 2> 412050 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard4_replica_n8 url=http://127.0.0.1:33013/solr DONE. We have no versions. sync failed. 
2> 412055 INFO (qtp819144083-4334) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1663] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 412058 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 412059 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 412059 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard4/leader after winning as /collections/replacenodetest_coll/leader_elect/shard4/election/72077816720195597-core_node18-n_0000000000 2> 412073 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 863] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412073 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:33013/solr/replacenodetest_coll_shard4_replica_n8/ shard4 2> 412073 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 863] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412073 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 863] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412078 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1663] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 412086 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 864] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412087 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 864] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 412087 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 864] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412100 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node18&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard4_replica_n8&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard4&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2706 2> 412111 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 412112 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 412112 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:37983/solr/replacenodetest_coll_shard3_replica_n4/ 2> 412116 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard3_replica_n4 url=http://127.0.0.1:37983/solr START replicas=[http://127.0.0.1:34105/solr/replacenodetest_coll_shard3_replica_n10/] nUpdates=100 2> 412118 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.u.PeerSync PeerSync: core=replacenodetest_coll_shard3_replica_n4 url=http://127.0.0.1:37983/solr DONE. We have no versions. sync failed. 
2> 412133 INFO (qtp1384576555-4346) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1663] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 412136 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 412137 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 412137 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/replacenodetest_coll/leaders/shard3/leader after winning as /collections/replacenodetest_coll/leader_elect/shard3/election/72077816720195594-core_node14-n_0000000000 2> 412152 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 866] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412152 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:37983/solr/replacenodetest_coll_shard3_replica_n4/ shard3 2> 412152 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 866] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412152 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 866] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412157 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1663] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 412166 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 867] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412166 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 867] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 412166 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 867] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412178 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node14&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard3_replica_n4&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2782 2> 412586 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 868] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412586 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 868] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412586 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 868] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412616 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node15&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard5_replica_n5&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard5&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3217 2> 412618 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 869] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412618 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 869] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412618 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 869] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412618 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 869] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 412629 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node19&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard2_replica_n9&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3226 2> 412674 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 870] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412674 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 870] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412674 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 870] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412674 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 870] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412687 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node17&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard4_replica_n7&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard4&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3296 2> 412690 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 871] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412690 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 871] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412690 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 871] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412690 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 871] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 412701 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 872] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412701 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 872] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412701 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 872] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412701 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 872] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412715 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node20&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard3_replica_n10&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3324 2> 412724 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c: s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node11&collection.configName=conf1&newCollection=true&name=replacenodetest_coll_shard1_replica_n1&action=CREATE&numShards=5&collection=replacenodetest_coll&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3333 2> 412731 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. 
Check all shard replicas 2> 412732 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c:replacenodetest_coll s: r: x: t:null-1663] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={pullReplicas=0&collection.configName=conf1&name=replacenodetest_coll&nrtReplicas=2&action=CREATE&numShards=5&tlogReplicas=0&createNodeSet=127.0.0.1:37265_solr,127.0.0.1:34105_solr,127.0.0.1:33013_solr,127.0.0.1:37983_solr,127.0.0.1:34953_solr&wt=javabin&version=2} status=0 QTime=3820 2> 412734 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForActiveCollection: replacenodetest_coll 2> 412741 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplaceNodeTest excluded_node : 127.0.0.1:36563_solr 2> 412746 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Running Collection API locally for REPLACENODE asyncId=000 2> 412765 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={async=000&sourceNode=127.0.0.1:37265_solr&action=REPLACENODE&targetNode=127.0.0.1:36563_solr&wt=javabin&version=2} status=0 QTime=20 2> 412769 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.ReplicaMigrationUtils Going to create replica for collection=replacenodetest_coll shard=shard2 on node=127.0.0.1:36563_solr 2> 412778 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:36563_solr for creating new replica of shard shard2 for collection replacenodetest_coll 2> 412783 INFO (qtp534543879-4348) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1665] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=13 2> 412793 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 412797 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard2_replica_n21", 2> "node_name":"127.0.0.1:36563_solr", 2> "base_url":"http://127.0.0.1:36563/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 412814 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 882] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412814 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 882] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 412814 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 882] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412814 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 882] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412814 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 882] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412844 INFO (qtp1180210852-4327) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={async=000308582088778900&qt=/admin/cores&coreNodeName=core_node22&collection.configName=conf1&name=replacenodetest_coll_shard2_replica_n21&action=CREATE&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4 2> 412845 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.h.a.CoreAdminOperation core create command async=000308582088778900&qt=/admin/cores&coreNodeName=core_node22&collection.configName=conf1&name=replacenodetest_coll_shard2_replica_n21&action=CREATE&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 412852 INFO (qtp1180210852-4329) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308582088778900 2> 412856 INFO (qtp1180210852-4329) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308582088778900&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=6 2> 412876 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 883] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412876 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 883] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412876 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 883] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 412876 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 883] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412876 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 883] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 412929 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 412951 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 413598 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 413599 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard2_replica_n21' using configuration from configset conf1, trusted=true 2> 413602 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node2/replacenodetest_coll_shard2_replica_n21], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node2/replacenodetest_coll_shard2_replica_n21/data/] 2> 413803 INFO (qtp534543879-4352) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1666] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=13 2> 413865 INFO (qtp1180210852-4337) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308582088778900 2> 413865 INFO (qtp1180210852-4337) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308582088778900&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 414187 INFO 
(parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 414241 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 414241 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 414260 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 414260 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 414279 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 414282 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 414287 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 414288 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735691499241472 2> 414324 INFO (searcherExecutor-2560-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 
000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 414330 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node13=0, core_node19=0, core_node22=0}, version=2} for registerTerm 2> 414331 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard2 2> 414355 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-1-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.ZkController Core needs to recover:replacenodetest_coll_shard2_replica_n21 2> 414359 INFO (updateExecutor-2415-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.DefaultSolrCoreState Running recovery 2> 414363 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 414364 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 414411 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1667] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=25 2> 414411 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1667] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} status=0 QTime=26 2> 414413 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[replacenodetest_coll_shard2_replica_n21] 2> 414417 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 414418 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Publishing state of core [replacenodetest_coll_shard2_replica_n21] as recovering, leader is [http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/] and I am [http://127.0.0.1:36563/solr/replacenodetest_coll_shard2_replica_n21/] 2> 414427 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 890] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414427 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 890] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414427 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 890] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 414427 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 890] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414427 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 890] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414427 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 890] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414429 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:33013/solr]; [WaitForState: action=PREPRECOVERY&core=replacenodetest_coll_shard2_replica_n3&nodeName=127.0.0.1:36563_solr&coreNodeName=core_node22&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 414435 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c: s: r: x:replacenodetest_coll_shard2_replica_n3 t:null-1668] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node22, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 414436 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1668] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=replacenodetest_coll, shard=shard2, thisCore=replacenodetest_coll_shard2_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:36563_solr, coreNodeName=core_node22, onlyIfActiveCheckResult=false, nodeProps: core_node22:{ 2> "core":"replacenodetest_coll_shard2_replica_n21", 2> "node_name":"127.0.0.1:36563_solr", 2> "base_url":"http://127.0.0.1:36563/solr", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 414436 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1668] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:36563_solr&onlyIfLeaderActive=true&core=replacenodetest_coll_shard2_replica_n3&coreNodeName=core_node22&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=3 2> 414822 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1669] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=12 2> 414873 INFO (qtp1180210852-4339) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308582088778900 2> 414874 INFO (qtp1180210852-4339) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308582088778900&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 414877 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.ReplicaMigrationUtils Going to create replica for collection=replacenodetest_coll shard=shard5 on node=127.0.0.1:36563_solr 2> 414883 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:36563_solr for creating new replica of shard shard5 for collection replacenodetest_coll 2> 414895 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 414902 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard5_replica_n23", 2> "node_name":"127.0.0.1:36563_solr", 2> "base_url":"http://127.0.0.1:36563/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard5", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 414919 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 893] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414919 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 893] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 414919 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 893] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414920 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 893] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414920 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 893] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414919 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 893] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414938 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/] - recoveringAfterStartup=[true] 2> 414939 INFO (qtp1180210852-4317) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={async=000308584190838394&qt=/admin/cores&coreNodeName=core_node24&collection.configName=conf1&name=replacenodetest_coll_shard5_replica_n23&action=CREATE&collection=replacenodetest_coll&shard=shard5&wt=javabin&version=2&replicaType=NRT} status=0 QTime=10 2> 414940 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.h.a.CoreAdminOperation core create command async=000308584190838394&qt=/admin/cores&coreNodeName=core_node24&collection.configName=conf1&name=replacenodetest_coll_shard5_replica_n23&action=CREATE&collection=replacenodetest_coll&shard=shard5&wt=javabin&version=2&replicaType=NRT 2> 414941 WARN (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 414942 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 
2> 414942 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 414942 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/]. 2> 414946 INFO (qtp1180210852-4319) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308584190838394 2> 414947 INFO (qtp1180210852-4319) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308584190838394&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 414954 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 894] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414954 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 894] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414954 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 894] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414954 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 894] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414954 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 894] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 414954 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 894] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 414989 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 414995 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 415072 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 415072 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard5_replica_n23' using configuration from configset conf1, trusted=true 2> 415076 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node2/replacenodetest_coll_shard5_replica_n23], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node2/replacenodetest_coll_shard5_replica_n23/data/] 2> 415084 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard4_replica_n8/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 61 2> 415087 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 86 2> 415106 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update 
params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:34953/solr/replacenodetest_coll_shard1_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 12 2> 415106 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard3_replica_n4/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 15 2> 415109 INFO (qtp583399768-4321) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 110 2> 415109 INFO (qtp819144083-4322) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 98 2> 415114 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node15 x:replacenodetest_coll_shard5_replica_n5 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 16 2> 415116 INFO (qtp583399768-4325) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 115 2> 415170 INFO (qtp1180210852-4337) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1670] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 415170 INFO (qtp1180210852-4337) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 29 2> 415172 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node19 
x:replacenodetest_coll_shard2_replica_n9 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 30 2> 415175 INFO (qtp534543879-4348) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1670] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 229 2> 415187 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1671] o.a.s.c.S.Request webapp=/solr path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 415189 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.h.IndexFetcher Leader's generation: 1 2> 415189 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.h.IndexFetcher Leader's version: 0 2> 415189 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.h.IndexFetcher Follower's generation: 1 2> 415189 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.h.IndexFetcher Follower's version: 0 2> 415189 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy No replay needed. 2> 415192 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 
2> 415193 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 415201 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 895] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415202 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 415202 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 895] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415202 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 895] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415201 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 895] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415202 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735692457639936 2> 415201 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 895] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415201 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 895] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 415211 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=847.0 2> 415211 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard2_replica_n21 null-1664 000308582088778900 core_node22 create 127.0.0.1:36563_solr replacenodetest_coll shard2) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1664] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=847.0 2> 415408 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 415408 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 415427 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 415427 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 415445 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 415449 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 415454 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 415455 INFO 
(parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735692722929664 2> 415483 INFO (searcherExecutor-2567-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 415490 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard5 to Terms{values={core_node24=0, core_node16=0, core_node15=0}, version=2} for registerTerm 2> 415500 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard5 2> 415519 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-2-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 create) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.ZkController Core needs to recover:replacenodetest_coll_shard5_replica_n23 2> 415520 INFO (updateExecutor-2415-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.DefaultSolrCoreState Running recovery 2> 415521 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 415521 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 415534 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1672] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=4 2> 415534 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1672] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} status=0 QTime=5 2> 415536 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[replacenodetest_coll_shard5_replica_n23] 2> 415540 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 415540 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Publishing state of core [replacenodetest_coll_shard5_replica_n23] as recovering, leader is [http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/] and I am [http://127.0.0.1:36563/solr/replacenodetest_coll_shard5_replica_n23/] 2> 415550 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 902] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415550 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 902] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415550 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 902] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 415550 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 902] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415550 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 902] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415550 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 902] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 415553 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:37983/solr]; [WaitForState: action=PREPRECOVERY&core=replacenodetest_coll_shard5_replica_n6&nodeName=127.0.0.1:36563_solr&coreNodeName=core_node24&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 415562 INFO (qtp583399768-4341) [n:127.0.0.1:37983_solr c: s: r: x:replacenodetest_coll_shard5_replica_n6 t:null-1673] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node24, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 415562 INFO (qtp583399768-4341) [n:127.0.0.1:37983_solr c: s: r: x: t:null-1673] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=replacenodetest_coll, shard=shard5, thisCore=replacenodetest_coll_shard5_replica_n6, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:36563_solr, coreNodeName=core_node24, onlyIfActiveCheckResult=false, nodeProps: core_node24:{ 2> "core":"replacenodetest_coll_shard5_replica_n23", 2> "node_name":"127.0.0.1:36563_solr", 2> "base_url":"http://127.0.0.1:36563/solr", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 415563 INFO (qtp583399768-4341) [n:127.0.0.1:37983_solr c: s: r: x: t:null-1673] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:36563_solr&onlyIfLeaderActive=true&core=replacenodetest_coll_shard5_replica_n6&coreNodeName=core_node24&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=5 2> 415839 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1674] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=11 2> 415955 INFO (qtp1180210852-4339) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308584190838394 2> 415955 INFO (qtp1180210852-4339) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308584190838394&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 415958 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.ReplicaMigrationUtils Deleting replica type=NRT for collection=replacenodetest_coll shard=shard2 on node=127.0.0.1:37265_solr 2> 415966 INFO (DistributedCollectionApiCommandExecutor-2458-thread-2-processing-127.0.0.1:33013_solr null-1664) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1664] o.a.s.c.a.c.ReplicaMigrationUtils Deleting replica type=NRT for collection=replacenodetest_coll shard=shard5 on node=127.0.0.1:37265_solr 2> 415971 INFO (qtp259113879-4353) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n9 t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=replacenodetest_coll_shard2_replica_n9&async=000308585228661931&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2&deleteIndex=true} status=0 QTime=4 2> 415974 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-1-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n9 null-1664 000308585228661931 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n9 t:null-1664] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n9 tag=null 2> 415987 INFO (qtp259113879-4355) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n5 t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=replacenodetest_coll_shard5_replica_n5&async=000308585236821211&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2&deleteIndex=true} status=0 QTime=13 2> 415988 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308585228661931 2> 415988 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores 
params={qt=/admin/cores&requestid=000308585228661931&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 415995 INFO (qtp259113879-4333) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308585236821211 2> 415995 INFO (qtp259113879-4333) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308585236821211&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 416065 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/] - recoveringAfterStartup=[true] 2> 416068 WARN (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 416069 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 416069 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 416069 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/]. 
2> 416146 INFO (qtp1180210852-4317) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 16 2> 416160 INFO (qtp1384576555-4326) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard3_replica_n4/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 30 2> 416156 INFO (qtp819144083-4314) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard4_replica_n8/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 26 2> 416164 INFO (qtp1384576555-4320) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:34953/solr/replacenodetest_coll_shard1_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 35 2> 416166 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 58 2> 416166 INFO (qtp534543879-4348) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 55 2> 416167 INFO (qtp819144083-4342) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 60 2> 416193 ERROR (updateExecutor-2416-thread-4-processing-replacenodetest_coll_shard2_replica_n3 null-1675 core_node13 127.0.0.1:33013_solr replacenodetest_coll 
shard2) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1675] o.a.s.u.SolrCmdDistributor Exception making request 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard2_replica_n9/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A33013%2Fsolr%2Freplacenodetest_coll_shard2_replica_n3%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 2> 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard2_replica_n9/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A33013%2Fsolr%2Freplacenodetest_coll_shard2_replica_n3%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 2> 2> 2> Error 404 Can not find: /solr/replacenodetest_coll_shard2_replica_n9/update 2> 2>

2> HTTP ERROR 404 Can not find: /solr/replacenodetest_coll_shard2_replica_n9/update
2> URI: /solr/replacenodetest_coll_shard2_replica_n9/update
2> STATUS: 404
2> MESSAGE: Can not find: /solr/replacenodetest_coll_shard2_replica_n9/update
2> SERVLET: org.apache.solr.embedded.JettySolrRunner$Servlet404-416738cf
2> Powered by Jetty:// 10.0.19
2> 2> 2> 2> 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.checkContentType(Http2SolrClient.java:974) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:863) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:533) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.request(ConcurrentUpdateHttp2SolrClient.java:383) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.update.SolrCmdDistributor.doRequest(SolrCmdDistributor.java:372) [main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.lambda$submit$0(SolrCmdDistributor.java:361) [main/:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 416202 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1675] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 416205 WARN (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1675] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:37265/solr 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard2_replica_n9/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A33013%2Fsolr%2Freplacenodetest_coll_shard2_replica_n3%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 2> 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard2_replica_n9/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A33013%2Fsolr%2Freplacenodetest_coll_shard2_replica_n3%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 2> 2> 2> Error 404 Can not find: /solr/replacenodetest_coll_shard2_replica_n9/update 2> 2>

2> HTTP ERROR 404 Can not find: /solr/replacenodetest_coll_shard2_replica_n9/update
2> URI: /solr/replacenodetest_coll_shard2_replica_n9/update
2> STATUS: 404
2> MESSAGE: Can not find: /solr/replacenodetest_coll_shard2_replica_n9/update
2> SERVLET: org.apache.solr.embedded.JettySolrRunner$Servlet404-416738cf
2> Powered by Jetty:// 10.0.19
2> 2> 2> 2> 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.checkContentType(Http2SolrClient.java:974) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:863) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:533) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.request(ConcurrentUpdateHttp2SolrClient.java:383) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.update.SolrCmdDistributor.doRequest(SolrCmdDistributor.java:372) ~[main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.lambda$submit$0(SolrCmdDistributor.java:361) ~[main/:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 416215 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 106 2> 416242 ERROR (updateExecutor-2413-thread-1-processing-replacenodetest_coll_shard5_replica_n6 null-1675 core_node16 127.0.0.1:37983_solr replacenodetest_coll shard5) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1675] o.a.s.u.SolrCmdDistributor Exception making request 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard5_replica_n5/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A37983%2Fsolr%2Freplacenodetest_coll_shard5_replica_n6%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 
2> 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard5_replica_n5/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A37983%2Fsolr%2Freplacenodetest_coll_shard5_replica_n6%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 2> 2> 2> Error 404 Can not find: /solr/replacenodetest_coll_shard5_replica_n5/update 2> 2>

2> HTTP ERROR 404 Can not find: /solr/replacenodetest_coll_shard5_replica_n5/update
2> URI: /solr/replacenodetest_coll_shard5_replica_n5/update
2> STATUS: 404
2> MESSAGE: Can not find: /solr/replacenodetest_coll_shard5_replica_n5/update
2> SERVLET: org.apache.solr.embedded.JettySolrRunner$Servlet404-416738cf
2> Powered by Jetty:// 10.0.19
2> 2> 2> 2> 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.checkContentType(Http2SolrClient.java:974) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:863) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:533) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.request(ConcurrentUpdateHttp2SolrClient.java:383) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.update.SolrCmdDistributor.doRequest(SolrCmdDistributor.java:372) [main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.lambda$submit$0(SolrCmdDistributor.java:361) [main/:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> 416250 INFO (qtp1180210852-4329) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1675] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 416250 INFO (qtp1180210852-4329) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 416252 INFO (qtp583399768-4344) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1675] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 416253 WARN (qtp583399768-4344) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1675] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:37265/solr 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard5_replica_n5/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A37983%2Fsolr%2Freplacenodetest_coll_shard5_replica_n6%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 2> 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:37265/solr/replacenodetest_coll_shard5_replica_n5/update?update.distrib=FROMLEADER&distrib.from=http%3A%2F%2F127.0.0.1%3A37983%2Fsolr%2Freplacenodetest_coll_shard5_replica_n6%2F: Expected mime type in [application/vnd.apache.solr.javabin, application/octet-stream] but got text/html. 2> 2> 2> Error 404 Can not find: /solr/replacenodetest_coll_shard5_replica_n5/update 2> 2>

2> HTTP ERROR 404 Can not find: /solr/replacenodetest_coll_shard5_replica_n5/update
2> URI: /solr/replacenodetest_coll_shard5_replica_n5/update
2> STATUS: 404
2> MESSAGE: Can not find: /solr/replacenodetest_coll_shard5_replica_n5/update
2> SERVLET: org.apache.solr.embedded.JettySolrRunner$Servlet404-416738cf
2> Powered by Jetty:// 10.0.19
2> 2> 2> 2> 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.checkContentType(Http2SolrClient.java:974) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:863) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.request(Http2SolrClient.java:533) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.request(ConcurrentUpdateHttp2SolrClient.java:383) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.update.SolrCmdDistributor.doRequest(SolrCmdDistributor.java:372) ~[main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.lambda$submit$0(SolrCmdDistributor.java:361) ~[main/:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> 416259 INFO (qtp583399768-4344) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1675] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 186 2> 416273 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1676] o.a.s.c.S.Request webapp=/solr path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=3 2> 416275 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.h.IndexFetcher Leader's generation: 1 2> 416275 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.h.IndexFetcher Leader's version: 0 2> 416275 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.h.IndexFetcher Follower's generation: 1 2> 416276 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.h.IndexFetcher Follower's version: 0 2> 416276 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy No replay needed. 2> 416284 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 416285 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 416294 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 903] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 416294 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 903] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416295 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 903] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416295 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 903] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416294 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 903] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416294 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 903] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416294 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 
2> 416299 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-2-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n5 null-1664 000308585236821211 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n5 t:null-1664] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard5.replica_n5 tag=null 2> 416299 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-1-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n9 null-1664 000308585228661931 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n9 t:null-1664] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@61ae9314 replacenodetest_coll_shard2_replica_n9 2> 416299 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735693607927808 2> 416307 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=785.0 2> 416308 INFO (recoveryExecutor-2432-thread-1-processing-replacenodetest_coll_shard5_replica_n23 null-1664 000308584190838394 core_node24 create 127.0.0.1:36563_solr replacenodetest_coll shard5) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1664] o.a.s.c.RecoveryStrategy Finished recovery process. 
recoveringAfterStartup=true msTimeTaken=786.0 2> 416539 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-1-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n9 null-1664 000308585228661931 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n9 t:null-1664] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n9 tag=SolrCore@61ae9314 2> 416542 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-2-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n5 null-1664 000308585236821211 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n5 t:null-1664] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@49c26ee0 replacenodetest_coll_shard5_replica_n5 2> 416542 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-1-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n9 null-1664 000308585228661931 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n9 t:null-1664] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard2.leader tag=SolrCore@61ae9314 2> 416543 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-2-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n5 null-1664 000308585236821211 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n5 t:null-1664] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard5.replica_n5 tag=SolrCore@49c26ee0 2> 416543 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-2-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n5 null-1664 000308585236821211 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n5 t:null-1664] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard5.leader tag=SolrCore@49c26ee0 2> 416546 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-1-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n9 null-1664 000308585228661931 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n9 t:null-1664] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 416546 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-2-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n5 null-1664 000308585236821211 unload) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n5 t:null-1664] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 416589 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-1-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n9 null-1664 000308585228661931 unload) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node22=0, core_node13=0}, version=3} for removeTerm 2> 416595 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-2-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n5 null-1664 000308585236821211 unload) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard5 to Terms{values={core_node24=0, core_node16=0}, version=3} for removeTerm 2> 416602 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-1-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n9 null-1664 000308585228661931 unload) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 416607 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-2-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n5 null-1664 000308585236821211 unload) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 416611 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 908] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416611 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 908] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416611 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 908] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416611 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 908] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416611 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 908] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416617 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 909] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 416617 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 909] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416617 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 909] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416618 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 909] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416617 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 909] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 416858 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1677] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=12 2> 416995 INFO (qtp259113879-4353) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308585228661931 2> 416996 INFO (qtp259113879-4353) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308585228661931&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 417002 INFO (qtp259113879-4355) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.h.a.CoreAdminOperation Checking request status for : 000308585236821211 2> 417002 INFO (qtp259113879-4355) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1664] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=000308585236821211&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 417873 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1678] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=9 2> 417886 INFO (qtp534543879-4352) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1679] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=000&action=DELETESTATUS&wt=javabin&version=2} status=0 QTime=8 2> 417904 INFO (qtp259113879-4333) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1680] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={action=STATUS&indexInfo=true&wt=javabin&version=2} status=0 QTime=1 2> 422925 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.DistributedCollectionConfigSetCommandRunner Running Collection API locally for REPLACENODE asyncId=001 2> 422945 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={async=001&sourceNode=127.0.0.1:36563_solr&parallel=true&action=REPLACENODE&waitForFinalState=true&targetNode=127.0.0.1:37265_solr&wt=javabin&version=2} status=0 QTime=22 2> 
422948 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.ReplicaMigrationUtils Going to create replica for collection=replacenodetest_coll shard=shard2 on node=127.0.0.1:37265_solr 2> 422955 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:37265_solr for creating new replica of shard shard2 for collection replacenodetest_coll 2> 422963 INFO (qtp534543879-4348) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1682] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=001&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=13 2> 422968 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 422972 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard2_replica_n25", 2> "node_name":"127.0.0.1:37265_solr", 2> "base_url":"http://127.0.0.1:37265/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 422988 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 925] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 422988 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 925] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 422988 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 925] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 422988 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 925] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 422988 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 925] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 423000 INFO (qtp259113879-4353) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={async=001308592259456507&qt=/admin/cores&coreNodeName=core_node26&collection.configName=conf1&name=replacenodetest_coll_shard2_replica_n25&action=CREATE&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3 2> 423001 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.h.a.CoreAdminOperation core create command async=001308592259456507&qt=/admin/cores&coreNodeName=core_node26&collection.configName=conf1&name=replacenodetest_coll_shard2_replica_n25&action=CREATE&collection=replacenodetest_coll&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 423009 INFO (qtp259113879-4355) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308592259456507 2> 423010 INFO (qtp259113879-4355) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308592259456507&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 423026 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 926] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423026 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 926] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423026 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 926] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423026 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 926] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423026 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 926] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 423136 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 423164 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 423191 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 423192 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard2_replica_n25' using configuration from configset conf1, trusted=true 2> 423196 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard2_replica_n25], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard2_replica_n25/data/] 2> 423460 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 423460 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 423478 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 423478 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) 
[n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 423498 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 423502 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 423506 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 423507 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735701166063616 2> 423537 INFO (searcherExecutor-2574-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 423547 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node13=0, core_node22=0, core_node26=0}, version=4} for registerTerm 2> 423547 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard2 2> 423567 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-3-processing-127.0.0.1:37265_solr replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.ZkController Core needs to recover:replacenodetest_coll_shard2_replica_n25 2> 423569 INFO (updateExecutor-2417-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 
127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.DefaultSolrCoreState Running recovery 2> 423572 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 423573 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 423591 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1683] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=3 2> 423592 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1683] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} status=0 QTime=4 2> 423594 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[replacenodetest_coll_shard2_replica_n25] 2> 423599 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 423599 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Publishing state of core [replacenodetest_coll_shard2_replica_n25] as recovering, leader is [http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/] and I am [http://127.0.0.1:37265/solr/replacenodetest_coll_shard2_replica_n25/] 2> 423608 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 933] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423608 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 933] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 423609 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 933] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423609 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 933] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423609 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 933] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423609 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 933] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 423612 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:33013/solr]; [WaitForState: action=PREPRECOVERY&core=replacenodetest_coll_shard2_replica_n3&nodeName=127.0.0.1:37265_solr&coreNodeName=core_node26&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 423617 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x:replacenodetest_coll_shard2_replica_n3 t:null-1684] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node26, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 423618 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1684] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=replacenodetest_coll, shard=shard2, thisCore=replacenodetest_coll_shard2_replica_n3, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:37265_solr, coreNodeName=core_node26, onlyIfActiveCheckResult=false, nodeProps: core_node26:{ 2> "core":"replacenodetest_coll_shard2_replica_n25", 2> "node_name":"127.0.0.1:37265_solr", 2> "base_url":"http://127.0.0.1:37265/solr", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 423619 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1684] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:37265_solr&onlyIfLeaderActive=true&core=replacenodetest_coll_shard2_replica_n3&coreNodeName=core_node26&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=3 2> 423981 INFO (qtp534543879-4352) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1685] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=001&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=12 2> 424017 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308592259456507 2> 424018 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308592259456507&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 424022 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.ReplicaMigrationUtils Going to create replica for collection=replacenodetest_coll shard=shard5 on node=127.0.0.1:37265_solr 2> 424028 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:37265_solr for creating new replica of shard shard5 for collection replacenodetest_coll 2> 424039 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 424042 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"replacenodetest_coll_shard5_replica_n27", 2> "node_name":"127.0.0.1:37265_solr", 2> "base_url":"http://127.0.0.1:37265/solr", 2> "collection":"replacenodetest_coll", 2> "shard":"shard5", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 424056 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 936] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424056 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 936] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 424056 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 936] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424056 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 936] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424056 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 936] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424056 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 936] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424076 INFO (qtp259113879-4333) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={async=001308593327729923&qt=/admin/cores&coreNodeName=core_node28&collection.configName=conf1&name=replacenodetest_coll_shard5_replica_n27&action=CREATE&collection=replacenodetest_coll&shard=shard5&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3 2> 424076 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c: s: r: x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.h.a.CoreAdminOperation core create command async=001308593327729923&qt=/admin/cores&coreNodeName=core_node28&collection.configName=conf1&name=replacenodetest_coll_shard5_replica_n27&action=CREATE&collection=replacenodetest_coll&shard=shard5&wt=javabin&version=2&replicaType=NRT 2> 424082 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308593327729923 2> 424083 INFO (qtp259113879-4351) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308593327729923&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 424091 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 937] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424091 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 937] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 424091 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 937] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424091 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 937] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424091 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 937] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424091 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 937] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424115 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 424121 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/] - recoveringAfterStartup=[true] 2> 424121 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.s.IndexSchema Schema name=cloud-dynamic 2> 424125 WARN (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 424125 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 
2> 424126 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 424126 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/]. 2> 424156 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.s.IndexSchema Loaded schema cloud-dynamic/1.6 with uniqueid field id 2> 424156 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.CoreContainer Creating SolrCore 'replacenodetest_coll_shard5_replica_n27' using configuration from configset conf1, trusted=true 2> 424163 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard5_replica_n27], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001/tempDir-003/node1/replacenodetest_coll_shard5_replica_n27/data/] 2> 424167 INFO (qtp1384576555-4346) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard3_replica_n4/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 4 2> 424169 INFO (qtp1180210852-4337) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 6 2> 424169 INFO (qtp819144083-4322) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update 
params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard4_replica_n8/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 6 2> 424169 INFO (qtp1384576555-4340) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:34953/solr/replacenodetest_coll_shard1_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 6 2> 424170 INFO (qtp583399768-4321) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 28 2> 424171 INFO (qtp583399768-4341) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 30 2> 424172 INFO (qtp819144083-4324) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 30 2> 424171 INFO (qtp534543879-4348) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 30 2> 424180 INFO (qtp1180210852-4339) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 424181 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1686] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 424182 INFO (qtp259113879-4343) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 
x:replacenodetest_coll_shard2_replica_n25 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 424184 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1686] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 54 2> 424195 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1687] o.a.s.c.S.Request webapp=/solr path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 424197 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.h.IndexFetcher Leader's generation: 1 2> 424197 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.h.IndexFetcher Leader's version: 0 2> 424198 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.h.IndexFetcher Follower's generation: 1 2> 424198 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.h.IndexFetcher Follower's version: 0 2> 424198 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy No replay needed. 2> 424206 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 
2> 424207 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 424216 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 938] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424216 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 424216 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 938] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424216 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 938] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424216 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 938] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424216 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 938] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424217 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 938] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 424218 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735701911601152 2> 424228 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=654.0 2> 424228 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard2_replica_n25 null-1681 001308592259456507 core_node26 create 127.0.0.1:37265_solr replacenodetest_coll shard2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1681] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=655.0 2> 424392 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 424393 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 424411 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 424411 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 424428 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 424431 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json 
using ZooKeeperStorageIO:path=/configs/conf1 2> 424435 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 424436 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735702140190720 2> 424462 INFO (searcherExecutor-2581-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 424466 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard5 to Terms{values={core_node24=0, core_node28=0, core_node16=0}, version=4} for registerTerm 2> 424467 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/replacenodetest_coll/leaders/shard5 2> 424482 INFO (parallelCoreAdminAPIBaseExecutor-2464-thread-4-processing-127.0.0.1:37265_solr replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 create) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.ZkController Core needs to recover:replacenodetest_coll_shard5_replica_n27 2> 424483 INFO (updateExecutor-2417-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.DefaultSolrCoreState Running recovery 2> 424484 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 424484 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 424494 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1688] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=3 2> 424494 INFO (qtp583399768-4335) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1688] o.a.s.c.S.Request webapp=/solr path=/admin/ping params={wt=javabin&version=2} status=0 QTime=4 2> 424496 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[replacenodetest_coll_shard5_replica_n27] 2> 424500 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null} 2> 424500 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Publishing state of core [replacenodetest_coll_shard5_replica_n27] as recovering, leader is [http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/] and I am [http://127.0.0.1:37265/solr/replacenodetest_coll_shard5_replica_n27/] 2> 424509 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 945] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424509 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 945] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424509 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 945] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 424509 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 945] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424509 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 945] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424509 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 945] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 424511 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:37983/solr]; [WaitForState: action=PREPRECOVERY&core=replacenodetest_coll_shard5_replica_n6&nodeName=127.0.0.1:37265_solr&coreNodeName=core_node28&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 424520 INFO (qtp583399768-4344) [n:127.0.0.1:37983_solr c: s: r: x:replacenodetest_coll_shard5_replica_n6 t:null-1689] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node28, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 424521 INFO (qtp583399768-4344) [n:127.0.0.1:37983_solr c: s: r: x: t:null-1689] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=replacenodetest_coll, shard=shard5, thisCore=replacenodetest_coll_shard5_replica_n6, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:37265_solr, coreNodeName=core_node28, onlyIfActiveCheckResult=false, nodeProps: core_node28:{ 2> "core":"replacenodetest_coll_shard5_replica_n27", 2> "node_name":"127.0.0.1:37265_solr", 2> "base_url":"http://127.0.0.1:37265/solr", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 424521 INFO (qtp583399768-4344) [n:127.0.0.1:37983_solr c: s: r: x: t:null-1689] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:37265_solr&onlyIfLeaderActive=true&core=replacenodetest_coll_shard5_replica_n6&coreNodeName=core_node28&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=2 2> 424999 INFO (qtp534543879-4338) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1690] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=001&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=12 2> 425023 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/] - recoveringAfterStartup=[true] 2> 425027 WARN (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 425028 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 425028 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 425029 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/]. 
2> 425074 INFO (qtp1384576555-4336) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:34953/solr/replacenodetest_coll_shard1_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 425074 INFO (qtp819144083-4332) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard4_replica_n8/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 425074 INFO (qtp1180210852-4317) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard2 r:core_node22 x:replacenodetest_coll_shard2_replica_n21 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 4 2> 425074 INFO (qtp1384576555-4330) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard3_replica_n4/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 4 2> 425075 INFO (qtp259113879-4353) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33013/solr/replacenodetest_coll_shard2_replica_n3/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 425076 INFO (qtp534543879-4328) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 27 2> 425076 INFO (qtp819144083-4334) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 27 2> 425077 INFO (qtp583399768-4321) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:null-1691] 
o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 27 2> 425077 INFO (qtp534543879-4352) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 27 2> 425086 INFO (qtp1180210852-4319) [n:127.0.0.1:36563_solr c:replacenodetest_coll s:shard5 r:core_node24 x:replacenodetest_coll_shard5_replica_n23 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 425087 INFO (qtp259113879-4355) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1691] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 425087 INFO (qtp259113879-4355) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:37983/solr/replacenodetest_coll_shard5_replica_n6/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 425089 INFO (qtp583399768-4331) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1691] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 56 2> 425089 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308593327729923 2> 425089 INFO (qtp259113879-4323) [n:127.0.0.1:37265_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308593327729923&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=1 2> 425100 INFO (qtp583399768-4341) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:null-1692] o.a.s.c.S.Request webapp=/solr path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 425102 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.h.IndexFetcher Leader's generation: 1 2> 425102 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 
001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.h.IndexFetcher Leader's version: 0 2> 425102 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.h.IndexFetcher Follower's generation: 1 2> 425102 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.h.IndexFetcher Follower's version: 0 2> 425103 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy No replay needed. 2> 425110 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 425111 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 425120 INFO (zkCallback-2440-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 946] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425120 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 946] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425120 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 946] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425120 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 
2> 425120 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 946] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425120 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 946] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425120 INFO (zkCallback-2438-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 946] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425121 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735702857416704 2> 425126 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.ReplicaMigrationUtils Deleting replica type=NRT for collection=replacenodetest_coll shard=shard2 on node=127.0.0.1:36563_solr 2> 425131 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=646.0 2> 425131 INFO (recoveryExecutor-2433-thread-1-processing-replacenodetest_coll_shard5_replica_n27 null-1681 001308593327729923 core_node28 create 127.0.0.1:37265_solr replacenodetest_coll shard5) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:null-1681] o.a.s.c.RecoveryStrategy Finished recovery process. 
recoveringAfterStartup=true msTimeTaken=647.0 2> 425134 INFO (DistributedCollectionApiCommandExecutor-2458-thread-3-processing-127.0.0.1:33013_solr null-1681) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1681] o.a.s.c.a.c.ReplicaMigrationUtils Deleting replica type=NRT for collection=replacenodetest_coll shard=shard5 on node=127.0.0.1:36563_solr 2> 425139 INFO (qtp1180210852-4327) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=replacenodetest_coll_shard2_replica_n21&async=001308594396621518&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2&deleteIndex=true} status=0 QTime=4 2> 425141 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-3-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1681 001308594396621518 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1681] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n21 tag=null 2> 425146 INFO (qtp1180210852-4329) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=replacenodetest_coll_shard5_replica_n23&async=001308594404658538&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2&deleteIndex=true} status=0 QTime=4 2> 425148 INFO (qtp1180210852-4337) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308594396621518 2> 425149 INFO (qtp1180210852-4337) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308594396621518&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 425152 INFO (qtp1180210852-4339) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308594404658538 2> 425153 INFO (qtp1180210852-4339) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308594404658538&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=1 2> 425356 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-4-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1681 001308594404658538 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1681] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard5.replica_n23 tag=null 2> 425356 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-3-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1681 001308594396621518 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1681] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@17a8e38 replacenodetest_coll_shard2_replica_n21 2> 425559 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-3-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1681 001308594396621518 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1681] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n21 tag=SolrCore@17a8e38 2> 425559 INFO 
(parallelCoreAdminAPIBaseExecutor-2482-thread-4-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1681 001308594404658538 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1681] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5ce46291 replacenodetest_coll_shard5_replica_n23 2> 425560 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-3-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1681 001308594396621518 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1681] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard2.leader tag=SolrCore@17a8e38 2> 425560 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-4-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1681 001308594404658538 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1681] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard5.replica_n23 tag=SolrCore@5ce46291 2> 425560 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-4-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1681 001308594404658538 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1681] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard5.leader tag=SolrCore@5ce46291 2> 425561 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-3-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1681 001308594396621518 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard2_replica_n21 t:null-1681] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 425561 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-4-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1681 001308594404658538 unload) [n:127.0.0.1:36563_solr c: s: r: x:replacenodetest_coll_shard5_replica_n23 t:null-1681] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 425591 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-4-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1681 001308594404658538 unload) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard5 to Terms{values={core_node28=0, core_node16=0}, version=5} for removeTerm 2> 425595 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-3-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1681 001308594396621518 unload) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.c.ZkShardTerms Successful update of terms at /collections/replacenodetest_coll/terms/shard2 to Terms{values={core_node13=0, core_node26=0}, version=5} for removeTerm 2> 425596 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-4-processing-127.0.0.1:36563_solr replacenodetest_coll_shard5_replica_n23 null-1681 001308594404658538 unload) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 
2> 425600 INFO (parallelCoreAdminAPIBaseExecutor-2482-thread-3-processing-127.0.0.1:36563_solr replacenodetest_coll_shard2_replica_n21 null-1681 001308594396621518 unload) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 425604 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 951] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425604 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 951] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425604 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 951] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425604 INFO (zkCallback-2438-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 951] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425604 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 951] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425604 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 951] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425614 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 953] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425615 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 953] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425615 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 953] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 425615 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 953] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [6]) 2> 425615 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 953] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [6]) 2> 426019 INFO (qtp534543879-4348) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1693] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=001&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=13 2> 426156 INFO (qtp1180210852-4317) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308594396621518 2> 426156 INFO (qtp1180210852-4317) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308594396621518&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 426159 INFO (qtp1180210852-4319) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.h.a.CoreAdminOperation Checking request status for : 001308594404658538 2> 426160 INFO (qtp1180210852-4319) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=001308594404658538&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2 2> 427032 INFO (qtp534543879-4354) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1694] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=001&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=6 2> 427044 INFO (qtp534543879-4356) [n:127.0.0.1:33013_solr c: s: r: x: t:null-1695] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=001&action=DELETESTATUS&wt=javabin&version=2} status=0 QTime=6 2> 427067 INFO (qtp1180210852-4329) [n:127.0.0.1:36563_solr c: s: r: x: t:null-1696] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={action=STATUS&indexInfo=true&wt=javabin&version=2} status=0 QTime=1 2> 427089 INFO (TEST-ReplaceNodeTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Ending test 2> 427209 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@613c8803{STOPPING}[10.0.19,sto=0] 2> 427210 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@7101129e{STOPPING}[10.0.19,sto=0] 2> 427211 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@386126fc{STOPPING}[10.0.19,sto=0] 2> 427212 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@4e9c6e2a{STOPPING}[10.0.19,sto=0] 2> 427212 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@5c98a479{STOPPING}[10.0.19,sto=0] 2> 427213 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@2c2b064e{STOPPING}[10.0.19,sto=0] 2> 427217 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@3e14ddf9{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 427215 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@1b63c09{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 427216 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@489f601f{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 427217 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] 
o.e.j.s.AbstractConnector Stopped ServerConnector@111b938c{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 427219 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@2c020370{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 427221 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@5c41120b{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 427223 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@15c86ec1{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 427223 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@3daedd6b{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 427223 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@7d3a376b{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 427223 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@6618269a{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 427223 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@57900a9e{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 427223 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@58bd38ad{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 427232 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=318823647 2> 427239 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:36563_solr 2> 427245 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 427246 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:36563_solr as DOWN 2> 427249 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (6) -> (5) 2> 427249 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (6) -> (5) 2> 427250 INFO (zkCallback-2438-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (6) -> (5) 2> 427250 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (6) -> (5) 2> 427251 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (6) -> (5) 2> 427252 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(6) -> (5) 2> 427271 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1922140376 2> 427271 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1671501900 2> 427271 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1591363771 2> 427271 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=72830 2> 427272 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:37983_solr 2> 427272 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:37265_solr 2> 427272 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:34953_solr 2> 427272 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:34105_solr 2> 427276 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 427277 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:37983_solr as DOWN 2> 427277 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 427277 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:34953_solr as DOWN 2> 427278 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 427278 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 427279 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:34105_solr as DOWN 2> 427279 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:37265_solr as DOWN 2> 427279 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 427279 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 427279 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 427280 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 427280 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 427281 INFO (zkCallback-2438-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (1) 2> 427293 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 427294 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 967] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 427294 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 967] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427294 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 967] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427294 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 967] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427295 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 967] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427307 INFO (coreCloseExecutor-2595-thread-1) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@55a98856 replacenodetest_coll_shard1_replica_n2 2> 427308 INFO (coreCloseExecutor-2595-thread-1) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard1.replica_n2 tag=SolrCore@55a98856 2> 427308 INFO (coreCloseExecutor-2595-thread-2) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7ef3822 replacenodetest_coll_shard4_replica_n7 2> 427312 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 971] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427313 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 971] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427313 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 971] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427313 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 971] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 427313 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 971] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427325 INFO (coreCloseExecutor-2597-thread-1) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2977cc34 replacenodetest_coll_shard2_replica_n25 2> 427326 INFO (coreCloseExecutor-2597-thread-1) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n25 tag=SolrCore@2977cc34 2> 427326 INFO (coreCloseExecutor-2597-thread-2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@76269b36 replacenodetest_coll_shard5_replica_n27 2> 427331 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 974] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427331 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 974] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427331 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 974] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427331 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 974] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427331 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 974] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 427339 INFO (coreCloseExecutor-2599-thread-1) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@177ac4d replacenodetest_coll_shard3_replica_n10 2> 427340 INFO (coreCloseExecutor-2599-thread-2) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@257effad replacenodetest_coll_shard1_replica_n1 2> 427340 INFO (coreCloseExecutor-2599-thread-1) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard3.replica_n10 tag=SolrCore@177ac4d 2> 427350 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 976] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427350 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 976] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427350 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 976] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427350 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 976] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [1]) 2> 427350 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 976] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [1]) 2> 427358 INFO (coreCloseExecutor-2601-thread-1) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@737e0b11 replacenodetest_coll_shard5_replica_n6 2> 427359 INFO (coreCloseExecutor-2601-thread-2) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@e0b609b replacenodetest_coll_shard3_replica_n4 2> 427359 INFO (coreCloseExecutor-2601-thread-1) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard5.replica_n6 tag=SolrCore@737e0b11 2> 427602 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1654878283 2> 427603 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:33013_solr 2> 427608 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 427609 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33013_solr as DOWN 2> 427611 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 427611 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 427612 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 427613 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 427613 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 427614 INFO (zkCallback-2438-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 427622 INFO (zkCallback-2440-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 978] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [0]) 2> 427622 INFO (zkCallback-2434-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 978] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [0]) 2> 427622 INFO (zkCallback-2436-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 978] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [0]) 2> 427622 INFO (zkCallback-2430-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 978] for collection [replacenodetest_coll] has occurred - updating... 
(live nodes size: [0]) 2> 427622 INFO (zkCallback-2425-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/replacenodetest_coll/state.json zxid: 978] for collection [replacenodetest_coll] has occurred - updating... (live nodes size: [0]) 2> 427630 INFO (coreCloseExecutor-2604-thread-1) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@de7a27 replacenodetest_coll_shard2_replica_n3 2> 427631 INFO (coreCloseExecutor-2604-thread-1) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard2.replica_n3 tag=SolrCore@de7a27 2> 427631 INFO (coreCloseExecutor-2604-thread-2) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5729dfa0 replacenodetest_coll_shard4_replica_n8 2> 427722 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 427968 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 427969 INFO (jetty-closer-2587-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 428070 INFO (coreCloseExecutor-2595-thread-1) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard1.leader tag=SolrCore@55a98856 2> 428071 INFO (coreCloseExecutor-2595-thread-2) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard4.replica_n7 tag=SolrCore@7ef3822 2> 428071 INFO (coreCloseExecutor-2597-thread-1) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard2.leader tag=SolrCore@2977cc34 2> 428071 INFO (coreCloseExecutor-2597-thread-2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard5.replica_n27 tag=SolrCore@76269b36 2> 428078 INFO (coreCloseExecutor-2595-thread-1) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard1 r:core_node12 x:replacenodetest_coll_shard1_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 428078 INFO (coreCloseExecutor-2597-thread-1) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard2 r:core_node26 x:replacenodetest_coll_shard2_replica_n25 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 428110 INFO (coreCloseExecutor-2599-thread-1) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard3.leader tag=SolrCore@177ac4d 2> 428111 INFO (coreCloseExecutor-2599-thread-2) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard1.replica_n1 tag=SolrCore@257effad 2> 428119 INFO (coreCloseExecutor-2599-thread-1) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard3 r:core_node20 x:replacenodetest_coll_shard3_replica_n10 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 428162 INFO (coreCloseExecutor-2601-thread-1) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard5.leader tag=SolrCore@737e0b11 2> 428163 INFO (coreCloseExecutor-2601-thread-2) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard3.replica_n4 tag=SolrCore@e0b609b 2> 428170 INFO (coreCloseExecutor-2601-thread-1) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard5 r:core_node16 x:replacenodetest_coll_shard5_replica_n6 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 428412 INFO (coreCloseExecutor-2604-thread-1) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard2.leader tag=SolrCore@de7a27 2> 428412 INFO (coreCloseExecutor-2604-thread-2) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.replacenodetest_coll.shard4.replica_n8 tag=SolrCore@5729dfa0 2> 428418 INFO (coreCloseExecutor-2604-thread-1) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard2 r:core_node13 x:replacenodetest_coll_shard2_replica_n3 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 428704 INFO (coreCloseExecutor-2599-thread-2) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard1.leader tag=SolrCore@257effad 2> 428711 INFO (coreCloseExecutor-2599-thread-2) [n:127.0.0.1:34105_solr c:replacenodetest_coll s:shard1 r:core_node11 x:replacenodetest_coll_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 428718 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 428724 INFO (coreCloseExecutor-2597-thread-2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard5.leader tag=SolrCore@76269b36 2> 428726 INFO (coreCloseExecutor-2595-thread-2) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard4.leader tag=SolrCore@7ef3822 2> 428730 INFO (coreCloseExecutor-2597-thread-2) [n:127.0.0.1:37265_solr c:replacenodetest_coll s:shard5 r:core_node28 x:replacenodetest_coll_shard5_replica_n27 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 428732 INFO (coreCloseExecutor-2595-thread-2) [n:127.0.0.1:34953_solr c:replacenodetest_coll s:shard4 r:core_node17 x:replacenodetest_coll_shard4_replica_n7 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 428736 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 428738 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 428755 INFO (coreCloseExecutor-2601-thread-2) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard3.leader tag=SolrCore@e0b609b 2> 428763 INFO (coreCloseExecutor-2601-thread-2) [n:127.0.0.1:37983_solr c:replacenodetest_coll s:shard3 r:core_node14 x:replacenodetest_coll_shard3_replica_n4 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 428770 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 428950 INFO (coreCloseExecutor-2604-thread-2) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.replacenodetest_coll.shard4.leader tag=SolrCore@5729dfa0 2> 428957 INFO (coreCloseExecutor-2604-thread-2) [n:127.0.0.1:33013_solr c:replacenodetest_coll s:shard4 r:core_node18 x:replacenodetest_coll_shard4_replica_n8 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 428963 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 429028 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 429082 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 429085 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 429151 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 429187 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 429188 INFO (jetty-closer-2587-thread-6) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 429244 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 429245 INFO (jetty-closer-2587-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 429246 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 429247 INFO (jetty-closer-2587-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 429283 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 429284 INFO (jetty-closer-2587-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 429320 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 429397 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 429398 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 429405 INFO (closeThreadPool-2610-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816720195597-127.0.0.1:33013_solr-n_0000000000) closing 2> 429406 INFO (OverseerStateUpdate-72077816720195597-127.0.0.1:33013_solr-n_0000000000) [n:127.0.0.1:33013_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:33013_solr 2> 429411 INFO (closeThreadPool-2610-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816720195597-127.0.0.1:33013_solr-n_0000000000) closing 2> 429524 INFO (jetty-closer-2587-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077816720195597-127.0.0.1:33013_solr-n_0000000000) closing 2> 429527 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Shutting down ZkTestServer. 
2> 429743 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Watch limit violations: 2> Maximum concurrent create/delete watches above limit: 2> 2> 19 /solr/collections/replacenodetest_coll/terms/shard2 2> 18 /solr/collections/replacenodetest_coll/terms/shard5 2> 7 /solr/aliases.json 2> 7 /solr/configs/conf1 2> 7 /solr/clusterprops.json 2> 6 /solr/packages.json 2> 6 /solr/security.json 2> 6 /solr/collections/replacenodetest_coll/collectionprops.json 2> 5 /solr/collections/replacenodetest_coll/terms/shard3 2> 5 /solr/collections/replacenodetest_coll/terms/shard4 2> 5 /solr/collections/replacenodetest_coll/terms/shard1 2> 2> Maximum concurrent data watches above limit: 2> 2> 180 /solr/collections/replacenodetest_coll/state.json 2> 3 /solr/collections/replacenodetest_coll/leader_elect/shard2/election/72077816720195597-core_node13-n_0000000000 2> 3 /solr/collections/replacenodetest_coll/leader_elect/shard5/election/72077816720195594-core_node16-n_0000000000 2> 2 /solr/overseer_elect/election/72077816720195595-127.0.0.1:34953_solr-n_0000000001 2> 2 /solr/overseer_elect/election/72077816720195597-127.0.0.1:33013_solr-n_0000000000 2> 2 /solr/overseer_elect/election/72077816720195598-127.0.0.1:37265_solr-n_0000000002 2> 2> Maximum concurrent children watches above limit: 2> 2> 37 /solr/live_nodes 2> 14 /solr/collections 2> 7 /solr/collections/replacenodetest_coll/state.json 2> 2> 429799 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-21 after mutting 0 log messages 2> 429799 INFO (SUITE-ReplaceNodeTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-22 for ERROR logs matching regex: ignore_exception 2> NOTE: leaving temporary files on disk at: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplaceNodeTest_246C98A4C257C021-001 2> NOTE: test params are: codec=Asserting(Lucene95), sim=Asserting(RandomSimilarity(queryNorm=true): {}), locale=en-VG, timezone=Indian/Comoro 2> NOTE: Linux 6.1.61-3-sophgo-08357-g369f7207fe69 riscv64/Oracle Corporation 11.0.22 (64-bit)/cpus=1,threads=2,free=170656616,total=435159040 2> NOTE: All tests run in this JVM: [MinimalSchemaTest, TestGroupingSearch, TestCharFilters, SimplePostToolTest, BalanceReplicasTest, ClusterStateUpdateTest, DeleteReplicaTest, FullSolrCloudDistribCmdsTest, MigrateRouteKeyTest, OverseerRolesTest, ReplaceNodeTest] org.apache.solr.cloud.ReplicationFactorTest > test FAILED org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:38185/_c: Underlying core creation failed while creating collection: repfacttest_c8n_2x2 at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:AC38A77E6CABADD9]:0) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) at app//org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192) at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2100) at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2138) at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2026) at 
app//org.apache.solr.cloud.ReplicationFactorTest.createCollectionWithRetry(ReplicationFactorTest.java:514) at app//org.apache.solr.cloud.ReplicationFactorTest.testRf2NotUsingDirectUpdates(ReplicationFactorTest.java:111) at app//org.apache.solr.cloud.ReplicationFactorTest.test(ReplicationFactorTest.java:95) at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566) at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at app//com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at 
app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base@11.0.22/java.lang.Thread.run(Thread.java:829) org.apache.solr.cloud.ReplicationFactorTest > test suite's output saved to /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.ReplicationFactorTest.txt, copied below: 2> 319260 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/build/solr/src/solr-9.4.1/solr/server/solr/configsets/_default/conf' 2> 319261 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom 2> 319266 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-21 after mutting 0 log messages 2> 319266 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-22 for ERROR logs matching regex: ignore_exception 2> 319271 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Created dataDir: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/data-dir-10-001 2> 319273 WARN (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=1 numCloses=1 2> 319274 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true 2> 319279 INFO 
(SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.SolrTestCaseJ4$SuppressSSL(bugUrl="https://issues.apache.org/jira/browse/SOLR-5776") 2> 319280 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /_c/ 2> 319361 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-22 after mutting 0 log messages 2> 319361 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-23 for ERROR logs matching regex: ignore_exception 2> 319366 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER 2> 319368 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer client port: 0.0.0.0/0.0.0.0:0 2> 319368 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Starting server 2> 319381 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.z.s.ServerCnxnFactory maxCnxns is not configured, using default value 0. 2> 319467 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer start zk server on port: 38605 2> 319474 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 319482 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 319496 INFO (zkConnectionManagerCallback-1107-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 319496 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 319497 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 319505 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 319513 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 319521 INFO (zkConnectionManagerCallback-1109-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 319521 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 319522 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 319530 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml 2> 319540 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml 2> 319554 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml 2> 319566 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt 2> 319576 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt 2> 319588 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml 2> 319600 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml 2> 319611 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json 2> 319624 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt 2> 319635 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt 2> 319649 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt 2> 319660 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise 2> 322936 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 322937 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 322937 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 322952 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 322957 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@48c1479c{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 322961 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@4c084e5d{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:43165} 2> 322962 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@e3d4963{STARTING}[10.0.19,sto=0] @323197ms 2> 322963 ERROR (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 322963 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 322964 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 322964 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 322964 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 322964 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:35.062698Z 2> 322965 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001 (source: servlet config: solr.solr.home) 2> 322968 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 322977 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 322987 INFO (zkConnectionManagerCallback-1111-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 322988 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 322988 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 323096 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 323098 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/solr.xml 2> 325169 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.t.SimplePropagator Always-on trace id generation enabled. 2> 325211 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38605/solr 2> 325212 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 325220 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 325230 INFO (zkConnectionManagerCallback-1121-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 325230 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 325231 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 325340 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 325353 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 325364 INFO (zkConnectionManagerCallback-1123-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 325364 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 325533 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 325558 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 
2> 325580 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:40643__c 2> 325587 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814988931076-127.0.0.1:40643__c-n_0000000000) starting 2> 325638 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:40643__c 2> 325638 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:40643__c 2> 325649 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 325666 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 326511 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores 2> 327073 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 327321 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/_c, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/tempDir-001/control/data, hostPort=40643, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores, replicaType=NRT} 2> 327341 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 327352 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 327365 INFO (zkConnectionManagerCallback-1136-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 327366 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 327366 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 327376 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(0) -> (1) 2> 327385 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:38605/solr ready 2> 327391 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=43436,localport=40643], receiveBufferSize: 65536 2> 327427 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=36846], receiveBufferSize=65536 2> 327465 INFO (OverseerThreadFactory-1128-thread-1) [n: c:control_collection s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection 2> 327632 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"control_collection_shard1_replica_n1", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "collection":"control_collection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 327752 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 153] for collection [control_collection] has occurred - updating... (live nodes size: [1]) 2> 327763 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=43448,localport=40643], receiveBufferSize: 65536 2> 327769 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=36860], receiveBufferSize=65536 2> 327788 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c: s: r: x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=control_collection&version=2&replicaType=NRT&coreNodeName=core_node2&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&shard=shard1&wt=javabin 2> 327905 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 327958 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.s.IndexSchema Schema name=test 2> 328225 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 328394 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 328397 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/control_collection_shard1_replica_n1], 
dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/control_collection_shard1_replica_n1/data/] 2> 328418 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 328449 WARN (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 328981 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 329290 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 329290 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 329307 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 329307 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 329316 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 329330 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 329332 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 329336 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 329337 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c 
c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735663830466560 2> 329376 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0} for registerTerm 2> 329377 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1 2> 329377 INFO (searcherExecutor-1138-thread-1-processing-control_collection_shard1_replica_n1 null-1007 core_node2 127.0.0.1:40643__c control_collection shard1) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 329405 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 329405 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 329405 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40643/_c/control_collection_shard1_replica_n1/ 2> 329408 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 329411 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.SyncStrategy http://127.0.0.1:40643/_c/control_collection_shard1_replica_n1/ has no replicas 2> 329411 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72077814988931076-core_node2-n_0000000000 2> 329431 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40643/_c/control_collection_shard1_replica_n1/ shard1 2> 329542 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 171] for collection [control_collection] has occurred - updating... 
(live nodes size: [1]) 2> 329550 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1007] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 329576 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c: s: r: x: t:null-1007] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1790 2> 329592 INFO (qtp1639791653-1925) [n:127.0.0.1:40643__c c:control_collection s: r: x: t:null-1006] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 329670 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 177] for collection [control_collection] has occurred - updating... (live nodes size: [1]) 2> 329670 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 177] for collection [control_collection] has occurred - updating... (live nodes size: [1]) 2> 329679 INFO (qtp1639791653-1925) [n:127.0.0.1:40643__c c:control_collection s: r: x: t:null-1006] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:40643__c&wt=javabin&version=2} status=0 QTime=2240 2> 329681 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection 2> 329814 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 329823 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 329834 INFO (zkConnectionManagerCallback-1147-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 329834 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 329835 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 329844 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(0) -> (1) 2> 329853 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:38605/solr ready 2> 329854 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false 2> 329858 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=43464,localport=40643], receiveBufferSize: 65536 2> 329862 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=36868], receiveBufferSize=65536 2> 329900 INFO (OverseerThreadFactory-1128-thread-2) [n: c:collection1 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1 2> 329904 INFO (OverseerCollectionConfigSetProcessor-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 330131 WARN (OverseerThreadFactory-1128-thread-2) [n: c:collection1 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores. 2> 330143 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:collection1 s: r: x: t:null-1008] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 330147 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:collection1 s: r: x: t:null-1008] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=3&createNodeSet=&wt=javabin&version=2} status=0 QTime=276 2> 330153 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active slice count: 3 expected: 3 2> 330153 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0 2> 330153 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=3 2> 331919 INFO (OverseerCollectionConfigSetProcessor-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 333235 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001 of type NRT for shard2 2> 333251 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 333252 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 333253 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 333264 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 333269 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@24b0c50c{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 333273 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@5322d1b4{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:35371} 2> 333274 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@6296cee8{STARTING}[10.0.19,sto=0] @333509ms 2> 333275 ERROR (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 333276 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 333276 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 333277 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 333277 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 333277 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:45.375674Z 2> 333278 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001 (source: servlet config: solr.solr.home) 2> 333281 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 333295 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 333311 INFO (zkConnectionManagerCallback-1150-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 333311 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 333312 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 333420 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 333422 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/solr.xml 2> 335065 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38605/solr 2> 335067 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 335081 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 335091 INFO (zkConnectionManagerCallback-1160-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 335091 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 335092 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 335201 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 335210 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 335218 INFO (zkConnectionManagerCallback-1162-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 335219 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 335258 WARN (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 335264 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 335281 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 335297 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33207__c as DOWN 2> 335305 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:33207__c 2> 335313 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) 2> 335315 INFO (zkCallback-1161-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(1) -> (2) 2> 335316 INFO (zkCallback-1146-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) 2> 335331 WARN (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 335945 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores 2> 336505 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001 of type NRT for shard3 2> 336522 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 336523 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 336524 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 336536 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 336541 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1fb42ac7{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 336545 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@5284241c{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:38185} 2> 336546 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@550ca21e{STARTING}[10.0.19,sto=0] @336781ms 2> 336547 ERROR (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
2> 336550 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 336550 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 336551 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 336551 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 336551 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:48.649436Z 2> 336552 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001 (source: servlet config: solr.solr.home) 2> 336554 WARN (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 336565 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 336577 INFO (zkConnectionManagerCallback-1169-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 336578 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 336578 WARN (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 336651 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 336687 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 336689 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/solr.xml 2> 336859 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:33207__c c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/_c, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/tempDir-001/jetty1, hostPort=33207, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores} 2> 336864 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:33207__c 2> 337733 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38605/solr 2> 337734 WARN (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 337747 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 337757 INFO (zkConnectionManagerCallback-1179-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 337758 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 337758 WARN (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 337867 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 337878 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 337889 INFO (zkConnectionManagerCallback-1181-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 337889 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 337941 WARN (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 337948 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2) 2> 337965 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 337981 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:32875__c as DOWN 2> 337989 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:32875__c 2> 337997 INFO (zkCallback-1161-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 337998 INFO (zkCallback-1146-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 337998 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 337999 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 338015 WARN (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 339141 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores 2> 339712 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 339883 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001 of type NRT for shard1 2> 339904 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 339905 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 339906 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 339931 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 339939 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@5d5e94e6{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 339943 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@75081bba{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:40767} 2> 339944 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@9e82187{STARTING}[10.0.19,sto=0] @340179ms 2> 339946 ERROR (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 339955 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 339955 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 339956 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 339956 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 339957 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:07:52.055087Z 2> 339958 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001 (source: servlet config: solr.solr.home) 2> 339960 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 339972 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 339976 INFO (closeThreadPool-1148-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/_c, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/tempDir-001/jetty2, hostPort=32875, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores} 2> 339985 INFO (closeThreadPool-1148-thread-2) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:32875__c 2> 339985 INFO (zkConnectionManagerCallback-1188-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 339986 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 339988 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 340097 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 340100 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001/solr.xml 2> 341856 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38605/solr 2> 341858 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 341869 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 341879 INFO (zkConnectionManagerCallback-1198-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 341880 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 341880 WARN (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 341991 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 
2> 342000 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 342010 INFO (zkConnectionManagerCallback-1200-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 342010 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 342061 WARN (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 342068 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3) 2> 342088 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 342106 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:46873__c as DOWN 2> 342115 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:46873__c 2> 342125 INFO (zkCallback-1161-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 342125 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 342125 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 342127 INFO (zkCallback-1146-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 342128 INFO (zkCallback-1199-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 342149 WARN (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 342922 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001/cores 2> 343831 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 344059 INFO (closeThreadPool-1148-thread-1) [n:127.0.0.1:46873__c c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/_c, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/tempDir-001/jetty3, hostPort=46873, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001/cores, replicaType=NRT} 2> 344064 INFO (closeThreadPool-1148-thread-1) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:46873__c 2> 344090 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=39124,localport=32875], receiveBufferSize: 65536 2> 344107 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=53604], receiveBufferSize=65536 2> 344113 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=39128,localport=32875], receiveBufferSize: 65536 2> 344125 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=53610], receiveBufferSize=65536 2> 344136 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=39142,localport=32875], receiveBufferSize: 65536 2> 344152 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=53624], receiveBufferSize=65536 2> 344199 INFO (OverseerThreadFactory-1128-thread-3) [n: c:collection1 s:shard3 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:32875__c for creating new replica of shard shard3 for collection collection1 2> 344216 INFO (OverseerThreadFactory-1128-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:33207__c for creating new replica of shard shard2 for collection collection1 2> 344221 INFO (OverseerThreadFactory-1128-thread-3) [n: c:collection1 s:shard3 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 344227 INFO (OverseerThreadFactory-1128-thread-5) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:46873__c for creating new replica of shard shard1 for collection collection1 2> 344237 INFO (OverseerThreadFactory-1128-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 
2> 344240 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard3_replica_n1", 2> "node_name":"127.0.0.1:32875__c", 2> "base_url":"http://127.0.0.1:32875/_c", 2> "collection":"collection1", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 344245 INFO (OverseerThreadFactory-1128-thread-5) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 344262 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 234] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 344274 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard2_replica_n2", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "collection":"collection1", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 344276 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=39148,localport=32875], receiveBufferSize: 65536 2> 344285 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=53632], receiveBufferSize=65536 2> 344300 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard1_replica_n3", 2> "node_name":"127.0.0.1:46873__c", 2> "base_url":"http://127.0.0.1:46873/_c", 2> "collection":"collection1", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 344310 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c: s: r: x:collection1_shard3_replica_n1 t:null-1012] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection1_shard3_replica_n1&action=CREATE&collection=collection1&shard=shard3&wt=javabin&version=2&replicaType=NRT 2> 344401 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 344449 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 239] for collection [collection1] has occurred - updating... 
(live nodes size: [4]) 2> 344457 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=39806,localport=33207], receiveBufferSize: 65536 2> 344460 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.s.IndexSchema Schema name=test 2> 344461 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=51188,localport=46873], receiveBufferSize: 65536 2> 344476 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=40767,localport=35320], receiveBufferSize=65536 2> 344484 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=60916], receiveBufferSize=65536 2> 344513 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c: s: r: x:collection1_shard1_replica_n3 t:null-1013] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=collection1_shard1_replica_n3&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 344514 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c: s: r: x:collection1_shard2_replica_n2 t:null-1014] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node5&collection.configName=conf1&name=collection1_shard2_replica_n2&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 344605 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 344608 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 344659 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.s.IndexSchema Schema name=test 2> 344663 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.s.IndexSchema Schema name=test 2> 344724 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 344855 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 344856 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 344902 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard3_replica_n1' using configuration from configset conf1, trusted=true 2> 344906 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.SolrCore Opening new SolrCore at 
[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores/collection1_shard3_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores/collection1_shard3_replica_n1/data/] 2> 344936 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 344955 WARN (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 345034 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n3' using configuration from configset conf1, trusted=true 2> 345038 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001/cores/collection1_shard1_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001/cores/collection1_shard1_replica_n3/data/] 2> 345040 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_n2' using configuration from configset conf1, trusted=true 2> 345043 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/collection1_shard2_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/collection1_shard2_replica_n2/data/] 2> 345060 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 345066 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: 
maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 345078 WARN (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 345083 WARN (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 345576 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 345635 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 345636 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 345636 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 345655 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 345656 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 345670 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 345687 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 345689 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 345692 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 345699 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 
x:collection1_shard3_replica_n1 t:null-1012] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 345699 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 345700 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 345701 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735680988315648 2> 345730 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 345731 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 345743 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 345754 INFO (searcherExecutor-1211-thread-1-processing-collection1_shard3_replica_n1 null-1012 core_node4 127.0.0.1:32875__c collection1 shard3) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 345759 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard3 to Terms{values={core_node4=0}, version=0} for registerTerm 2> 345760 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 345761 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard3 2> 345761 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 345762 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 345764 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/conf1 2> 345770 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 345771 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735681062764544 2> 345783 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 345783 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 345798 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 345802 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 345802 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 345802 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:32875/_c/collection1_shard3_replica_n1/ 2> 345806 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 345809 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.SyncStrategy http://127.0.0.1:32875/_c/collection1_shard3_replica_n1/ has no replicas 2> 345809 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard3/leader after winning as /collections/collection1/leader_elect/shard3/election/72077814988931084-core_node4-n_0000000000 2> 345816 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 345817 INFO (searcherExecutor-1213-thread-1-processing-collection1_shard1_replica_n3 null-1013 core_node6 127.0.0.1:46873__c collection1 shard1) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 345819 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.r.ManagedResourceStorage Loaded 
null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 345820 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node6=0}, version=0} for registerTerm 2> 345822 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1 2> 345824 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 345825 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735681119387648 2> 345832 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:32875/_c/collection1_shard3_replica_n1/ shard3 2> 345857 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 345857 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 345857 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:46873/_c/collection1_shard1_replica_n3/ 2> 345860 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 345862 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.SyncStrategy http://127.0.0.1:46873/_c/collection1_shard1_replica_n3/ has no replicas 2> 345863 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72077814988931087-core_node6-n_0000000000 2> 345865 INFO (searcherExecutor-1215-thread-1-processing-collection1_shard2_replica_n2 null-1014 core_node5 127.0.0.1:33207__c collection1 shard2) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 345869 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node5=0}, version=0} for registerTerm 2> 345870 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] 
o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard2 2> 345886 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:46873/_c/collection1_shard1_replica_n3/ shard1 2> 345903 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 345903 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 345904 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:33207/_c/collection1_shard2_replica_n2/ 2> 345907 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 345910 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.SyncStrategy http://127.0.0.1:33207/_c/collection1_shard2_replica_n2/ has no replicas 2> 345910 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard2/leader after winning as /collections/collection1/leader_elect/shard2/election/72077814988931081-core_node5-n_0000000000 2> 345932 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:33207/_c/collection1_shard2_replica_n2/ shard2 2> 346046 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 284] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346047 INFO (zkCallback-1199-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 284] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346046 INFO (zkCallback-1161-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 284] for collection [collection1] has occurred - updating... 
(live nodes size: [4]) 2> 346055 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1012] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 346056 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1014] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 346058 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1013] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 346094 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c: s: r: x: t:null-1012] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection1_shard3_replica_n1&action=CREATE&collection=collection1&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1787 2> 346094 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c: s: r: x: t:null-1014] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node5&collection.configName=conf1&name=collection1_shard2_replica_n2&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1582 2> 346096 INFO (qtp1274442124-2042) [n:127.0.0.1:46873__c c: s: r: x: t:null-1013] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&name=collection1_shard1_replica_n3&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1584 2> 346121 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:collection1 s: r: x: t:null-1010] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:33207__c&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1983 2> 346122 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:collection1 s: r: x: t:null-1011] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:46873__c&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1955 2> 346127 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:collection1 s: r: x: t:null-1009] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:32875__c&action=ADDREPLICA&collection=collection1&shard=shard3&type=NRT&wt=javabin&version=2} status=0 QTime=2012 2> 346130 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 3 active replicas in collection: collection1 2> 346132 INFO (OverseerCollectionConfigSetProcessor-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 346139 INFO (OverseerCollectionConfigSetProcessor-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000006 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 346145 INFO (OverseerCollectionConfigSetProcessor-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000007 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 346191 INFO (zkCallback-1161-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 304] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346191 INFO (zkCallback-1146-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 304] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346191 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 304] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346192 INFO (zkCallback-1199-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 304] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346193 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 304] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346194 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 304] for collection [collection1] has occurred - updating... (live nodes size: [4]) 2> 346195 INFO (zkCallback-1199-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 304] for collection [collection1] has occurred - updating... 
(live nodes size: [4]) 2> 346215 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Starting test 2> 346216 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest replication factor test running 2> 346217 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Wait for recoveries to finish - wait 30SECONDS for each attempt 2> 346217 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection:collection1 failOnTimeout:true timeout:30SECONDS 2> 346221 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Recoveries finished - collection:collection1 2> 346256 INFO (qtp1639791653-1929) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1015] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 27 2> 346296 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=49506,localport=46873], receiveBufferSize: 65536 2> 346300 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=59118,localport=33207], receiveBufferSize: 65536 2> 346313 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=40767,localport=60632], receiveBufferSize=65536 2> 346321 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=42052], receiveBufferSize=65536 2> 346355 INFO (qtp1274442124-2045) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1016] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:32875/_c/collection1_shard3_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 11 2> 346358 INFO (qtp1461542799-1986) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1016] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:32875/_c/collection1_shard3_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 12 2> 346360 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1016] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={_stateVer_=collection1:5&waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 91 2> 346376 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=59120,localport=33207], receiveBufferSize: 65536 2> 346385 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=42068], receiveBufferSize=65536 2> 346397 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:collection1 
s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1017] o.a.s.c.S.Request webapp=/_c path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=4 2> 346407 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=44194,localport=32875], receiveBufferSize: 65536 2> 346417 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=40172], receiveBufferSize=65536 2> 346427 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1018] o.a.s.c.S.Request webapp=/_c path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 346438 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=49520,localport=46873], receiveBufferSize: 65536 2> 346447 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=40767,localport=60636], receiveBufferSize=65536 2> 346456 INFO (qtp1274442124-2041) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1019] o.a.s.c.S.Request webapp=/_c path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 348459 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Testing replication factor handling for repfacttest_c8n_1x3 2> 348491 INFO (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_1x3 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection repfacttest_c8n_1x3 2> 348665 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"repfacttest_c8n_1x3_shard1_replica_n1", 2> "node_name":"127.0.0.1:32875__c", 2> "base_url":"http://127.0.0.1:32875/_c", 2> "collection":"repfacttest_c8n_1x3", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 348709 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "collection":"repfacttest_c8n_1x3", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 348729 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "collection":"repfacttest_c8n_1x3", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 348849 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 323] for collection [repfacttest_c8n_1x3] has occurred - updating... 
(live nodes size: [4]) 2> 348859 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=52034,localport=40643], receiveBufferSize: 65536 2> 348860 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=repfacttest_c8n_1x3&version=2&replicaType=NRT&coreNodeName=core_node3&name=repfacttest_c8n_1x3_shard1_replica_n1&action=CREATE&numShards=1&shard=shard1&wt=javabin 2> 348863 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=repfacttest_c8n_1x3&version=2&replicaType=NRT&coreNodeName=core_node6&name=repfacttest_c8n_1x3_shard1_replica_n4&action=CREATE&numShards=1&shard=shard1&wt=javabin 2> 348871 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=53086], receiveBufferSize=65536 2> 348891 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=repfacttest_c8n_1x3&version=2&replicaType=NRT&coreNodeName=core_node5&name=repfacttest_c8n_1x3_shard1_replica_n2&action=CREATE&numShards=1&shard=shard1&wt=javabin 2> 348935 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 348939 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 348948 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 348950 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.s.IndexSchema Schema name=test 2> 348953 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.s.IndexSchema Schema name=test 2> 348960 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.s.IndexSchema Schema name=test 2> 349652 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 349661 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 349671 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 349829 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 
r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.CoreContainer Creating SolrCore 'repfacttest_c8n_1x3_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 349838 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores/repfacttest_c8n_1x3_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores/repfacttest_c8n_1x3_shard1_replica_n1/data/] 2> 349857 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.c.CoreContainer Creating SolrCore 'repfacttest_c8n_1x3_shard1_replica_n4' using configuration from configset conf1, trusted=true 2> 349860 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 349862 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/repfacttest_c8n_1x3_shard1_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/repfacttest_c8n_1x3_shard1_replica_n4/data/] 2> 349888 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.c.CoreContainer Creating SolrCore 'repfacttest_c8n_1x3_shard1_replica_n2' using configuration from configset conf1, trusted=true 2> 349894 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/repfacttest_c8n_1x3_shard1_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/repfacttest_c8n_1x3_shard1_replica_n2/data/] 2> 349901 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=24, maxMergedSegmentMB=96.44478511810303, floorSegmentMB=0.5299530029296875, forceMergeDeletesPctAllowed=2.7259169653814954, segmentsPerTier=33.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=22.936533526459158 2> 349902 WARN (qtp1455143443-2011) [n:127.0.0.1:32875__c 
c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 349918 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 349932 WARN (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 349943 WARN (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 350146 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 350146 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 350168 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 350168 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 350184 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 350198 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 350199 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH 
numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 350202 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 350206 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 350212 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 350214 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735685721587712 2> 350220 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 350220 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 350247 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 350262 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 350262 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 350265 INFO (searcherExecutor-1230-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1021 core_node3 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 350272 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node3=0}, version=0} for registerTerm 2> 350279 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/repfacttest_c8n_1x3/leaders/shard1 2> 350279 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 
x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 350282 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 350290 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 350292 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735685803376640 2> 350297 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 350298 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 350322 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 350329 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ShardLeaderElectionContext Waiting until we see more replicas up for shard shard1: total=3 found=1 timeoute in=14994ms 2> 350340 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=0, core_node3=0}, version=1} for registerTerm 2> 350341 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/repfacttest_c8n_1x3/leaders/shard1 2> 350345 INFO (searcherExecutor-1232-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n4 null-1022 core_node6 127.0.0.1:33207__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1022] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 350348 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 350352 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/conf1 2> 350358 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 350359 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735685873631232 2> 350403 INFO (searcherExecutor-1234-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n2 null-1023 core_node5 127.0.0.1:40643__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 350403 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=0, core_node3=0, core_node5=0}, version=2} for registerTerm 2> 350407 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1023] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/repfacttest_c8n_1x3/leaders/shard1 2> 350838 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 350839 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 350839 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ 2> 350845 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.PeerSync PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n1 url=http://127.0.0.1:32875/_c START replicas=[http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/, http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/] nUpdates=100 2> 350854 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.u.PeerSync PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n1 url=http://127.0.0.1:32875/_c DONE. We have no versions. sync failed. 
2> 350856 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=52050,localport=40643], receiveBufferSize: 65536 2> 350857 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=59128,localport=33207], receiveBufferSize: 65536 2> 350867 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=53096], receiveBufferSize=65536 2> 350872 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=42074], receiveBufferSize=65536 2> 350883 INFO (qtp1639791653-1929) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1021] o.a.s.c.S.Request webapp=/_c path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 350884 INFO (qtp1461542799-1986) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1021] o.a.s.c.S.Request webapp=/_c path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 350887 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 350887 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 350888 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/repfacttest_c8n_1x3/leaders/shard1/leader after winning as /collections/repfacttest_c8n_1x3/leader_elect/shard1/election/72077814988931084-core_node3-n_0000000000 2> 350906 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ shard1 2> 351021 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 358] for collection [repfacttest_c8n_1x3] has occurred - updating... 
(live nodes size: [4]) 2> 351030 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1021] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 351058 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x: t:null-1021] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&newCollection=true&name=repfacttest_c8n_1x3_shard1_replica_n1&action=CREATE&numShards=1&collection=repfacttest_c8n_1x3&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2200 2> 351154 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 361] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 351154 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 361] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 351399 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c: s: r: x: t:null-1022] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node6&collection.configName=conf1&newCollection=true&name=repfacttest_c8n_1x3_shard1_replica_n4&action=CREATE&numShards=1&collection=repfacttest_c8n_1x3&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2538 2> 351465 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c: s: r: x: t:null-1023] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node5&collection.configName=conf1&newCollection=true&name=repfacttest_c8n_1x3_shard1_replica_n2&action=CREATE&numShards=1&collection=repfacttest_c8n_1x3&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2576 2> 351479 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s: r: x: t:null-1020] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 351559 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 368] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 351559 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 368] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 351559 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 368] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 351559 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 368] for collection [repfacttest_c8n_1x3] has occurred - updating... 
(live nodes size: [4]) 2> 351565 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s: r: x: t:null-1020] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={pullReplicas=0&collection.configName=conf1&nrtReplicas=3&name=repfacttest_c8n_1x3&action=CREATE&numShards=1&tlogReplicas=0&wt=javabin&version=2} status=0 QTime=3091 2> 351571 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active slice count: 1 expected: 1 2> 351571 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active replica count: 3 expected replica count: 3 2> 351602 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 351603 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Took 31.0 ms to see all replicas become active. 2> 351603 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing docId=1 2> 351700 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=52052,localport=40643], receiveBufferSize: 65536 2> 351701 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1024] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=1, core_node3=1, core_node5=1}, version=3} for ensureHighestTermsAreNotZero 2> 351709 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=53102], receiveBufferSize=65536 2> 351764 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1024] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1 (1788735687205322752)]} 0 37 2> 351769 INFO (qtp1461542799-1981) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1024] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1 (1788735687205322752)]} 0 67 2> 351772 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1024] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1 (1788735687205322752)]} 0 156 2> 351856 INFO (qtp1639791653-1925) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1025] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1134 (1788735687363657728)]} 0 18 2> 351865 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1025] 
o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1134 (1788735687363657728)]} 0 26 2> 351872 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1025] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1134 (1788735687363657728)]} 0 93 2> 351952 INFO (qtp1639791653-1929) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1026] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[1134 (-1788735687480049664)]} 0 10 2> 351957 INFO (qtp1461542799-1984) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1026] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[1134 (-1788735687480049664)]} 0 15 2> 351961 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1026] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{delete=[1134 (-1788735687480049664)]} 0 77 2> 352022 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1027] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1167 (1788735687561838592)]} 0 4 2> 352027 INFO (qtp1461542799-1985) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1027] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1167 (1788735687561838592)]} 0 9 2> 352030 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1027] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1167 (1788735687561838592)]} 0 63 2> 352374 INFO (qtp1639791653-1924) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1028] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735687639433216&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(1167) (-1788735687639433216)} 0 125 2> 352379 INFO (qtp1461542799-1986) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1028] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735687639433216&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(1167) (-1788735687639433216)} 0 130 2> 352384 INFO (qtp1455143443-2010) 
[n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1028] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{deleteByQuery=id:(1167) (-1788735687639433216)} 0 344 2> 352386 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Closing one proxy port 2> 352392 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 4 connections to: http://127.0.0.1:33207/_c, target: http://127.0.0.1:35371/_c 2> 352398 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing docId=2 2> 352439 INFO (qtp1639791653-1926) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1029] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[2 (1788735688023212032)]} 0 10 2> 352443 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1029 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1029] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ to http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) 
[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> 352474 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1029] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 352478 WARN (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1029] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:33207/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 352504 INFO (OverseerCollectionConfigSetProcessor-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000010 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 352515 ERROR (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1029] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node6 with url http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 352533 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1029] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=1, core_node3=2, core_node5=2}, version=4} for ensureTermsIsHigher 2> 352534 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1029] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[2 (1788735688023212032)]} 0 127 2> 352540 INFO (zkCallback-1161-thread-2) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveringCoreTermWatcher Start recovery on core_node6 because core's term is less than leader's term 2> 352542 INFO (updateExecutor-1156-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.DefaultSolrCoreState Running recovery 2> 352550 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 352556 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Found new versions added after startup: num=[5] 2> 352556 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy currentVersions size=5 range=[-1788735687639433216 to 1788735687205322752] 2> 352556 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 352560 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=44202,localport=32875], receiveBufferSize: 65536 2> 352564 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=40198], receiveBufferSize=65536 2> 352570 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1031] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=4 2> 352571 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1031] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} status=0 QTime=4 2> 352572 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[repfacttest_c8n_1x3_shard1_replica_n4] 2> 352579 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=tlog{file=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/repfacttest_c8n_1x3_shard1_replica_n4/data/tlog/tlog.0000000000000000000 refcount=1}} 2> 352579 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Publishing state of core [repfacttest_c8n_1x3_shard1_replica_n4] as recovering, leader is [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] and I am [http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/] 2> 352590 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=2, core_node6_recovering=1, core_node3=2, core_node5=2}, version=5} for startRecovering 2> 352597 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:32875/_c]; [WaitForState: action=PREPRECOVERY&core=repfacttest_c8n_1x3_shard1_replica_n1&nodeName=127.0.0.1:33207__c&coreNodeName=core_node6&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 352604 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1030] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1279 (1788735688164769792)]} 0 9 2> 352605 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1032] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node6, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 352606 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1032] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 352607 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1032] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 352607 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1030] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1279 (1788735688164769792)]} 0 66 2> 352608 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1032] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 352608 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1032] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 352716 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 380] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 352716 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 380] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 352716 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 380] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 352716 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 380] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 352721 INFO (watches-1182-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 352723 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1032] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:33207__c&onlyIfLeaderActive=true&core=repfacttest_c8n_1x3_shard1_replica_n1&coreNodeName=core_node6&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=120 2> 352701 ERROR (updateExecutor-1175-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1033 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1033] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=delete{_version_=-1788735688238170112,query=`id:(1279)`,commitWithin=-1}; node=StdNode: http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ to http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> 352826 INFO (qtp1639791653-1925) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1033] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735688238170112&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(1279) (-1788735688238170112)} 0 84 2> 352829 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1033] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 352830 WARN (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1033] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:33207/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 352848 ERROR (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1033] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node6 with url http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 352861 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1033] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=2, core_node6_recovering=1, core_node3=3, core_node5=3}, version=6} for ensureTermsIsHigher 2> 352861 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1033] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{deleteByQuery=id:(1279) (-1788735688238170112)} 0 248 2> 352867 INFO (zkCallback-1161-thread-2) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveringCoreTermWatcher Start recovery on core_node6 because core's term is less than leader's term 2> 352870 WARN (updateExecutor-1156-thread-2-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Stopping recovery for core=[repfacttest_c8n_1x3_shard1_replica_n4] coreNodeName=[core_node6] 2> 352890 INFO (qtp1639791653-1929) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1034] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1409 (1788735688506605568)]} 0 9 2> 352893 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1034] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1409 (1788735688506605568)]} 0 24 2> 352956 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1035] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[1409 (-1788735688538062848)]} 0 7 2> 352960 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1035] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{delete=[1409 (-1788735688538062848)]} 0 61 2> 352963 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing batch of documents (30-45) 2> 353046 INFO (qtp1639791653-1924) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1036] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[30 (1788735688610414592), 31 (1788735688612511744), 32 (1788735688614608896), 33 (1788735688615657472), 34 (1788735688617754624), 35 (1788735688618803200), 36 (1788735688619851776), 37 (1788735688621948928), 38 
(1788735688622997504), 39 (1788735688625094656), ... (15 adds)]} 0 27 2> 353049 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1036] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[30 (1788735688610414592), 31 (1788735688612511744), 32 (1788735688614608896), 33 (1788735688615657472), 34 (1788735688617754624), 35 (1788735688618803200), 36 (1788735688619851776), 37 (1788735688621948928), 38 (1788735688622997504), 39 (1788735688625094656), ... (15 adds)]} 0 82 2> 353051 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Closing second proxy port 2> 353055 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 6 connections to: http://127.0.0.1:40643/_c, target: http://127.0.0.1:43165/_c 2> 353056 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing docId=3 2> 353083 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1037 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1037] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at 
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> 353106 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1037] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 353106 WARN (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1037] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 353132 ERROR (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1037] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 353145 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1037] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=2, core_node6_recovering=1, core_node3=4, core_node5=3}, version=7} for ensureTermsIsHigher 2> 353146 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1037] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[3 (1788735688722612224)]} 0 72 2> 353153 INFO (zkCallback-1122-thread-2) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveringCoreTermWatcher Start recovery on core_node5 because core's term is less than leader's term 2> 353156 INFO (updateExecutor-1117-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.DefaultSolrCoreState Running recovery 2> 353158 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 353158 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1038] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1510 (1788735688806498304)]} 0 4 2> 353160 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Found new versions added after startup: num=[25] 2> 353160 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy currentVersions size=25 range=[1788735688632434688 to 1788735687205322752] 2> 353160 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 353164 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=44204,localport=32875], receiveBufferSize: 65536 2> 353167 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=40202], receiveBufferSize=65536 2> 353175 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1040] 
o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=5 2> 353175 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1040] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} status=0 QTime=6 2> 353178 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[repfacttest_c8n_1x3_shard1_replica_n2] 2> 353184 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=tlog{file=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/repfacttest_c8n_1x3_shard1_replica_n2/data/tlog/tlog.0000000000000000000 refcount=1}} 2> 353184 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Publishing state of core [repfacttest_c8n_1x3_shard1_replica_n2] as recovering, leader is [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] and I am [http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/] 2> 353193 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=2, core_node6_recovering=1, core_node5_recovering=3, core_node3=4, core_node5=4}, version=8} for startRecovering 2> 353200 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:32875/_c]; [WaitForState: action=PREPRECOVERY&core=repfacttest_c8n_1x3_shard1_replica_n1&nodeName=127.0.0.1:40643__c&coreNodeName=core_node5&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 353224 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1041] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node5, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 353225 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x: t:null-1041] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 353225 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] - recoveringAfterStartup=[true] 2> 353226 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x: t:null-1041] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 353227 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x: t:null-1041] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 353227 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x: t:null-1041] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 353229 WARN (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 353233 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 
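The sequence above is the recovery path this test exercises: the replica that fell behind first attempts PeerSync against the leader, and because the leader has "no frame of reference to tell if we've missed updates", it falls back to full index replication. The surrounding java.net.ConnectException traces fit the test's own fault injection; ReplicationFactorTest drives its replicas through SocketProxy instances and has closed those ports (they are re-opened further down, at "Re-opening closed proxy ports"), so every connection attempt from the leader is refused. A minimal, self-contained Java sketch of that failure mode, assuming nothing is listening on the chosen local port (class name and port number are illustrative, not taken from this run):

    import java.io.IOException;
    import java.net.ConnectException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    // Illustrative only: reproduces a plain "Connection refused", the same
    // condition the Jetty HTTP/2 client surfaces in the traces above.
    public class ConnectRefusedDemo {
        public static void main(String[] args) {
            try (Socket s = new Socket()) {
                // Assumption: no server is bound to this local port.
                s.connect(new InetSocketAddress("127.0.0.1", 49999), 1000);
                System.out.println("Unexpectedly connected");
            } catch (ConnectException e) {
                System.err.println("Connection refused, as in the log: " + e.getMessage());
            } catch (IOException e) {
                System.err.println("Other I/O failure: " + e.getMessage());
            }
        }
    }
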
2> 353233 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy RecoveryStrategy has been closed 2> 353233 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[false] msTimeTaken=682.0 2> 353233 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=683.0 2> 353234 INFO (updateExecutor-1156-thread-2-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.DefaultSolrCoreState Running recovery 2> 353234 INFO (updateExecutor-1156-thread-2-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.ActionThrottle Throttling recovery attempts - waiting for 9308ms 2> 353315 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 386] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 353315 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 386] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 353315 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 386] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 353315 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 386] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 353321 INFO (watches-1182-thread-3) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 353322 INFO (watches-1182-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 353322 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c: s: r: x: t:null-1041] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:40643__c&onlyIfLeaderActive=true&core=repfacttest_c8n_1x3_shard1_replica_n1&coreNodeName=core_node5&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=100 2> 353304 ERROR (updateExecutor-1175-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1039 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=delete{_version_=-1788735688818032640,query=`id:(1510)`,commitWithin=-1}; node=StdNode: http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ to http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not 
a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> 353299 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1039 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=delete{_version_=-1788735688818032640,query=`id:(1510)`,commitWithin=-1}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 353331 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 2 errors 2> 353332 WARN (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:33207/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git 
checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 353352 ERROR (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node6 with url http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 353358 WARN (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 353377 ERROR (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 353389 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=2, core_node6_recovering=1, core_node5_recovering=3, core_node3=5, core_node5=4}, version=9} for ensureTermsIsHigher 2> 353389 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1039] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{deleteByQuery=id:(1510) (-1788735688818032640)} 0 225 2> 353395 INFO (zkCallback-1122-thread-2) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveringCoreTermWatcher Start recovery on core_node5 because core's term is less than leader's term 2> 353397 WARN (updateExecutor-1117-thread-2-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Stopping recovery for core=[repfacttest_c8n_1x3_shard1_replica_n2] coreNodeName=[core_node5] 2> 353407 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1042] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1599 (1788735689064448000)]} 0 7 2> 353422 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1043] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{delete=[1599 (-1788735689081225216)]} 0 6 2> 353426 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Re-opening closed proxy ports 2> 353426 INFO 
(TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Re-opening connectivity to http://127.0.0.1:33207/_c 2> 353430 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Re-opening connectivity to http://127.0.0.1:40643/_c 2> 353825 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] - recoveringAfterStartup=[true] 2> 353828 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 353829 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 353829 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy RecoveryStrategy has been closed 2> 353829 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[false] msTimeTaken=671.0 2> 353830 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Finished recovery process. 
recoveringAfterStartup=true msTimeTaken=671.0 2> 353830 INFO (updateExecutor-1117-thread-2-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.DefaultSolrCoreState Running recovery 2> 353831 INFO (updateExecutor-1117-thread-2-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.ActionThrottle Throttling recovery attempts - waiting for 9325ms 2> 355458 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 355458 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 355458 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 355962 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 355962 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 355962 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 356465 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 356466 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 356466 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 356970 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 356971 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 356971 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 357474 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 357474 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 357475 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 357978 INFO 
(TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 357979 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 357979 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 358482 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 358483 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 358483 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 358986 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 358987 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 358987 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 359490 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 359491 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 359491 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 359994 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 359995 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 359995 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 360499 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 360499 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 360500 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 361003 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 
in repfacttest_c8n_1x3 2> 361004 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 361004 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 361508 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 361508 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 361508 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 362012 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 362012 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 362012 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 362516 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 362516 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 362516 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 362544 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 362544 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Found new versions added after startup: num=[5] 2> 362545 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy currentVersions size=5 range=[-1788735687639433216 to 1788735687205322752] 2> 362545 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 362553 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1044] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=2 2> 362553 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1044] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} status=0 QTime=3 2> 362555 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[repfacttest_c8n_1x3_shard1_replica_n4] 2> 362561 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=BUFFERING, tlog=tlog{file=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/repfacttest_c8n_1x3_shard1_replica_n4/data/tlog/tlog.0000000000000000000 refcount=1}} 2> 362561 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Publishing state of core [repfacttest_c8n_1x3_shard1_replica_n4] as recovering, leader is [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] and I am [http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/] 2> 362566 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=5, core_node6_recovering=1, core_node5_recovering=3, core_node3=5, core_node5=4}, version=10} for startRecovering 2> 362572 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:32875/_c]; [WaitForState: action=PREPRECOVERY&core=repfacttest_c8n_1x3_shard1_replica_n1&nodeName=127.0.0.1:33207__c&coreNodeName=core_node6&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 362577 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1045] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node6, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 362578 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c: s: r: x: t:null-1045] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 362578 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c: s: r: x: t:null-1045] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:33207__c&onlyIfLeaderActive=true&core=repfacttest_c8n_1x3_shard1_replica_n1&coreNodeName=core_node6&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=2 2> 362581 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 395] for collection [repfacttest_c8n_1x3] has occurred - updating... 
(live nodes size: [4]) 2> 362581 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 395] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 362581 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 395] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 362581 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 395] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 363022 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 363023 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 363023 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 363080 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] - recoveringAfterStartup=[true] 2> 363081 WARN (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 363082 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 363082 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 363082 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/]. 
2> 363095 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=44856,localport=33207], receiveBufferSize: 65536 2> 363099 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=37052], receiveBufferSize=65536 2> 363157 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 363159 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Found new versions added after startup: num=[25] 2> 363159 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy currentVersions size=25 range=[1788735688632434688 to 1788735687205322752] 2> 363159 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 363165 INFO (qtp1461542799-1981) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1046] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 363166 INFO (qtp1461542799-1981) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1046] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 363168 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1047] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=3 2> 363168 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1047] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} status=0 QTime=3 2> 363170 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Begin buffering updates. 
core=[repfacttest_c8n_1x3_shard1_replica_n2] 2> 363177 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1046] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 91 2> 363180 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=BUFFERING, tlog=tlog{file=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/repfacttest_c8n_1x3_shard1_replica_n2/data/tlog/tlog.0000000000000000000 refcount=1}} 2> 363180 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Publishing state of core [repfacttest_c8n_1x3_shard1_replica_n2] as recovering, leader is [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] and I am [http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/] 2> 363186 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=5, core_node6_recovering=1, core_node5_recovering=3, core_node3=5, core_node5=5}, version=11} for startRecovering 2> 363191 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1048] o.a.s.c.S.Request webapp=/_c path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 363193 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Leader's generation: 2 2> 363194 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Leader's version: 1705871295187 2> 363194 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Follower's generation: 1 2> 363194 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Follower's version: 0 2> 363194 INFO 
(recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Starting replication process 2> 363195 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:32875/_c]; [WaitForState: action=PREPRECOVERY&core=repfacttest_c8n_1x3_shard1_replica_n1&nodeName=127.0.0.1:40643__c&coreNodeName=core_node5&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 363201 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1049] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node5, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 363202 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c: s: r: x: t:null-1049] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 363203 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c: s: r: x: t:null-1049] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:40643__c&onlyIfLeaderActive=true&core=repfacttest_c8n_1x3_shard1_replica_n1&coreNodeName=core_node5&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=5 2> 363209 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1050] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&wt=javabin&version=2&command=filelist} status=0 QTime=11 2> 363216 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Number of files in latest index in leader: 13 2> 363220 WARN (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher File _0.cfe did not match. expected checksum is 2835523528 and actual is checksum 2324257968. 
expected length is 1230 and actual length is 1230 2> 363221 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Starting download (fullCopy=true) to MockDirectoryWrapper(ByteBuffersDirectory@26f5f8ed lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@582c361e) 2> 363221 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher tmpIndexDir_type : class org.apache.lucene.tests.store.MockDirectoryWrapper , ByteBuffersDirectory@26f5f8ed lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@582c361e 2> 363233 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1051] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0_2.liv&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363279 WARN (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher File _0.cfe did not match. expected checksum is 2835523528 and actual is checksum 2324257968. expected length is 1230 and actual length is 1230 2> 363283 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1052] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0.cfe&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363304 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 399] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 363304 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 399] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 363304 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 399] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 363304 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 399] for collection [repfacttest_c8n_1x3] has occurred - updating... 
(live nodes size: [4]) 2> 363331 WARN (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher File _0.si did not match. expected checksum is 3803285895 and actual is checksum 1839935314. expected length is 350 and actual length is 350 2> 363335 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1053] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0.si&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363381 WARN (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher File _0.cfs did not match. expected checksum is 2402758324 and actual is checksum 3005892478. expected length is 6638 and actual length is 6638 2> 363386 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1054] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0.cfs&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363439 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1055] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1.cfs&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363492 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1056] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1.cfe&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 363531 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 363531 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 363531 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 363543 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1057] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1.si&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363592 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1058] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1_1.liv&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363646 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1059] o.a.s.c.S.Request webapp=/_c path=/replication 
params={generation=2&qt=/replication&file=_2.si&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 363700 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1060] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_2.cfe&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 363705 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] - recoveringAfterStartup=[true] 2> 363706 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 363707 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 363707 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 363707 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/]. 
2> 363736 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=37526,localport=40643], receiveBufferSize: 65536 2> 363738 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1061] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 363740 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1061] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 4 2> 363746 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=36758], receiveBufferSize=65536 2> 363750 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1062] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_2.cfs&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 363757 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1061] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 363757 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1061] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 3 2> 363760 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1061] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 49 2> 363777 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1063] o.a.s.c.S.Request webapp=/_c path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=5 2> 363778 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Leader's generation: 2 2> 363779 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Leader's version: 1705871295187 2> 363779 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c 
c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Follower's generation: 1 2> 363779 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Follower's version: 0 2> 363779 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Starting replication process 2> 363788 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1064] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&wt=javabin&version=2&command=filelist} status=0 QTime=6 2> 363794 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Number of files in latest index in leader: 13 2> 363797 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher File _0.cfe did not match. expected checksum is 2835523528 and actual is checksum 1614462168. expected length is 1230 and actual length is 1230 2> 363797 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Starting download (fullCopy=true) to MockDirectoryWrapper(ByteBuffersDirectory@38e03314 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@48db97c5) 2> 363797 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher tmpIndexDir_type : class org.apache.lucene.tests.store.MockDirectoryWrapper , ByteBuffersDirectory@38e03314 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@48db97c5 2> 363802 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1065] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_2_2.liv&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363807 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1066] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0_2.liv&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363853 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 
s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher File _0.cfe did not match. expected checksum is 2835523528 and actual is checksum 1614462168. expected length is 1230 and actual length is 1230 2> 363855 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1067] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=segments_2&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363858 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1068] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0.cfe&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363901 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Bytes downloaded: 26954, Bytes skipped downloading: 0 2> 363901 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher Total time taken for download (fullCopy=true,bytesDownloaded=26954) : 0 secs (null bytes/sec) to MockDirectoryWrapper(ByteBuffersDirectory@26f5f8ed lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@582c361e) 2> 363902 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.SolrCore Updating index properties... index=index.20240122000815315 2> 363905 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.h.IndexFetcher removing old index directory MockDirectoryWrapper(ByteBuffersDirectory@376df022 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@3149edc0) 2> 363910 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher File _0.si did not match. expected checksum is 3803285895 and actual is checksum 628898392. 
expected length is 350 and actual length is 350 2> 363912 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=47, maxMergedSegmentMB=29.03762722015381, floorSegmentMB=1.11749267578125, forceMergeDeletesPctAllowed=1.8816060697216785, segmentsPerTier=37.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0, deletesPctAllowed=41.914622909374664 2> 363917 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1069] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0.si&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 363929 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.DefaultSolrCoreState New IndexWriter is ready to be used. 2> 363964 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher File _0.cfs did not match. expected checksum is 2402758324 and actual is checksum 2517360286. expected length is 6638 and actual length is 6638 2> 363970 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1070] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_0.cfs&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 364018 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher File _1.cfs did not match. expected checksum is 1427640015 and actual is checksum 1874507480. 
expected length is 6374 and actual length is 6374 2> 364024 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1071] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1.cfs&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 364035 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 364035 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 364035 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 364036 INFO (searcherExecutor-1232-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 1 ms 2> 364039 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy No replay needed. 2> 364040 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 364040 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 364045 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=5, core_node5_recovering=3, core_node3=5, core_node5=5}, version=12} for doneRecovering 2> 364051 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 
2> 364060 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=1516.0 2> 364061 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=1516.0 2> 364071 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher File _1.cfe did not match. expected checksum is 4066343161 and actual is checksum 3520365441. expected length is 1230 and actual length is 1230 2> 364076 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1072] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1.cfe&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 364121 WARN (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher File _1.si did not match. expected checksum is 2541267383 and actual is checksum 2841908011. expected length is 350 and actual length is 350 2> 364126 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1073] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1.si&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 364164 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 403] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 364165 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 403] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 364165 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 403] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 364164 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 403] for collection [repfacttest_c8n_1x3] has occurred - updating... 
(live nodes size: [4]) 2> 364180 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1074] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_1_1.liv&checksum=true&wt=filestream&command=filecontent} status=0 QTime=2 2> 364855 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 364855 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 364890 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1075] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_2.si&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 364940 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1076] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_2.cfe&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 364990 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1077] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_2.cfs&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 365042 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1078] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=_2_2.liv&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 365093 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1079] o.a.s.c.S.Request webapp=/_c path=/replication params={generation=2&qt=/replication&file=segments_2&checksum=true&wt=filestream&command=filecontent} status=0 QTime=1 2> 365138 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Bytes downloaded: 26954, Bytes skipped downloading: 0 2> 365138 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher Total time taken for download (fullCopy=true,bytesDownloaded=26954) : 1 secs (26954 bytes/sec) to MockDirectoryWrapper(ByteBuffersDirectory@38e03314 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@48db97c5) 2> 365139 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.SolrCore Updating index properties... 
index=index.20240122000815892 2> 365142 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.h.IndexFetcher removing old index directory MockDirectoryWrapper(ByteBuffersDirectory@dbb0213 lockFactory=org.apache.lucene.store.SingleInstanceLockFactory@22ca5035) 2> 365153 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=47, maxMergedSegmentMB=29.03762722015381, floorSegmentMB=1.11749267578125, forceMergeDeletesPctAllowed=1.8816060697216785, segmentsPerTier=37.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0, deletesPctAllowed=41.914622909374664 2> 365170 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.DefaultSolrCoreState New IndexWriter is ready to be used. 2> 365277 INFO (searcherExecutor-1234-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 365280 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy No replay needed. 2> 365282 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 365282 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 365287 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=5, core_node3=5, core_node5=5}, version=13} for doneRecovering 2> 365292 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 
2> 365302 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=2144.0 2> 365302 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=2144.0 2> 365359 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 365359 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 365404 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 407] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 365404 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 407] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 365404 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 407] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 365404 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 407] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 365866 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 365866 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Took 10433.0 ms to see all replicas become active. 
2> 365867 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing docId=4 2> 365904 INFO (qtp1639791653-1925) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1080] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[4 (1788735702143336448)]} 0 13 2> 365951 INFO (qtp1461542799-1985) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1080] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[4 (1788735702143336448)]} 0 20 2> 365955 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1080] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[4 (1788735702143336448)]} 0 81 2> 366017 INFO (qtp1639791653-1929) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1081] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1690 (1788735702235611136)]} 0 6 2> 366023 INFO (qtp1461542799-1984) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1081] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1690 (1788735702235611136)]} 0 6 2> 366030 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1081] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1690 (1788735702235611136)]} 0 69 2> 366311 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1082] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735702313205760&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(1690) (-1788735702313205760)} 0 136 2> 366318 INFO (qtp1461542799-1986) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1082] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735702313205760&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(1690) (-1788735702313205760)} 0 144 2> 366340 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1082] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{deleteByQuery=id:(1690) (-1788735702313205760)} 0 304 2> 366373 INFO (qtp1639791653-1924) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 
x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1083] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1842 (1788735702646652928)]} 0 7 2> 366431 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1083] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1842 (1788735702646652928)]} 0 19 2> 366436 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1083] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1842 (1788735702646652928)]} 0 82 2> 366501 INFO (qtp1639791653-1926) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1084] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[1842 (-1788735702738927616)]} 0 6 2> 366525 INFO (qtp1461542799-1981) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1084] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[1842 (-1788735702738927616)]} 0 24 2> 366528 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1084] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{delete=[1842 (-1788735702738927616)]} 0 86 2> 366531 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing batch of documents (5-14) 2> 366601 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1085] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[5 (1788735702836445184), 6 (1788735702839590912), 7 (1788735702840639488), 8 (1788735702841688064), 9 (1788735702843785216), 10 (1788735702844833792), 11 (1788735702845882368), 12 (1788735702847979520), 13 (1788735702849028096), 14 (1788735702851125248)]} 0 16 2> 366607 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1085] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[5 (1788735702836445184), 6 (1788735702839590912), 7 (1788735702840639488), 8 (1788735702841688064), 9 (1788735702843785216), 10 (1788735702844833792), 11 (1788735702845882368), 12 (1788735702847979520), 13 (1788735702849028096), 14 (1788735702851125248)]} 0 15 2> 366611 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 
x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1085] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[5 (1788735702836445184), 6 (1788735702839590912), 7 (1788735702840639488), 8 (1788735702841688064), 9 (1788735702843785216), 10 (1788735702844833792), 11 (1788735702845882368), 12 (1788735702847979520), 13 (1788735702849028096), 14 (1788735702851125248)]} 0 76 2> 366698 INFO (qtp1639791653-1925) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1086] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1888 (1788735702923476992), 1952 (1788735702928719872), 1889 (1788735702929768448), 1922 (1788735702930817024), 1924 (1788735702932914176), 1895 (1788735702933962752), 1964 (1788735702935011328), 1869 (1788735702936059904), 1902 (1788735702937108480), 1935 (1788735702938157056), ... (15 adds)]} 0 17 2> 366714 INFO (qtp1461542799-1985) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1086] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[1888 (1788735702923476992), 1952 (1788735702928719872), 1889 (1788735702929768448), 1922 (1788735702930817024), 1924 (1788735702932914176), 1895 (1788735702933962752), 1964 (1788735702935011328), 1869 (1788735702936059904), 1902 (1788735702937108480), 1935 (1788735702938157056), ... (15 adds)]} 0 36 2> 366718 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1086] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[1888 (1788735702923476992), 1952 (1788735702928719872), 1889 (1788735702929768448), 1922 (1788735702930817024), 1924 (1788735702932914176), 1895 (1788735702933962752), 1964 (1788735702935011328), 1869 (1788735702936059904), 1902 (1788735702937108480), 1935 (1788735702938157056), ... 
(15 adds)]} 0 100 2> 367583 INFO (qtp1639791653-1929) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1087] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735703034626048&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(1888 OR 1952 OR 1889 OR 1922 OR 1924 OR 1895 OR 1964 OR 1869 OR 1902 OR 1935 OR 1936 OR 1905 OR 1874 OR 1946 OR 1947) (-1788735703034626048)} 0 691 2> 367592 INFO (qtp1461542799-1984) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1087] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735703034626048&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(1888 OR 1952 OR 1889 OR 1922 OR 1924 OR 1895 OR 1964 OR 1869 OR 1902 OR 1935 OR 1936 OR 1905 OR 1874 OR 1946 OR 1947) (-1788735703034626048)} 0 701 2> 367597 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1087] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{deleteByQuery=id:(1888 OR 1952 OR 1889 OR 1922 OR 1924 OR 1895 OR 1964 OR 1869 OR 1902 OR 1935 OR 1936 OR 1905 OR 1874 OR 1946 OR 1947) (-1788735703034626048)} 0 874 2> 367637 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1088] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[2080 (1788735703956324352), 2016 (1788735703963664384), 2020 (1788735703964712960), 2022 (1788735703966810112), 2023 (1788735703967858688), 1993 (1788735703969955840), 2060 (1788735703971004416), 2063 (1788735703973101568), 2031 (1788735703974150144), 2065 (1788735703976247296), ... (15 adds)]} 0 23 2> 367678 INFO (qtp1461542799-1986) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1088] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[2080 (1788735703956324352), 2016 (1788735703963664384), 2020 (1788735703964712960), 2022 (1788735703966810112), 2023 (1788735703967858688), 1993 (1788735703969955840), 2060 (1788735703971004416), 2063 (1788735703973101568), 2031 (1788735703974150144), 2065 (1788735703976247296), ... (15 adds)]} 0 24 2> 367682 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1088] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[2080 (1788735703956324352), 2016 (1788735703963664384), 2020 (1788735703964712960), 2022 (1788735703966810112), 2023 (1788735703967858688), 1993 (1788735703969955840), 2060 (1788735703971004416), 2063 (1788735703973101568), 2031 (1788735703974150144), 2065 (1788735703976247296), ... 
(15 adds)]} 0 78 2> 367786 INFO (qtp1639791653-1924) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1089] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[2080,2016,2020,2022,2023,1993,2060,2063,2031,2065,2035,2006,2038,2009,2077 (-1788735704049647616)]} 0 5 2> 367793 INFO (qtp1461542799-1983) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:null-1089] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[2080,2016,2020,2022,2023,1993,2060,2063,2031,2065,2035,2006,2038,2009,2077 (-1788735704049647616)]} 0 12 2> 367796 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1089] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{delete=[2080,2016,2020,2022,2023,1993,2060,2063,2031,2065,2035,2006,2038,2009,2077 (-1788735704049647616)]} 0 105 2> 367797 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Closing one proxy port (again) 2> 367798 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 1 connections to: http://127.0.0.1:33207/_c, target: http://127.0.0.1:35371/_c 2> 367798 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing docId=5 2> 367817 ERROR (updateExecutor-1175-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1090 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1090] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ to http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 
2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> 367858 INFO (qtp1639791653-1926) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1090] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[5 (1788735704169185280)]} 0 3 2> 367862 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1090] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 367863 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1090] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:33207/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 367880 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1090] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node6 with url http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 367897 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1090] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=5, core_node3=6, core_node5=6}, version=14} for ensureTermsIsHigher 2> 367898 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1090] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[5 (1788735704169185280)]} 0 92 2> 367901 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Indexing batch of documents (15-29) 2> 367902 INFO (zkCallback-1161-thread-2) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveringCoreTermWatcher Start recovery on core_node6 because core's term is less than leader's term 2> 367905 INFO (updateExecutor-1156-thread-2-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.DefaultSolrCoreState Running recovery 2> 367905 INFO (updateExecutor-1156-thread-2-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.ActionThrottle Throttling recovery attempts - waiting for 4637ms 2> 367974 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1091] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[15 (1788735704274042880), 16 (1788735704276140032), 17 (1788735704277188608), 18 (1788735704278237184), 19 (1788735704279285760), 20 (1788735704280334336), 21 (1788735704282431488), 22 (1788735704283480064), 23 (1788735704284528640), 24 (1788735704285577216), ... (15 adds)]} 0 16 2> 367977 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1091] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[15 (1788735704274042880), 16 (1788735704276140032), 17 (1788735704277188608), 18 (1788735704278237184), 19 (1788735704279285760), 20 (1788735704280334336), 21 (1788735704282431488), 22 (1788735704283480064), 23 (1788735704284528640), 24 (1788735704285577216), ... 
(15 adds)]} 0 71 2> 368052 INFO (qtp1639791653-1925) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1092] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[2144 (1788735704353734656), 2177 (1788735704355831808), 2181 (1788735704356880384), 2123 (1788735704357928960), 2187 (1788735704358977536), 2127 (1788735704360026112), 2130 (1788735704361074688), 2131 (1788735704362123264), 2165 (1788735704363171840), 2103 (1788735704364220416), ... (15 adds)]} 0 20 2> 368055 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1092] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[2144 (1788735704353734656), 2177 (1788735704355831808), 2181 (1788735704356880384), 2123 (1788735704357928960), 2187 (1788735704358977536), 2127 (1788735704360026112), 2130 (1788735704361074688), 2131 (1788735704362123264), 2165 (1788735704363171840), 2103 (1788735704364220416), ... (15 adds)]} 0 73 2> 368186 ERROR (updateExecutor-1175-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1093 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1093] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=delete{_version_=-1788735704435523584,query=`id:(2144 OR 2177 OR 2181 OR 2123 OR 2187 OR 2127 OR 2130 OR 2131 OR 2165 OR 2103 OR 2168 OR 2106 OR 2140 OR 2108 OR 2141)`,commitWithin=-1}; node=StdNode: http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ to http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> 368319 INFO (qtp1639791653-1929) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1093] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&_version_=-1788735704435523584&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{deleteByQuery=id:(2144 OR 2177 OR 2181 OR 2123 OR 2187 OR 2127 OR 2130 OR 2131 OR 2165 OR 2103 OR 2168 OR 2106 OR 2140 OR 2108 OR 2141) (-1788735704435523584)} 0 95 2> 368322 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1093] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 1 errors 2> 368323 WARN (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1093] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:33207/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 368337 ERROR (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1093] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node6 with url http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 368343 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1093] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{deleteByQuery=id:(2144 OR 2177 OR 2181 OR 2123 OR 2187 OR 2127 OR 2130 OR 2131 OR 2165 OR 2103 OR 2168 OR 2106 OR 2140 OR 2108 OR 2141) (-1788735704435523584)} 0 283 2> 368377 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1094] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{add=[2304 (1788735704738562048), 2274 (1788735704743804928), 2276 (1788735704745902080), 2213 (1788735704746950656), 2246 (1788735704747999232), 2217 (1788735704749047808), 2219 (1788735704751144960), 2284 (1788735704752193536), 2285 (1788735704753242112), 2254 (1788735704754290688), ... (15 adds)]} 0 19 2> 368380 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1094] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[2304 (1788735704738562048), 2274 (1788735704743804928), 2276 (1788735704745902080), 2213 (1788735704746950656), 2246 (1788735704747999232), 2217 (1788735704749047808), 2219 (1788735704751144960), 2284 (1788735704752193536), 2285 (1788735704753242112), 2254 (1788735704754290688), ... (15 adds)]} 0 31 2> 368442 INFO (qtp1639791653-1924) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:null-1095] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=FROMLEADER&distrib.from=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/&wt=javabin&version=2}{delete=[2304,2274,2276,2213,2246,2217,2219,2284,2285,2254,2224,2258,2268,2237,2238 (-1788735704778407936)]} 0 4 2> 368446 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1095] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{delete=[2304,2274,2276,2213,2246,2217,2219,2284,2285,2254,2224,2258,2268,2237,2238 (-1788735704778407936)]} 0 60 2> 368448 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Closing second proxy port (again) 2> 368448 WARN (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 1 connections to: http://127.0.0.1:40643/_c, target: http://127.0.0.1:43165/_c 2> 368463 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{_version_=1788735704856002560,id=35}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ 2> => 
java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 368483 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
2> 368628 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368645 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368665 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368691 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368717 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368750 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368773 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1096 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=(null)}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368791 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 15 errors
2> 368791 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c
2> => java.io.IOException: java.net.ConnectException: Connection refused
2>    at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104)
2> 368807 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term
2> => java.net.ConnectException: Connection refused
2>    at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
2> java.net.ConnectException: Connection refused
2>    at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?]
2>    at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?]
2>    at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?]
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 368813 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 368828 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 368833 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 368849 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 368856 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 368874 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 368880 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 368896 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 368901 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 368914 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 368919 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 369089 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 369098 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 369118 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 369124 WARN (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 369138 ERROR (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 369148 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=5, core_node3=7, core_node5=6}, version=15} for ensureTermsIsHigher 2> 369149 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1096] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[30 (1788735704850759680), 31 (1788735704852856832), 32 (1788735704853905408), 33 (1788735704854953984), 34 (1788735704854953985), 35 (1788735704856002560), 36 (1788735704859148288), 37 (1788735704860196864), 38 (1788735704861245440), 39 (1788735704862294016), ... 
(15 adds)]} 0 693 2> 369154 INFO (zkCallback-1122-thread-2) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveringCoreTermWatcher Start recovery on core_node5 because core's term is less than leader's term 2> 369156 INFO (updateExecutor-1117-thread-2-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.DefaultSolrCoreState Running recovery 2> 369156 INFO (updateExecutor-1117-thread-2-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.ActionThrottle Throttling recovery attempts - waiting for 4000ms 2> 369172 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1097] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[2371 (1788735705584762880), 2408 (1788735705585811456), 2409 (1788735705587908608), 2378 (1788735705587908609), 2347 (1788735705588957184), 2348 (1788735705590005760), 2419 (1788735705591054336), 2356 (1788735705592102912), 2389 (1788735705593151488), 2391 (1788735705594200064), ... (15 adds)]} 0 16 2> 369307 ERROR (updateExecutor-1175-thread-2-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1098 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=delete{_version_=-1788735705607831552,query=`id:(2371 OR 2408 OR 2409 OR 2378 OR 2347 OR 2348 OR 2419 OR 2356 OR 2389 OR 2391 OR 2392 OR 2329 OR 2330 OR 2333 OR 2398)`,commitWithin=-1}; node=StdNode: http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ to http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> 369307 ERROR (updateExecutor-1175-thread-1-processing-repfacttest_c8n_1x3_shard1_replica_n1 null-1098 core_node3 null 127.0.0.1:32875__c repfacttest_c8n_1x3 shard1) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=delete{_version_=-1788735705607831552,query=`id:(2371 OR 2408 OR 2409 OR 2378 OR 2347 OR 2348 OR 2419 OR 2356 OR 2389 OR 2391 OR 2392 OR 2329 OR 2330 OR 2333 OR 2398)`,commitWithin=-1}; node=StdNode: http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ to http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) [metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 
4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 369331 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.SolrCmdDistributor SolrCmdDistributor found 2 errors 2> 369332 WARN (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:40643/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git 
checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 369352 ERROR (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node5 with url http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 369359 WARN (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.p.DistributedZkUpdateProcessor Error sending update to http://127.0.0.1:33207/_c 2> => java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) 2> java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:170) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:100) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at 
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Suppressed: java.io.IOException: java.net.ConnectException: Connection refused 2> at org.eclipse.jetty.client.util.OutputStreamRequestContent$AsyncOutputStream.write(OutputStreamRequestContent.java:104) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.FastOutputStream.flush(FastOutputStream.java:207) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.FastOutputStream.flushBuffer(FastOutputStream.java:200) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.close(JavaBinCodec.java:1293) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.marshal(JavaBinUpdateRequestCodec.java:99) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.BinaryRequestWriter.write(BinaryRequestWriter.java:79) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.send(Http2SolrClient.java:435) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.sendUpdateStream(ConcurrentUpdateHttp2SolrClient.java:250) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient$Runner.run(ConcurrentUpdateHttp2SolrClient.java:192) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:212) ~[metrics-core-4.2.21.jar:4.2.21] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> Caused by: java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> ... 4 more 2> 369379 ERROR (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.p.DistributedZkUpdateProcessor Setting up to try to start recovery on replica core_node6 with url http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/ by increasing leader term 2> => java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) 2> java.net.ConnectException: Connection refused 2> at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?] 2> at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:774) ~[?:?] 2> at java.base/sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:120) ~[?:?] 
2> at org.eclipse.jetty.io.ClientConnector.connect(ClientConnector.java:428) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.HTTP2Client.connect(HTTP2Client.java:461) ~[http2-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:146) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.http2.client.http.HttpClientTransportOverHTTP2.connect(HttpClientTransportOverHTTP2.java:135) ~[http2-http-client-transport-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.connect(HttpClient.java:583) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:560) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpClient$1.succeeded(HttpClient.java:556) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.SocketAddressResolver$Async.lambda$resolve$1(SocketAddressResolver.java:180) ~[jetty-util-10.0.19.jar:10.0.19] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 369387 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1098] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{deleteByQuery=id:(2371 OR 2408 OR 2409 OR 2378 OR 2347 OR 2348 OR 2419 OR 2356 OR 2389 OR 2391 OR 2392 OR 2329 OR 2330 OR 2333 OR 2398) (-1788735705607831552)} 0 210 2> 369416 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1099] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{add=[2464 (1788735705834323968), 2532 (1788735705838518272), 2537 (1788735705839566848), 2475 (1788735705841664000), 2508 (1788735705842712576), 2476 (1788735705843761152), 2477 (1788735705844809728), 2448 (1788735705845858304), 2482 (1788735705847955456), 2451 (1788735705849004032), ... 
(15 adds)]} 0 23 2> 369433 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1100] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={wt=javabin&version=2}{delete=[2464,2532,2537,2475,2508,2476,2477,2448,2482,2451,2487,2455,2462,2494,2495 (-1788735705864732672)]} 0 10 2> 369435 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Re-opening connectivity to http://127.0.0.1:33207/_c 2> 369439 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Re-opening connectivity to http://127.0.0.1:40643/_c 2> 371462 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 371462 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently active 2> 371462 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently active 2> 371966 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 371966 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently active 2> 371966 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently active 2> 372469 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 372470 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently active 2> 372470 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently active 2> 372544 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=false 2> 372552 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1101] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=2 2> 372552 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1101] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} status=0 QTime=2 2> 372554 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Begin buffering updates. 
core=[repfacttest_c8n_1x3_shard1_replica_n4] 2> 372558 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=tlog{file=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/repfacttest_c8n_1x3_shard1_replica_n4/data/tlog/tlog.0000000000000000000 refcount=1}} 2> 372559 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Publishing state of core [repfacttest_c8n_1x3_shard1_replica_n4] as recovering, leader is [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] and I am [http://127.0.0.1:33207/_c/repfacttest_c8n_1x3_shard1_replica_n4/] 2> 372563 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=7, core_node6_recovering=5, core_node3=7, core_node5=6}, version=16} for startRecovering 2> 372569 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:32875/_c]; [WaitForState: action=PREPRECOVERY&core=repfacttest_c8n_1x3_shard1_replica_n1&nodeName=127.0.0.1:33207__c&coreNodeName=core_node6&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 372573 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1102] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node6, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 372574 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1102] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 372574 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1102] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 372575 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1102] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 372576 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1102] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 372579 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 417] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 372579 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 417] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 372579 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 417] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 372579 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 417] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 372586 INFO (watches-1182-thread-2) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 372586 INFO (watches-1182-thread-3) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:33207__c, coreNodeName=core_node6, onlyIfActiveCheckResult=false, nodeProps: core_node6:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 372588 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c: s: r: x: t:null-1102] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:33207__c&onlyIfLeaderActive=true&core=repfacttest_c8n_1x3_shard1_replica_n1&coreNodeName=core_node6&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=16 2> 372975 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 372975 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently active 2> 372975 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 373090 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] - recoveringAfterStartup=[false] 2> 373091 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n4 url=http://127.0.0.1:33207/_c START leader=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ nUpdates=100 2> 373099 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=42950,localport=32875], receiveBufferSize: 65536 2> 373103 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=52510], receiveBufferSize=65536 2> 373158 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=false 2> 373171 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1104] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=2 2> 373171 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1104] o.a.s.c.S.Request webapp=/_c path=/admin/ping params={wt=javabin&version=2} status=0 QTime=3 2> 373173 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[repfacttest_c8n_1x3_shard1_replica_n2] 2> 373179 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=tlog{file=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/repfacttest_c8n_1x3_shard1_replica_n2/data/tlog/tlog.0000000000000000000 refcount=1}} 2> 373180 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Publishing state of core [repfacttest_c8n_1x3_shard1_replica_n2] as recovering, leader is [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] and I am [http://127.0.0.1:40643/_c/repfacttest_c8n_1x3_shard1_replica_n2/] 2> 373185 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=7, core_node6_recovering=5, core_node5_recovering=6, core_node3=7, core_node5=7}, version=17} for startRecovering 2> 373189 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1103] o.a.s.u.IndexFingerprint IndexFingerprint millis:82.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46} 2> 373190 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1103] o.a.s.c.S.Request webapp=/_c path=/get params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2} status=0 QTime=85 2> 373194 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:32875/_c]; [WaitForState: 
action=PREPRECOVERY&core=repfacttest_c8n_1x3_shard1_replica_n1&nodeName=127.0.0.1:40643__c&coreNodeName=core_node5&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 373199 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1105] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node5, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 373200 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c: s: r: x: t:null-1105] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 373201 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c: s: r: x: t:null-1105] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 373202 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c: s: r: x: t:null-1105] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 373203 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c: s: r: x: t:null-1105] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=active, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"active", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 373254 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.IndexFingerprint IndexFingerprint millis:60.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735703983587328, maxInHash=1788735703983587328, versionsHash=192063643304484494, numVersions=44, numDocs=44, maxDoc=29} 2> 373254 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader Fingerprint comparison result: 1 2> 373255 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader Leader fingerprint: {maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46}, Our fingerprint: {maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735703983587328, maxInHash=1788735703983587328, versionsHash=192063643304484494, numVersions=44, numDocs=44, maxDoc=29} 2> 373263 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1106] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46} 2> 373265 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1106] o.a.s.c.S.Request webapp=/_c path=/get params={distrib=false&qt=/get&fingerprint=true&getVersions=100&wt=javabin&version=2} status=0 QTime=4 2> 373267 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader Leader fingerprint {maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46} 2> 373270 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n4 url=http://127.0.0.1:33207/_c Received 100 versions from 
http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ 2> 373275 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n4 url=http://127.0.0.1:33207/_c Requesting updates from http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ n=95 versions=1788735704169185280...-1788735705864732672 2> 373303 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 421] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373303 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 421] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373304 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 421] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373304 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 421] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373306 INFO (qtp1455143443-2015) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1107] o.a.s.c.S.Request webapp=/_c path=/get params={getUpdates=1788735704169185280...-1788735705864732672&distrib=false&qt=/get&onlyIfActive=false&skipDbq=true&wt=javabin&version=2} status=0 QTime=28 2> 373309 INFO (watches-1182-thread-3) [n:127.0.0.1:32875__c c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=repfacttest_c8n_1x3, shard=shard1, thisCore=repfacttest_c8n_1x3_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:40643__c, coreNodeName=core_node5, onlyIfActiveCheckResult=false, nodeProps: core_node5:{ 2> "core":"repfacttest_c8n_1x3_shard1_replica_n2", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 373310 INFO (qtp1455143443-2010) [n:127.0.0.1:32875__c c: s: r: x: t:null-1105] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:40643__c&onlyIfLeaderActive=true&core=repfacttest_c8n_1x3_shard1_replica_n1&coreNodeName=core_node5&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=112 2> 373482 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 373482 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 373482 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node6 is currently recovering 2> 373619 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.p.LogUpdateProcessorFactory {add=[5 (1788735704169185280), 15 (1788735704274042880), 16 (1788735704276140032), 17 (1788735704277188608), 18 (1788735704278237184), 19 (1788735704279285760), 20 (1788735704280334336), 21 (1788735704282431488), 22 (1788735704283480064), 23 (1788735704284528640), ... 
(91 adds)], deleteByQuery=id:(2144 OR 2177 OR 2181 OR 2123 OR 2187 OR 2127 OR 2130 OR 2131 OR 2165 OR 2103 OR 2168 OR 2106 OR 2140 OR 2108 OR 2141) (-1788735704435523584), delete=[2304,2274,2276,2213,2246,2217,2219,2284,2285,2254,2224,2258,2268,2237,2238 (-1788735704778407936), 2464,2532,2537,2475,2508,2476,2477,2448,2482,2451,2487,2455,2462,2494,2495 (-1788735705864732672)], deleteByQuery=id:(2371 OR 2408 OR 2409 OR 2378 OR 2347 OR 2348 OR 2419 OR 2356 OR 2389 OR 2391 OR 2392 OR 2329 OR 2330 OR 2333 OR 2398) (-1788735705607831552)} 0 258 2> 373675 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.IndexFingerprint IndexFingerprint millis:54.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=45} 2> 373675 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader Fingerprint comparison result: 0 2> 373675 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n4 url=http://127.0.0.1:33207/_c DONE. sync succeeded 2> 373748 INFO (searcherExecutor-1232-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 373751 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful. 2> 373751 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync. 2> 373751 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy No replay needed. 2> 373752 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 
2> 373756 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=7, core_node5_recovering=6, core_node3=7, core_node5=7}, version=18} for doneRecovering 2> 373760 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 373769 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=1224.0 2> 373769 INFO (recoveryExecutor-1158-thread-1-processing-127.0.0.1:33207__c repfacttest_c8n_1x3_shard1_replica_n4 repfacttest_c8n_1x3 shard1 core_node6) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=false msTimeTaken=1225.0 2> 373812 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/] - recoveringAfterStartup=[false] 2> 373813 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n2 url=http://127.0.0.1:40643/_c START leader=http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ nUpdates=100 2> 373818 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=42960,localport=32875], receiveBufferSize: 65536 2> 373822 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=52518], receiveBufferSize=65536 2> 373827 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1108] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46} 2> 373827 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1108] o.a.s.c.S.Request webapp=/_c path=/get params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2} status=0 QTime=3 2> 373878 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] 
o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 425] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373878 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 425] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373878 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 425] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373878 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 425] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 373927 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.IndexFingerprint IndexFingerprint millis:98.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735704760582144, maxInHash=1788735704760582144, versionsHash=5607511469555871560, numVersions=74, numDocs=74, maxDoc=46} 2> 373928 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader Fingerprint comparison result: 1 2> 373928 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader Leader fingerprint: {maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46}, Our fingerprint: {maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735704760582144, maxInHash=1788735704760582144, versionsHash=5607511469555871560, numVersions=74, numDocs=74, maxDoc=46} 2> 373939 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1109] o.a.s.u.IndexFingerprint IndexFingerprint millis:1.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46} 2> 373941 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1109] o.a.s.c.S.Request webapp=/_c path=/get params={distrib=false&qt=/get&fingerprint=true&getVersions=100&wt=javabin&version=2} status=0 QTime=5 2> 373943 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 
repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader Leader fingerprint {maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46} 2> 373943 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n2 url=http://127.0.0.1:40643/_c Received 100 versions from http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ 2> 373944 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n2 url=http://127.0.0.1:40643/_c Requesting updates from http://127.0.0.1:32875/_c/repfacttest_c8n_1x3_shard1_replica_n1/ n=47 versions=1788735704850759680...-1788735705864732672 2> 373954 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:null-1110] o.a.s.c.S.Request webapp=/_c path=/get params={getUpdates=1788735704850759680...-1788735705864732672&distrib=false&qt=/get&onlyIfActive=false&skipDbq=true&wt=javabin&version=2} status=0 QTime=7 2> 373989 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 373989 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Replica core_node5 is currently recovering 2> 374133 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.p.LogUpdateProcessorFactory {add=[30 (1788735704850759680), 31 (1788735704852856832), 32 (1788735704853905408), 33 (1788735704854953984), 34 (1788735704854953985), 35 (1788735704856002560), 36 (1788735704859148288), 37 (1788735704860196864), 38 (1788735704861245440), 39 (1788735704862294016), ... 
(45 adds)], deleteByQuery=id:(2371 OR 2408 OR 2409 OR 2378 OR 2347 OR 2348 OR 2419 OR 2356 OR 2389 OR 2391 OR 2392 OR 2329 OR 2330 OR 2333 OR 2398) (-1788735705607831552), delete=[2464,2532,2537,2475,2508,2476,2477,2448,2482,2451,2487,2455,2462,2494,2495 (-1788735705864732672)]} 0 131 2> 374201 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.IndexFingerprint IndexFingerprint millis:67.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=1788735705854246912, maxInHash=1788735705854246912, versionsHash=-6928541743164708226, numVersions=89, numDocs=89, maxDoc=46} 2> 374201 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader Fingerprint comparison result: 0 2> 374201 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.PeerSyncWithLeader PeerSync: core=repfacttest_c8n_1x3_shard1_replica_n2 url=http://127.0.0.1:40643/_c DONE. sync succeeded 2> 374276 INFO (searcherExecutor-1234-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 374279 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful. 2> 374279 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync. 2> 374279 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy No replay needed. 2> 374281 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 
2> 374286 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.ZkShardTerms Successful update of terms at /collections/repfacttest_c8n_1x3/terms/shard1 to Terms{values={core_node6=7, core_node3=7, core_node5=7}, version=19} for doneRecovering 2> 374290 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 374301 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=1142.0 2> 374301 INFO (recoveryExecutor-1119-thread-1-processing-127.0.0.1:40643__c repfacttest_c8n_1x3_shard1_replica_n2 repfacttest_c8n_1x3 shard1 core_node5) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=false msTimeTaken=1142.0 2> 374403 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 429] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 374403 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 429] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 374403 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 429] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 374403 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 429] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [4]) 2> 374496 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Found 3 replicas and leader on 127.0.0.1:32875__c for shard1 in repfacttest_c8n_1x3 2> 374496 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Took 3054.0 ms to see all replicas become active. 
2> 374497 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Wait for recoveries to finish - wait 30SECONDS for each attempt 2> 374497 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection:collection1 failOnTimeout:true timeout:30SECONDS 2> 374500 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractDistribZkTestBase Recoveries finished - collection:collection1 2> 374519 INFO (qtp1639791653-1928) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-1111] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 13 2> 374525 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=51350,localport=33207], receiveBufferSize: 65536 2> 374531 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=41720], receiveBufferSize=65536 2> 374560 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=55636,localport=46873], receiveBufferSize: 65536 2> 374563 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=39744,localport=32875], receiveBufferSize: 65536 2> 374575 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=35244], receiveBufferSize=65536 2> 374578 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=40767,localport=33730], receiveBufferSize=65536 2> 374594 INFO (qtp1274442124-2045) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1112] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33207/_c/collection1_shard2_replica_n2/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 7 2> 374601 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1112] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33207/_c/collection1_shard2_replica_n2/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 15 2> 374604 INFO (qtp1461542799-1985) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1112] o.a.s.u.p.LogUpdateProcessorFactory webapp=/_c path=/update params={_stateVer_=collection1:5&waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 72 2> 374619 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=51366,localport=33207], receiveBufferSize: 65536 2> 374625 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=41730], receiveBufferSize=65536 2> 374630 INFO 
(qtp1461542799-1986) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:null-1113] o.a.s.c.S.Request webapp=/_c path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=3 2> 374643 INFO (qtp1455143443-2011) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:null-1114] o.a.s.c.S.Request webapp=/_c path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 374655 INFO (qtp1274442124-2040) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:null-1115] o.a.s.c.S.Request webapp=/_c path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=2 2> 376658 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ReplicationFactorTest Now testing replication factor handling for repfacttest_c8n_2x2 2> 376696 INFO (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection repfacttest_c8n_2x2 2> 376872 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"repfacttest_c8n_2x2_shard1_replica_n1", 2> "node_name":"127.0.0.1:40643__c", 2> "base_url":"http://127.0.0.1:40643/_c", 2> "collection":"repfacttest_c8n_2x2", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 376893 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"repfacttest_c8n_2x2_shard2_replica_n2", 2> "node_name":"127.0.0.1:46873__c", 2> "base_url":"http://127.0.0.1:46873/_c", 2> "collection":"repfacttest_c8n_2x2", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 376912 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"repfacttest_c8n_2x2_shard1_replica_n4", 2> "node_name":"127.0.0.1:33207__c", 2> "base_url":"http://127.0.0.1:33207/_c", 2> "collection":"repfacttest_c8n_2x2", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 376932 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"repfacttest_c8n_2x2_shard2_replica_n6", 2> "node_name":"127.0.0.1:32875__c", 2> "base_url":"http://127.0.0.1:32875/_c", 2> "collection":"repfacttest_c8n_2x2", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 377049 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_2x2/state.json zxid: 451] for collection [repfacttest_c8n_2x2] has occurred - updating... 
(live nodes size: [4]) 2> 377061 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=48988,localport=40643], receiveBufferSize: 65536 2> 377062 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=55638,localport=46873], receiveBufferSize: 65536 2> 377063 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=51372,localport=33207], receiveBufferSize: 65536 2> 377070 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy accepted Socket[addr=/127.0.0.1,port=39756,localport=32875], receiveBufferSize: 65536 2> 377078 INFO (SocketProxy-Acceptor-46873) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=40767,localport=33744], receiveBufferSize=65536 2> 377083 INFO (SocketProxy-Acceptor-33207) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=35371,localport=41746], receiveBufferSize=65536 2> 377085 INFO (SocketProxy-Acceptor-32875) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=38185,localport=35258], receiveBufferSize=65536 2> 377087 INFO (SocketProxy-Acceptor-40643) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy proxy connection Socket[addr=/127.0.0.1,port=43165,localport=55102], receiveBufferSize=65536 2> 377089 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=repfacttest_c8n_2x2&version=2&replicaType=NRT&coreNodeName=core_node5&name=repfacttest_c8n_2x2_shard2_replica_n2&action=CREATE&numShards=2&shard=shard2&wt=javabin 2> 377090 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=repfacttest_c8n_2x2&version=2&replicaType=NRT&coreNodeName=core_node7&name=repfacttest_c8n_2x2_shard1_replica_n4&action=CREATE&numShards=2&shard=shard1&wt=javabin 2> 377092 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=repfacttest_c8n_2x2&version=2&replicaType=NRT&coreNodeName=core_node8&name=repfacttest_c8n_2x2_shard2_replica_n6&action=CREATE&numShards=2&shard=shard2&wt=javabin 2> 377094 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=repfacttest_c8n_2x2&version=2&replicaType=NRT&coreNodeName=core_node3&name=repfacttest_c8n_2x2_shard1_replica_n1&action=CREATE&numShards=2&shard=shard1&wt=javabin 2> 377145 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 377146 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 377146 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c 
c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 377148 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 377158 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.s.IndexSchema Schema name=test 2> 377160 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.s.IndexSchema Schema name=test 2> 377160 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.s.IndexSchema Schema name=test 2> 377161 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.s.IndexSchema Schema name=test 2> 392075 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:40643/_c 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 
2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 11 more 2> 392136 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:46873/_c 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 392164 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:33207/_c 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 392191 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 392192 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 392192 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 392178 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:32875/_c 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 392193 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id 2> 392194 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Cleaning up collection [repfacttest_c8n_2x2]. 2> 392227 INFO (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Executing Collection Cmd=action=UNLOAD&deleteInstanceDir=true&deleteDataDir=true, asyncId=null 2> 392285 WARN (qtp1455143443-2013) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1122] o.a.s.c.CoreContainer Cannot unload non-existent core 'repfacttest_c8n_2x2_shard2_replica_n6' 2> 392285 WARN (qtp1274442124-2040) [n:127.0.0.1:46873__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1123] o.a.s.c.CoreContainer Cannot unload non-existent core 'repfacttest_c8n_2x2_shard2_replica_n2' 2> 392285 WARN (qtp1461542799-1984) [n:127.0.0.1:33207__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1121] o.a.s.c.CoreContainer Cannot unload non-existent core 'repfacttest_c8n_2x2_shard1_replica_n4' 2> 392286 WARN (qtp1639791653-1926) [n:127.0.0.1:40643__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1124] o.a.s.c.CoreContainer Cannot unload non-existent core 'repfacttest_c8n_2x2_shard1_replica_n1' 2> 392314 ERROR (qtp1461542799-1984) [n:127.0.0.1:33207__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1121] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n4] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) 2> org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n4] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$1(CoreAdminOperation.java:128) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 392314 ERROR (qtp1274442124-2040) [n:127.0.0.1:46873__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1123] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n2] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) 2> org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n2] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$1(CoreAdminOperation.java:128) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 392313 ERROR (qtp1455143443-2013) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1122] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n6] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) 2> org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n6] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$1(CoreAdminOperation.java:128) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 392315 ERROR (qtp1639791653-1926) [n:127.0.0.1:40643__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1124] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n1] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) 2> org.apache.solr.common.SolrException: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n1] 2> at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:2142) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$1(CoreAdminOperation.java:128) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 392362 INFO (qtp1639791653-1926) [n:127.0.0.1:40643__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1124] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=repfacttest_c8n_2x2_shard1_replica_n1&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=400 QTime=70 2> 392362 INFO (qtp1455143443-2013) [n:127.0.0.1:32875__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1122] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=repfacttest_c8n_2x2_shard2_replica_n6&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=400 QTime=70 2> 392363 INFO (qtp1461542799-1984) [n:127.0.0.1:33207__c c: s: r: x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1121] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=repfacttest_c8n_2x2_shard1_replica_n4&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=400 QTime=68 2> 392364 INFO (qtp1274442124-2040) [n:127.0.0.1:46873__c c: s: r: x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1123] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=repfacttest_c8n_2x2_shard2_replica_n2&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=400 QTime=69 2> 392401 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:40643/_c 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:40643/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n1] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:40643/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n1] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.lambda$onHeaders$0(Http2SolrClient.java:481) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> 392417 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:33207/_c 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:33207/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n4] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:33207/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard1_replica_n4] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.lambda$onHeaders$0(Http2SolrClient.java:481) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 392424 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:32875/_c 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:32875/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n6] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:32875/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n6] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.lambda$onHeaders$0(Http2SolrClient.java:481) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> 392430 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:46873/_c 2> => org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:46873/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n2] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) 2> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:46873/_c/admin/cores: Cannot unload non-existent core [repfacttest_c8n_2x2_shard2_replica_n2] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:920) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient.processErrorsAndResponse(Http2SolrClient.java:576) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.lambda$onHeaders$0(Http2SolrClient.java:481) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 392495 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.c.CoreContainer Creating SolrCore 'repfacttest_c8n_2x2_shard2_replica_n6' using configuration from configset conf1, trusted=true 2> 392504 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores/repfacttest_c8n_2x2_shard2_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-2-001/cores/repfacttest_c8n_2x2_shard2_replica_n6/data/] 2> 392514 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.c.CoreContainer Creating SolrCore 'repfacttest_c8n_2x2_shard1_replica_n4' using configuration from configset conf1, trusted=true 2> 392519 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/repfacttest_c8n_2x2_shard1_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-1-001/cores/repfacttest_c8n_2x2_shard1_replica_n4/data/] 2> 392556 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 392556 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 392557 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.c.CoreContainer Creating SolrCore 'repfacttest_c8n_2x2_shard2_replica_n2' using configuration from configset conf1, trusted=true 2> 392557 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.c.CoreContainer Creating SolrCore 'repfacttest_c8n_2x2_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 392560 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001/cores/repfacttest_c8n_2x2_shard2_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/shard-3-001/cores/repfacttest_c8n_2x2_shard2_replica_n2/data/] 2> 392560 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/repfacttest_c8n_2x2_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001/control-001/cores/repfacttest_c8n_2x2_shard1_replica_n1/data/] 2> 392567 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDeleted path:/collections/repfacttest_c8n_2x2/state.json zxid: 460] for collection [repfacttest_c8n_2x2] has occurred - updating... (live nodes size: [4]) 2> 392568 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDeleted path:/collections/repfacttest_c8n_2x2/state.json zxid: 460] for collection [repfacttest_c8n_2x2] has occurred - updating... 
(live nodes size: [4]) 2> 392585 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=40, maxMergedSegmentMB=105.98709392547607, floorSegmentMB=0.24309539794921875, forceMergeDeletesPctAllowed=18.166870365028444, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=47.18653045948602 2> 392585 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=24, maxMergedSegmentMB=96.44478511810303, floorSegmentMB=0.5299530029296875, forceMergeDeletesPctAllowed=2.7259169653814954, segmentsPerTier=33.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0, deletesPctAllowed=22.936533526459158 2> 392601 INFO (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Cleaned up artifacts for failed create collection for [repfacttest_c8n_2x2] 2> 392606 ERROR (OverseerThreadFactory-1128-thread-3) [n: c:repfacttest_c8n_2x2 s: r: x: t:] o.a.s.c.a.c.OverseerCollectionMessageHandler Collection repfacttest_c8n_2x2}, operation create failed 2> => org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: repfacttest_c8n_2x2 2> at org.apache.solr.cloud.api.collections.CreateCollectionCmd.call(CreateCollectionCmd.java:447) 2> org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: repfacttest_c8n_2x2 2> at org.apache.solr.cloud.api.collections.CreateCollectionCmd.call(CreateCollectionCmd.java:447) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.processMessage(OverseerCollectionMessageHandler.java:129) [main/:?] 2> at org.apache.solr.cloud.OverseerTaskProcessor$Runner.run(OverseerTaskProcessor.java:564) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> 392611 WARN (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 392611 WARN (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 392611 WARN (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 392611 WARN (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 392629 ERROR (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s: r: x: t:null-1116] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: repfacttest_c8n_2x2 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) 2> org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: repfacttest_c8n_2x2 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.admin.api.CreateCollectionAPI.createCollection(CreateCollectionAPI.java:140) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.lambda$static$0(CollectionsHandler.java:513) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.execute(CollectionsHandler.java:1265) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.invokeAction(CollectionsHandler.java:315) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.handleRequestBody(CollectionsHandler.java:293) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 
2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:822) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) 
[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 392642 INFO (qtp1455143443-2014) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s: r: x: t:null-1116] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={pullReplicas=0&collection.configName=conf1&nrtReplicas=2&name=repfacttest_c8n_2x2&action=CREATE&numShards=2&tlogReplicas=0&wt=javabin&version=2} status=400 QTime=15969 2> 392658 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Ending test 2> 392822 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@e3d4963{STOPPING}[10.0.19,sto=0] 2> 392825 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@550ca21e{STOPPING}[10.0.19,sto=0] 2> 392825 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@9e82187{STOPPING}[10.0.19,sto=0] 2> 392826 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@6296cee8{STOPPING}[10.0.19,sto=0] 2> 392849 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@5322d1b4{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 392849 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@4c084e5d{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 392865 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@75081bba{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 392866 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@24b0c50c{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 392866 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@48c1479c{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 392867 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@5d5e94e6{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 392868 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@5284241c{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 392870 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@1fb42ac7{/_c,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 392966 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 392966 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.u.UpdateLog Initializing 
UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 392967 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 392967 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 392968 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 392968 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 392968 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 392969 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 392989 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 392989 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 392990 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 392990 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 392991 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 392991 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 392992 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 392992 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 393005 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class 
org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 393005 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 393006 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 393009 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=33, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.6324117596483627] 2> 393048 ERROR (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.r.ManagedResourceStorage Failed to get config name due to 2> => org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) 2> org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) [main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1179) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1057) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 
2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 393048 ERROR (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.r.ManagedResourceStorage Failed to get config name due to 2> => org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) 2> org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) [main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1179) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1057) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 393070 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@52ae7e86 repfacttest_c8n_2x2_shard2_replica_n2 2> 393048 ERROR (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.r.ManagedResourceStorage Failed to get config name due to 2> => org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) 2> org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) [main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1179) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1057) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 393048 ERROR (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.r.ManagedResourceStorage Failed to get config name due to 2> => org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) 2> org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) [main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1179) [main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1057) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 393071 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.repfacttest_c8n_2x2.shard2.replica_n2 tag=SolrCore@52ae7e86 2> 393071 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2b31b4b4 repfacttest_c8n_2x2_shard1_replica_n4 2> 393084 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@67b1152 repfacttest_c8n_2x2_shard2_replica_n6 2> 393088 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@559f2185 repfacttest_c8n_2x2_shard1_replica_n1 2> 393088 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.repfacttest_c8n_2x2.shard2.replica_n6 tag=SolrCore@67b1152 2> 393088 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.repfacttest_c8n_2x2.shard1.replica_n4 tag=SolrCore@2b31b4b4 2> 393092 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.repfacttest_c8n_2x2.shard1.replica_n1 tag=SolrCore@559f2185 2> 393094 INFO (searcherExecutor-1260-thread-1-processing-repfacttest_c8n_2x2_shard2_replica_n2 null-1117 core_node5 127.0.0.1:46873__c repfacttest_c8n_2x2 shard2) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 393095 INFO (searcherExecutor-1258-thread-1-processing-repfacttest_c8n_2x2_shard1_replica_n4 null-1118 core_node7 127.0.0.1:33207__c repfacttest_c8n_2x2 shard1) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 393095 INFO (searcherExecutor-1256-thread-1-processing-repfacttest_c8n_2x2_shard2_replica_n6 null-1119 core_node8 127.0.0.1:32875__c repfacttest_c8n_2x2 shard2) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 393096 INFO (searcherExecutor-1262-thread-1-processing-repfacttest_c8n_2x2_shard1_replica_n1 null-1120 core_node3 127.0.0.1:40643__c repfacttest_c8n_2x2 shard1) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 393339 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] 
o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.repfacttest_c8n_2x2.shard2.leader tag=SolrCore@52ae7e86 2> 393347 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c:repfacttest_c8n_2x2 s:shard2 r:core_node5 x:repfacttest_c8n_2x2_shard2_replica_n2 t:null-1117] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 393352 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.repfacttest_c8n_2x2.shard1.leader tag=SolrCore@559f2185 2> 393359 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c:repfacttest_c8n_2x2 s:shard1 r:core_node3 x:repfacttest_c8n_2x2_shard1_replica_n1 t:null-1120] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 393366 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.repfacttest_c8n_2x2.shard1.leader tag=SolrCore@2b31b4b4 2> 393366 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.repfacttest_c8n_2x2.shard2.leader tag=SolrCore@67b1152 2> 393374 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c:repfacttest_c8n_2x2 s:shard2 r:core_node8 x:repfacttest_c8n_2x2_shard2_replica_n6 t:null-1119] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 393374 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c:repfacttest_c8n_2x2 s:shard1 r:core_node7 x:repfacttest_c8n_2x2_shard1_replica_n4 t:null-1118] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 393391 ERROR (qtp1455143443-2012) [n:127.0.0.1:32875__c c: s: r: x: t:null-1119] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard2_replica_n6': Unable to create core [repfacttest_c8n_2x2_shard2_replica_n6] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) 2> org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard2_replica_n6': Unable to create core [repfacttest_c8n_2x2_shard2_replica_n6] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 
2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: org.apache.solr.common.SolrException: Unable to create core [repfacttest_c8n_2x2_shard2_replica_n6] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1739) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1232) ~[main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:103) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 
44 more 2> 393391 ERROR (qtp1274442124-2044) [n:127.0.0.1:46873__c c: s: r: x: t:null-1117] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard2_replica_n2': Unable to create core [repfacttest_c8n_2x2_shard2_replica_n2] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) 2> org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard2_replica_n2': Unable to create core [repfacttest_c8n_2x2_shard2_replica_n2] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: org.apache.solr.common.SolrException: Unable to create core [repfacttest_c8n_2x2_shard2_replica_n2] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1739) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1232) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:103) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 
44 more 2> 393424 INFO (qtp1455143443-2012) [n:127.0.0.1:32875__c c: s: r: x: t:null-1119] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&newCollection=true&name=repfacttest_c8n_2x2_shard2_replica_n6&action=CREATE&numShards=2&collection=repfacttest_c8n_2x2&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=400 QTime=16334 2> 393426 INFO (qtp1274442124-2044) [n:127.0.0.1:46873__c c: s: r: x: t:null-1117] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node5&collection.configName=conf1&newCollection=true&name=repfacttest_c8n_2x2_shard2_replica_n2&action=CREATE&numShards=2&collection=repfacttest_c8n_2x2&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=400 QTime=16339 2> 393392 ERROR (qtp1461542799-1982) [n:127.0.0.1:33207__c c: s: r: x: t:null-1118] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard1_replica_n4': Unable to create core [repfacttest_c8n_2x2_shard1_replica_n4] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) 2> org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard1_replica_n4': Unable to create core [repfacttest_c8n_2x2_shard1_replica_n4] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: org.apache.solr.common.SolrException: Unable to create core [repfacttest_c8n_2x2_shard1_replica_n4] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1739) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1232) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:103) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> 393392 ERROR (qtp1639791653-1927) [n:127.0.0.1:40643__c c: s: r: x: t:null-1120] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard1_replica_n1': Unable to create core [repfacttest_c8n_2x2_shard1_replica_n1] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) 2> org.apache.solr.common.SolrException: Error CREATEing SolrCore 'repfacttest_c8n_2x2_shard1_replica_n1': Unable to create core [repfacttest_c8n_2x2_shard1_replica_n1] Caused by: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1614) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) ~[main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 
2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: org.apache.solr.common.SolrException: Unable to create core [repfacttest_c8n_2x2_shard1_replica_n1] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1739) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1232) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Failed to load config name for collection:repfacttest_c8n_2x2 due to: 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:103) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 44 more 2> Caused by: org.apache.solr.common.SolrException: Could not find collection : repfacttest_c8n_2x2 2> at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:121) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.rest.ManagedResourceStorage.newStorageIO(ManagedResourceStorage.java:100) ~[main/:?] 2> at org.apache.solr.core.SolrCore.initRestManager(SolrCore.java:3260) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1179) ~[main/:?] 2> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:1057) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1706) ~[main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) ~[main/:?] 2> ... 
44 more 2> 393435 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1546002582 2> 393438 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1902520038 2> 393439 INFO (qtp1461542799-1982) [n:127.0.0.1:33207__c c: s: r: x: t:null-1118] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&newCollection=true&name=repfacttest_c8n_2x2_shard1_replica_n4&action=CREATE&numShards=2&collection=repfacttest_c8n_2x2&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=400 QTime=16351 2> 393441 INFO (qtp1639791653-1927) [n:127.0.0.1:40643__c c: s: r: x: t:null-1120] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&newCollection=true&name=repfacttest_c8n_2x2_shard1_replica_n1&action=CREATE&numShards=2&collection=repfacttest_c8n_2x2&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=400 QTime=16349 2> 393441 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:46873__c 2> 393441 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:32875__c 2> 393444 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=852473705 2> 393445 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:33207__c 2> 393448 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 393448 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:46873__c as DOWN 2> 393449 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 393449 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:32875__c as DOWN 2> 393450 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 393450 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33207__c as DOWN 2> 393451 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (1) 2> 393451 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (1) 2> 393451 INFO (zkCallback-1199-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (1) 2> 393452 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(4) -> (1) 2> 393485 INFO (coreCloseExecutor-1287-thread-1) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@53878feb collection1_shard1_replica_n3 2> 393487 INFO (coreCloseExecutor-1288-thread-1) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2b039adb collection1_shard2_replica_n2 2> 393487 INFO (coreCloseExecutor-1287-thread-1) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n3 tag=SolrCore@53878feb 2> 393488 INFO (coreCloseExecutor-1288-thread-1) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard2.replica_n2 tag=SolrCore@2b039adb 2> 393488 INFO (coreCloseExecutor-1288-thread-2) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@6307be12 repfacttest_c8n_1x3_shard1_replica_n4 2> 393489 INFO (coreCloseExecutor-1289-thread-1) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@28b50bba collection1_shard3_replica_n1 2> 393490 INFO (coreCloseExecutor-1289-thread-2) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7d14ea1 repfacttest_c8n_1x3_shard1_replica_n1 2> 393490 INFO (coreCloseExecutor-1289-thread-1) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard3.replica_n1 tag=SolrCore@28b50bba 2> 393603 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 474] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [1]) 2> 393604 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 474] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [1]) 2> 393604 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 474] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [1]) 2> 393604 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 474] for collection [repfacttest_c8n_1x3] has occurred - updating... 
(live nodes size: [1]) 2> 393615 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 475] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 393615 INFO (zkCallback-1161-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 475] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 393615 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 475] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 393615 INFO (zkCallback-1199-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 475] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 393615 INFO (zkCallback-1199-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 475] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 393615 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 475] for collection [collection1] has occurred - updating... (live nodes size: [1]) 2> 393650 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1457949995 2> 393651 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:40643__c 2> 393655 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 393656 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:40643__c as DOWN 2> 393657 INFO (zkCallback-1199-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 393658 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 393658 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (0) 2> 393658 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(1) -> (0) 2> 393678 INFO (coreCloseExecutor-1292-thread-1) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@54649c39 control_collection_shard1_replica_n1 2> 393678 INFO (coreCloseExecutor-1292-thread-1) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@54649c39 2> 393678 INFO (coreCloseExecutor-1292-thread-2) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7312fde1 repfacttest_c8n_1x3_shard1_replica_n2 2> 393787 INFO (OverseerCollectionConfigSetProcessor-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000012 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 393789 INFO (zkCallback-1161-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 480] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [0]) 2> 393790 INFO (zkCallback-1180-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 480] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [0]) 2> 393790 INFO (zkCallback-1180-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 480] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [0]) 2> 393790 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/repfacttest_c8n_1x3/state.json zxid: 480] for collection [repfacttest_c8n_1x3] has occurred - updating... (live nodes size: [0]) 2> 393799 INFO (zkCallback-1122-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 482] for collection [control_collection] has occurred - updating... (live nodes size: [0]) 2> 393799 INFO (zkCallback-1122-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 482] for collection [control_collection] has occurred - updating... (live nodes size: [0]) 2> 393975 INFO (coreCloseExecutor-1287-thread-1) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@53878feb 2> 393989 INFO (coreCloseExecutor-1287-thread-1) [n:127.0.0.1:46873__c c:collection1 s:shard1 r:core_node6 x:collection1_shard1_replica_n3 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
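The status=400 responses recorded above are CoreAdmin CREATE calls the test issues against its Jetty nodes while building the repfacttest_c8n_2x2 collection. Purely as an illustration (the test JVMs and their ports are ephemeral, so nothing here can be replayed), the failing shard2 request corresponds to a standalone HTTP call roughly like the following, with the query parameters taken from the params={...} entry above and wt switched from javabin to json so the response is readable:

    curl 'http://127.0.0.1:46873/_c/admin/cores?action=CREATE&name=repfacttest_c8n_2x2_shard2_replica_n2&collection=repfacttest_c8n_2x2&shard=shard2&coreNodeName=core_node5&collection.configName=conf1&newCollection=true&numShards=2&replicaType=NRT&wt=json'

It fails because, per the caused-by chain above, ClusterState.getCollection cannot find repfacttest_c8n_2x2 in the cluster state seen by that node at core-creation time.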
2> 393995 INFO (coreCloseExecutor-1289-thread-1) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard3.leader tag=SolrCore@28b50bba 2> 393996 INFO (coreCloseExecutor-1289-thread-2) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.repfacttest_c8n_1x3.shard1.replica_n1 tag=SolrCore@7d14ea1 2> 394005 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 394008 INFO (coreCloseExecutor-1289-thread-1) [n:127.0.0.1:32875__c c:collection1 s:shard3 r:core_node4 x:collection1_shard3_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 394026 INFO (coreCloseExecutor-1288-thread-1) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard2.leader tag=SolrCore@2b039adb 2> 394026 INFO (coreCloseExecutor-1288-thread-2) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.repfacttest_c8n_1x3.shard1.replica_n4 tag=SolrCore@6307be12 2> 394032 INFO (coreCloseExecutor-1288-thread-1) [n:127.0.0.1:33207__c c:collection1 s:shard2 r:core_node5 x:collection1_shard2_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 394200 INFO (coreCloseExecutor-1292-thread-1) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@54649c39 2> 394201 INFO (coreCloseExecutor-1292-thread-2) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.repfacttest_c8n_1x3.shard1.replica_n2 tag=SolrCore@7312fde1 2> 394207 INFO (coreCloseExecutor-1292-thread-1) [n:127.0.0.1:40643__c c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 394276 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 394551 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 394551 INFO (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 394609 INFO (coreCloseExecutor-1289-thread-2) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.repfacttest_c8n_1x3.shard1.leader tag=SolrCore@7d14ea1 2> 394615 INFO (coreCloseExecutor-1289-thread-2) [n:127.0.0.1:32875__c c:repfacttest_c8n_1x3 s:shard1 r:core_node3 x:repfacttest_c8n_1x3_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 
2> 394649 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 394664 INFO (coreCloseExecutor-1288-thread-2) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.repfacttest_c8n_1x3.shard1.leader tag=SolrCore@6307be12 2> 394676 WARN (closeThreadPool-1268-thread-3) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 5 connections to: http://127.0.0.1:46873/_c, target: http://127.0.0.1:40767/_c 2> 394682 INFO (coreCloseExecutor-1288-thread-2) [n:127.0.0.1:33207__c c:repfacttest_c8n_1x3 s:shard1 r:core_node6 x:repfacttest_c8n_1x3_shard1_replica_n4 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 394702 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 394704 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 394727 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 394727 INFO (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 394752 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 394776 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 394777 INFO (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 394786 INFO (coreCloseExecutor-1292-thread-2) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.repfacttest_c8n_1x3.shard1.leader tag=SolrCore@7312fde1 2> 394797 INFO (coreCloseExecutor-1292-thread-2) [n:127.0.0.1:40643__c c:repfacttest_c8n_1x3 s:shard1 r:core_node5 x:repfacttest_c8n_1x3_shard1_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 
2> 394812 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 394840 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 394850 WARN (closeThreadPool-1268-thread-4) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 11 connections to: http://127.0.0.1:32875/_c, target: http://127.0.0.1:38185/_c 2> 394858 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 394859 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 394864 INFO (closeThreadPool-1296-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814988931076-127.0.0.1:40643__c-n_0000000000) closing 2> 394866 INFO (OverseerStateUpdate-72077814988931076-127.0.0.1:40643__c-n_0000000000) [n:127.0.0.1:40643__c c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:40643__c 2> 394886 INFO (closeThreadPool-1296-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814988931076-127.0.0.1:40643__c-n_0000000000) closing 2> 394893 WARN (closeThreadPool-1268-thread-5) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 3 connections to: http://127.0.0.1:33207/_c, target: http://127.0.0.1:35371/_c 2> 394994 INFO (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077814988931076-127.0.0.1:40643__c-n_0000000000) closing 2> 394998 WARN (closeThreadPool-1268-thread-2) [n: c: s: r: x: t:] o.a.s.c.s.c.SocketProxy Closing 1 connections to: http://127.0.0.1:40643/_c, target: http://127.0.0.1:43165/_c 2> 395004 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-23 after mutting 0 log messages 2> 395004 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-24 for ERROR logs matching regex: ignore_exception 2> 395007 INFO (TEST-ReplicationFactorTest.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Shutting down ZkTestServer. 
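The ReplicationFactorTest suite summary that follows ends with a randomizedtesting reproduce line. Assuming the unpacked source tree is still present at the path the log reports further down (/build/solr/src/solr-9.4.1) and that the Gradle wrapper, or an equivalent system gradle, can be invoked there, the failing test could be re-run in isolation with the same seed, locale and timezone to check whether it fails deterministically:

    cd /build/solr/src/solr-9.4.1
    ./gradlew test --tests ReplicationFactorTest.test \
        -Dtests.seed=246C98A4C257C021 -Dtests.locale=shi \
        -Dtests.timezone=Africa/Mogadishu -Dtests.asserts=true \
        -Dtests.file.encoding=UTF-8

A run that passes with the recorded seed would point at an environment- or timing-dependent failure rather than a deterministically reproducible bug.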
2> 395229 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Watch limit violations: 2> Maximum concurrent create/delete watches above limit: 2> 2> 71 /solr/collections/repfacttest_c8n_1x3/terms/shard1 2> 6 /solr/clusterprops.json 2> 6 /solr/aliases.json 2> 4 /solr/packages.json 2> 4 /solr/security.json 2> 4 /solr/configs/conf1 2> 3 /solr/collections/collection1/collectionprops.json 2> 3 /solr/collections/repfacttest_c8n_1x3/collectionprops.json 2> 2 /solr/collections/repfacttest_c8n_2x2/state.json 2> 2 /solr/collections/collection1/terms/shard1 2> 2 /solr/collections/collection1/terms/shard2 2> 2 /solr/collections/collection1/terms/shard3 2> 2 /solr/collections/control_collection/terms/shard1 2> 2> Maximum concurrent data watches above limit: 2> 2> 61 /solr/collections/repfacttest_c8n_1x3/state.json 2> 26 /solr/collections/collection1/state.json 2> 9 /solr/collections/control_collection/state.json 2> 3 /solr/collections/repfacttest_c8n_2x2/state.json 2> 2> Maximum concurrent children watches above limit: 2> 2> 84 /solr/overseer/queue 2> 27 /solr/live_nodes 2> 22 /solr/overseer/collection-queue-work 2> 21 /solr/collections 2> 8 /solr/collections/collection1/state.json 2> 5 /solr/collections/repfacttest_c8n_1x3/state.json 2> 3 /solr/collections/control_collection/state.json 2> 2 /solr/collections/repfacttest_c8n_2x2/state.json 2> > org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:38185/_c: Underlying core creation failed while creating collection: repfacttest_c8n_2x2 > at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:AC38A77E6CABADD9]:0) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) > at app//org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192) > at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2100) > at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2138) > at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2026) > at app//org.apache.solr.cloud.ReplicationFactorTest.createCollectionWithRetry(ReplicationFactorTest.java:514) > at app//org.apache.solr.cloud.ReplicationFactorTest.testRf2NotUsingDirectUpdates(ReplicationFactorTest.java:111) > at app//org.apache.solr.cloud.ReplicationFactorTest.test(ReplicationFactorTest.java:95) > at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) > at 
app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) > at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) > at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) > at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) 
> at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base@11.0.22/java.lang.Thread.run(Thread.java:829) 2> NOTE: reproduce with: gradlew test --tests ReplicationFactorTest.test -Dtests.seed=246C98A4C257C021 -Dtests.locale=shi -Dtests.timezone=Africa/Mogadishu -Dtests.asserts=true -Dtests.file.encoding=UTF-8 2> 395295 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-24 after mutting 0 log messages 2> 395296 INFO (SUITE-ReplicationFactorTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-25 for ERROR logs matching regex: ignore_exception 2> NOTE: leaving temporary files on disk at: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.ReplicationFactorTest_246C98A4C257C021-001 2> NOTE: test params are: codec=Asserting(Lucene95): {multiDefault=PostingsFormat(name=Direct), a_t=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene90)), _root_=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene90)), id=BlockTreeOrds(blocksize=128), text=PostingsFormat(name=LuceneFixedGap)}, docValues:{range_facet_l_dv=DocValuesFormat(name=Lucene90), _version_=DocValuesFormat(name=Lucene90), intDefault=DocValuesFormat(name=Asserting), id_i1=DocValuesFormat(name=Lucene90), range_facet_i_dv=DocValuesFormat(name=Asserting), intDvoDefault=DocValuesFormat(name=Lucene90), range_facet_l=DocValuesFormat(name=Asserting), timestamp=DocValuesFormat(name=Lucene90)}, maxPointsInLeafNode=1268, maxMBSortInHeap=6.055975984627296, sim=Asserting(RandomSimilarity(queryNorm=false): {}), locale=shi, timezone=Africa/Mogadishu 2> NOTE: Linux 6.1.61-3-sophgo-08357-g369f7207fe69 riscv64/Oracle Corporation 11.0.22 (64-bit)/cpus=1,threads=2,free=295306528,total=497025024 2> NOTE: All tests run in this JVM: [SampleTest, TestJoin, TestLuceneMatchVersion, TestExportTool, BasicDistributedZkTest, CollectionStateZnodeTest, DeleteStatusTest, HttpPartitionTest, MoveReplicaTest, OverseerStatusTest, ReplicationFactorTest] org.apache.solr.cloud.api.collections.TestReplicaProperties > test FAILED org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:45309/mr/x: Underlying core creation failed while creating collection: testcollection at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:AC38A77E6CABADD9]:0) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263) at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) at app//org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:386) at 
app//org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:352) at app//org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1199) at app//org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:898) at app//org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:826) at app//org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192) at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2103) at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2163) at app//org.apache.solr.cloud.api.collections.TestReplicaProperties.test(TestReplicaProperties.java:54) at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566) at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163) at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134) at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at app//com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at 
app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base@11.0.22/java.lang.Thread.run(Thread.java:829) org.apache.solr.cloud.api.collections.TestReplicaProperties > test suite's output saved to /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.api.collections.TestReplicaProperties.txt, copied below: 2> 444653 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/build/solr/src/solr-9.4.1/solr/server/solr/configsets/_default/conf' 2> 444655 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom 2> 444659 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-45 after mutting 0 log messages 2> 444660 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-46 for ERROR logs matching regex: ignore_exception 2> 444665 INFO 
(SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Created dataDir: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/data-dir-18-001 2> 444667 WARN (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=41 numCloses=41 2> 444668 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Using TrieFields (NUMERIC_POINTS_SYSPROP=false) w/NUMERIC_DOCVALUES_SYSPROP=false 2> 444673 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0, clientAuth=0.0/0.0) 2> 444674 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /mr/x 2> 444751 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-46 after mutting 0 log messages 2> 444752 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-47 for ERROR logs matching regex: ignore_exception 2> 444757 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER 2> 444758 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer client port: 0.0.0.0/0.0.0.0:0 2> 444759 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Starting server 2> 444769 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.z.s.ServerCnxnFactory maxCnxns is not configured, using default value 0. 2> 444858 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer start zk server on port: 41191 2> 444866 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 444875 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 444890 INFO (zkConnectionManagerCallback-1417-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 444891 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 444891 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 444899 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 444907 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 444917 INFO (zkConnectionManagerCallback-1419-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 444917 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 444918 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 444928 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml 2> 444939 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/schema15.xml to /configs/conf1/schema.xml 2> 444952 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml 2> 444963 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt 2> 444975 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt 2> 444987 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml 2> 444998 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml 2> 445010 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json 2> 445020 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt 2> 445031 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put /build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt 2> 445042 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer put 
/build/solr/src/solr-9.4.1/solr/core/build/resources/test/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt 2> 445053 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise 2> 448232 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 448233 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 448234 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 448251 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 448256 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@e11bc2c{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 448260 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@239f2e18{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:40873} 2> 448262 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@635ed4f2{STARTING}[10.0.19,sto=0] @448438ms 2> 448264 ERROR (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 448264 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 448265 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 448265 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 448266 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 448266 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:08:21.334696Z 2> 448268 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001 (source: servlet config: solr.solr.home) 2> 448271 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 448283 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 448294 INFO (zkConnectionManagerCallback-1421-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 448295 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 448296 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 448404 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 448405 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001/solr.xml 2> 455957 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.t.SimplePropagator Always-on trace id generation enabled. 2> 456004 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41191/solr 2> 456005 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 456017 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 456028 INFO (zkConnectionManagerCallback-1431-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 456028 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 456029 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 456136 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 
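Editor's note: the entries above show each node opening its own SolrZkClient against the test ZooKeeper at 127.0.0.1:41191 (a random ZkTestServer port for this run only) under the /solr chroot before the CoreContainer loads. Below is a minimal SolrJ sketch of connecting a client to that same ensemble, assuming the CloudHttp2SolrClient builder from SolrJ 9.x; it is not part of the build output.

    import java.util.List;
    import java.util.Optional;
    import org.apache.solr.client.solrj.impl.CloudHttp2SolrClient;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;

    public class ZkConnectSketch {
        public static void main(String[] args) throws Exception {
            // Same ZooKeeper address and /solr chroot as logged by ZkContainer above;
            // 41191 is only valid for this particular test run.
            List<String> zkHosts = List.of("127.0.0.1:41191");
            try (CloudSolrClient client =
                    new CloudHttp2SolrClient.Builder(zkHosts, Optional.of("/solr")).build()) {
                client.connect(); // waits until cluster state is readable from ZooKeeper
                System.out.println("live nodes: " + client.getClusterStateProvider().getLiveNodes());
            }
        }
    }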
2> 456147 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 456156 INFO (zkConnectionManagerCallback-1433-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 456156 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 456308 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 456332 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 456350 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:40873_mr%2Fx 2> 456355 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) starting 2> 456408 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:40873_mr%2Fx 2> 456409 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:40873_mr%2Fx 2> 456419 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 456437 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 457125 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001/cores 2> 457609 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 457825 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/mr/x, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/tempDir-001/control/data, hostPort=40873, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001/cores, replicaType=NRT} 2> 457837 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 457848 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 457857 INFO (zkConnectionManagerCallback-1446-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 457858 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 457858 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 457864 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 457872 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:41191/solr ready 2> 457917 INFO (OverseerThreadFactory-1438-thread-1) [n: c:control_collection s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection 2> 458079 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"control_collection_shard1_replica_n1", 2> "node_name":"127.0.0.1:40873_mr%2Fx", 2> "base_url":"http://127.0.0.1:40873/mr/x", 2> "collection":"control_collection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 458196 INFO (zkCallback-1432-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 153] for collection [control_collection] has occurred - updating... 
(live nodes size: [1]) 2> 458218 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c: s: r: x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=control_collection&version=2&replicaType=NRT&coreNodeName=core_node2&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&shard=shard1&wt=javabin 2> 458366 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 458412 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.s.IndexSchema Schema name=test 2> 459208 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 459732 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 459737 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001/cores/control_collection_shard1_replica_n1/data/] 2> 459765 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 459792 WARN (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 460414 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 460478 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 460479 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 
numVersionBuckets=65536 2> 460501 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 460502 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 460514 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 460535 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 460538 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 460543 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 460548 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735718546210816 2> 460593 INFO (searcherExecutor-1448-thread-1-processing-control_collection_shard1_replica_n1 null-5395 core_node2 127.0.0.1:40873_mr%2Fx control_collection shard1) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 460597 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0} for registerTerm 2> 460598 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1 2> 460630 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
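Editor's note: the control core above comes up with both Hard and Soft AutoCommit disabled, so nothing indexed becomes searchable until a client commits explicitly. A short SolrJ sketch of that pattern follows, assuming an already-built SolrClient; the collection name passed in is illustrative.

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.common.SolrInputDocument;

    public class ExplicitCommitSketch {
        static void indexOne(SolrClient client, String collection) throws Exception {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "1"); // the "test" schema loaded above uses id as its uniqueKey field
            client.add(collection, doc);
            client.commit(collection); // opens a new searcher; without it the document stays invisible
        }
    }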
2> 460630 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 460630 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40873/mr/x/control_collection_shard1_replica_n1/ 2> 460633 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 460635 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.SyncStrategy http://127.0.0.1:40873/mr/x/control_collection_shard1_replica_n1/ has no replicas 2> 460636 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72077818027180036-core_node2-n_0000000000 2> 460657 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40873/mr/x/control_collection_shard1_replica_n1/ shard1 2> 460682 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:null-5395] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 460718 INFO (qtp1805675424-2337) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:null-5395] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2502 2> 460733 INFO (qtp1805675424-2334) [n:127.0.0.1:40873_mr%2Fx c:control_collection s: r: x: t:null-5394] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 460803 INFO (zkCallback-1432-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 177] for collection [control_collection] has occurred - updating... (live nodes size: [1]) 2> 460803 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 177] for collection [control_collection] has occurred - updating... 
(live nodes size: [1]) 2> 460810 INFO (qtp1805675424-2334) [n:127.0.0.1:40873_mr%2Fx c:control_collection s: r: x: t:null-5394] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:40873_mr%252Fx&wt=javabin&version=2} status=0 QTime=2921 2> 460813 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection 2> 460942 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 460950 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 460961 INFO (zkConnectionManagerCallback-1457-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 460962 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 460962 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 460970 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 460979 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:41191/solr ready 2> 460980 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false 2> 461018 INFO (OverseerThreadFactory-1438-thread-2) [n: c:collection1 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1 2> 461022 INFO (OverseerCollectionConfigSetProcessor-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 461252 WARN (OverseerThreadFactory-1438-thread-2) [n: c:collection1 s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores. 2> 461262 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c:collection1 s: r: x: t:null-5396] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. 
Check all shard replicas 2> 461265 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c:collection1 s: r: x: t:null-5396] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&wt=javabin&version=2} status=0 QTime=277 2> 461271 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active slice count: 2 expected: 2 2> 461272 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0 2> 461272 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=4 2> 463038 INFO (OverseerCollectionConfigSetProcessor-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 464371 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001 of type NRT for shard2 2> 464385 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 464386 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 464387 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 464395 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 464400 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@58c53952{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 464405 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@2ff96fc2{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:46725} 2> 464406 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@3fb8e994{STARTING}[10.0.19,sto=0] @464583ms 2> 464408 ERROR (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
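Editor's note: the two /admin/collections CREATE requests logged above (control_collection with one shard, then collection1 with numShards=2 and nrtReplicas=1) are issued by AbstractFullDistribZkTestBase through CloudSolrClient, the same code path as the stack trace at the top of this failure. A hedged SolrJ equivalent using the Collections API helpers is sketched below; it is not taken from the test source.

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;
    import org.apache.solr.client.solrj.response.CollectionAdminResponse;

    public class CreateCollectionSketch {
        static void createCollection1(SolrClient client) throws Exception {
            // conf1 is the configset the test uploaded to ZooKeeper earlier in this log.
            CollectionAdminResponse rsp = CollectionAdminRequest
                    .createCollection("collection1", "conf1", 2 /* shards */, 1 /* NRT replicas */)
                    .process(client);
            if (!rsp.isSuccess()) {
                throw new IllegalStateException("CREATE failed: " + rsp);
            }
        }
    }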
2> 464408 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 464409 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 464409 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 464409 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 464410 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:08:37.478142Z 2> 464411 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001 (source: servlet config: solr.solr.home) 2> 464414 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 464425 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 464437 INFO (zkConnectionManagerCallback-1460-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 464437 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 464438 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 464545 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 464546 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/solr.xml 2> 467625 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001 of type NRT for shard1 2> 467641 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 467641 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
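Editor's note: the jetty instances being started here will host the replicas whose properties TestReplicaProperties manipulates. For context, a sketch of the replica-property Collections APIs the test exercises (ADDREPLICAPROP and BALANCESHARDUNIQUE) via SolrJ; the shard and replica names are placeholders, not values from this run.

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;

    public class ReplicaPropertySketch {
        static void tagPreferredLeader(SolrClient client) throws Exception {
            // ADDREPLICAPROP: attach a property to a single replica
            CollectionAdminRequest
                    .addReplicaProperty("collection1", "shard1", "core_node3", "preferredleader", "true")
                    .process(client);

            // BALANCESHARDUNIQUE: ensure exactly one replica per shard carries the property
            CollectionAdminRequest
                    .balanceReplicaProperty("collection1", "preferredleader")
                    .process(client);
        }
    }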
2> 467642 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 467657 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 467664 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@32f83b4e{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 467668 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@71dbcd31{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:33625} 2> 467671 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@2aba3825{STARTING}[10.0.19,sto=0] @467847ms 2> 467672 ERROR (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 467673 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 467673 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 467673 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 467674 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 467674 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:08:40.742264Z 2> 467675 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001 (source: servlet config: solr.solr.home) 2> 467679 WARN (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 467693 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 467708 INFO (zkConnectionManagerCallback-1465-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 467709 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 467709 WARN (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 467818 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 467819 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/solr.xml 2> 469974 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41191/solr 2> 469976 WARN (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 469991 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 470032 INFO (zkConnectionManagerCallback-1475-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 470032 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 470033 WARN (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 470146 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 470160 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 470173 INFO (zkConnectionManagerCallback-1477-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 470174 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 470225 WARN (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 470234 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1) 2> 470255 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 470281 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33625_mr%2Fx as DOWN 2> 470290 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:33625_mr%2Fx 2> 470299 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) 2> 470302 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(1) -> (2) 2> 470303 INFO (zkCallback-1456-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2) 2> 470319 WARN (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 470875 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41191/solr 2> 470876 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 470892 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 470904 INFO (zkConnectionManagerCallback-1488-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 470906 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 470906 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 471015 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 471030 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 471040 INFO (zkConnectionManagerCallback-1490-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 471040 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 471050 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001 of type NRT for shard2 2> 471098 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 471099 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 471100 WARN (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 471101 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 471108 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(0) -> (2) 2> 471130 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 471134 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 471144 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@47807e89{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 471150 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@69194c1b{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:45309} 2> 471152 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@3ea779b4{STARTING}[10.0.19,sto=0] @471329ms 2> 471154 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:46725_mr%2Fx as DOWN 2> 471155 ERROR (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 471158 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 471158 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 471158 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 471159 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 471159 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:08:44.227739Z 2> 471163 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001 (source: servlet config: solr.solr.home) 2> 471166 WARN (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 471167 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:46725_mr%2Fx 2> 471177 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 471178 INFO (zkCallback-1456-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 471178 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3) 2> 471180 INFO (zkCallback-1489-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(2) -> (3) 2> 471190 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 471197 WARN (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 471202 INFO (zkConnectionManagerCallback-1494-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 471203 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 471203 WARN (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 471311 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 471313 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/solr.xml 2> 471577 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores 2> 472039 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores 2> 472100 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 472322 INFO (closeThreadPool-1458-thread-2) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/mr/x, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/tempDir-001/jetty2, hostPort=33625, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores, replicaType=NRT} 2> 472325 INFO (closeThreadPool-1458-thread-2) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:33625_mr%2Fx 2> 472502 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 473025 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/mr/x, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/tempDir-001/jetty1, hostPort=46725, 
coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores, replicaType=NRT} 2> 473030 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:46725_mr%2Fx 2> 473567 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41191/solr 2> 473568 WARN (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 473581 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 473595 INFO (zkConnectionManagerCallback-1508-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 473596 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 473596 WARN (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 473704 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 473716 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 473725 INFO (zkConnectionManagerCallback-1510-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 473726 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 473773 WARN (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 473780 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3) 2> 473797 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 473812 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:45309_mr%2Fx as DOWN 2> 473821 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:45309_mr%2Fx 2> 473830 INFO (zkCallback-1456-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 473830 INFO (zkCallback-1489-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 473830 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(3) -> (4) 2> 473831 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 473832 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4) 2> 473847 WARN (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 474450 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase create jetty 4 in directory /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001 of type NRT for shard1 2> 474461 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 474461 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 474462 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 474480 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 474484 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@2462f35e{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 474488 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@6454ebad{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:34767} 2> 474488 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@283b567e{STARTING}[10.0.19,sto=0] @474665ms 2> 474490 ERROR (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 474492 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 474493 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 474493 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 474494 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 474494 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:08:47.562212Z 2> 474495 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001 (source: servlet config: solr.solr.home) 2> 474497 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 474510 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 474520 INFO (zkConnectionManagerCallback-1516-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 474520 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 474521 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 474607 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores 2> 474627 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig Loading solr.xml from SolrHome (not found in ZooKeeper) 2> 474628 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Loading solr.xml from /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/solr.xml 2> 475085 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 475322 INFO (closeThreadPool-1458-thread-3) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/mr/x, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/tempDir-001/jetty3, hostPort=45309, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores, replicaType=NRT} 2> 475326 INFO (closeThreadPool-1458-thread-3) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:45309_mr%2Fx 2> 477747 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:41191/solr 2> 477748 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 477759 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 477769 INFO (zkConnectionManagerCallback-1527-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 477769 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 477770 WARN (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. 
DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 477877 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 477887 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 477895 INFO (zkConnectionManagerCallback-1529-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 477896 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 477941 WARN (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 477948 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 477964 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=false. Solr will be using Overseer based cluster state updates. 2> 477978 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:34767_mr%2Fx as DOWN 2> 477984 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:34767_mr%2Fx 2> 477991 INFO (zkCallback-1489-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 477992 INFO (zkCallback-1456-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 477992 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 477992 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 477993 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 477994 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 478008 WARN (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 478649 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores 2> 479413 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 479632 INFO (closeThreadPool-1458-thread-1) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/mr/x, solrconfig=solrconfig.xml, solr.data.dir=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/tempDir-001/jetty4, hostPort=34767, coreRootDirectory=/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/../../../../../../../../../build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores, replicaType=NRT} 2> 479635 INFO (closeThreadPool-1458-thread-1) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:34767_mr%2Fx 2> 479710 INFO (OverseerThreadFactory-1438-thread-3) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:33625_mr%2Fx for creating new replica of shard shard1 for collection collection1 2> 479726 INFO (OverseerThreadFactory-1438-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:45309_mr%2Fx for creating new replica of shard shard2 for collection collection1 2> 479726 INFO (OverseerThreadFactory-1438-thread-3) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 479739 INFO (OverseerThreadFactory-1438-thread-4) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 479741 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard1_replica_n1", 2> "node_name":"127.0.0.1:33625_mr%2Fx", 2> "base_url":"http://127.0.0.1:33625/mr/x", 2> "collection":"collection1", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 479757 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 243] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 479765 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard2_replica_n2", 2> "node_name":"127.0.0.1:45309_mr%2Fx", 2> "base_url":"http://127.0.0.1:45309/mr/x", 2> "collection":"collection1", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 479781 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:collection1_shard1_replica_n1 t:null-5401] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 479879 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 479920 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 247] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 479925 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.s.IndexSchema Schema name=test 2> 479944 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:collection1_shard2_replica_n2 t:null-5402] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection1_shard2_replica_n2&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 480025 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 480063 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.s.IndexSchema Schema name=test 2> 480159 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 480262 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 480319 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true 2> 480327 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores/collection1_shard1_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores/collection1_shard1_replica_n1/data/] 2> 480358 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 
x:collection1_shard1_replica_n1 t:null-5401] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 480411 WARN (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 480431 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_n2' using configuration from configset conf1, trusted=true 2> 480434 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores/collection1_shard2_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores/collection1_shard2_replica_n2/data/] 2> 480457 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 480472 WARN (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 480869 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 480927 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 480928 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 480947 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 480947 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 
480954 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 480961 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 480986 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 480989 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 480994 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 480996 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735739987492864 2> 481014 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 481014 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 481033 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 481033 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 481036 INFO (searcherExecutor-1539-thread-1-processing-collection1_shard1_replica_n1 null-5401 core_node3 127.0.0.1:33625_mr%2Fx collection1 shard1) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 481039 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node3=0}, version=0} for registerTerm 2> 481040 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1 2> 481046 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 
x:collection1_shard2_replica_n2 t:null-5402] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 481059 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 481062 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 481067 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 481068 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735740062990336 2> 481070 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 481070 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 481070 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/ 2> 481073 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 481076 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.SyncStrategy http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/ has no replicas 2> 481076 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72077818027180042-core_node3-n_0000000000 2> 481098 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/ shard1 2> 481111 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node4=0}, version=0} for registerTerm 2> 481111 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 
x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard2 2> 481111 INFO (searcherExecutor-1541-thread-1-processing-collection1_shard2_replica_n2 null-5402 core_node4 127.0.0.1:45309_mr%2Fx collection1 shard2) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 481140 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 481141 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 481141 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/ 2> 481143 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me 2> 481146 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.SyncStrategy http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/ has no replicas 2> 481146 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard2/leader after winning as /collections/collection1/leader_elect/shard2/election/72077818027180047-core_node4-n_0000000000 2> 481166 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/ shard2 2> 481279 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 278] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481279 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 278] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 481288 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5401] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 481288 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5402] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 481317 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5402] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node4&collection.configName=conf1&name=collection1_shard2_replica_n2&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1375 2> 481317 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5401] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1541 2> 481344 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:collection1 s: r: x: t:null-5399] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:33625_mr%252Fx&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1677 2> 481345 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c:collection1 s: r: x: t:null-5398] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:45309_mr%252Fx&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1678 2> 481408 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 288] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481408 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 288] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481409 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 288] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481410 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 288] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481727 INFO (OverseerCollectionConfigSetProcessor-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 481732 INFO (OverseerCollectionConfigSetProcessor-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000005 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 481743 INFO (OverseerThreadFactory-1438-thread-4) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:34767_mr%2Fx for creating new replica of shard shard1 for collection collection1 2> 481745 INFO (OverseerThreadFactory-1438-thread-5) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:46725_mr%2Fx for creating new replica of shard shard2 for collection collection1 2> 481758 INFO (OverseerThreadFactory-1438-thread-4) [n: c:collection1 s:shard1 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 481766 INFO (OverseerThreadFactory-1438-thread-5) [n: c:collection1 s:shard2 r: x: t:] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command. 2> 481773 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard1_replica_n5", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "collection":"collection1", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 481790 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"collection1_shard2_replica_n6", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "collection":"collection1", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"addreplica"} 2> 481905 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 301] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481905 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 301] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481905 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 301] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481905 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 301] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 481905 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 301] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 481931 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:collection1_shard1_replica_n5 t:null-5403] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&name=collection1_shard1_replica_n5&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT 2> 481931 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:collection1_shard2_replica_n6 t:null-5404] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection1_shard2_replica_n6&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT 2> 482009 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 482011 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 482046 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.s.IndexSchema Schema name=test 2> 482047 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.s.IndexSchema Schema name=test 2> 482067 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 305] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 482067 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 305] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 482067 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 305] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 482067 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 305] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 482230 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 482230 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 482383 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n5' using configuration from configset conf1, trusted=true 2> 482386 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores/collection1_shard1_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores/collection1_shard1_replica_n5/data/] 2> 482387 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_n6' using configuration from configset conf1, trusted=true 2> 482390 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores/collection1_shard2_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores/collection1_shard2_replica_n6/data/] 2> 482402 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 482409 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 482416 WARN (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 482421 WARN (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 
x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 482954 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 483022 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 483023 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 483028 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 483042 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 483043 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 483056 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 483069 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 483071 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 483076 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 483077 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735742169579520 2> 483095 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 483095 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 
numVersionBuckets=65536 2> 483114 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 483114 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 483114 INFO (searcherExecutor-1553-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 483116 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node3=0, core_node7=0}, version=1} for registerTerm 2> 483116 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1 2> 483130 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 483141 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.ZkController Core needs to recover:collection1_shard1_replica_n5 2> 483145 INFO (updateExecutor-1523-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.DefaultSolrCoreState Running recovery 2> 483145 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 483149 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 483149 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Starting recovery process. 
recoveringAfterStartup=true 2> 483149 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 483159 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 483160 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735742256611328 2> 483169 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:null-5403] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&name=collection1_shard1_replica_n5&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1240 2> 483188 INFO (qtp1977894741-2397) [n:127.0.0.1:33625_mr%2Fx c:collection1 s: r: x: t:null-5400] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:34767_mr%252Fx&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=3521 2> 483204 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node4=0, core_node8=0}, version=1} for registerTerm 2> 483205 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard2 2> 483206 INFO (searcherExecutor-1555-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 483210 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5405] o.a.s.c.S.Request webapp=/mr/x path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=42 2> 483210 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5405] o.a.s.c.S.Request webapp=/mr/x path=/admin/ping params={wt=javabin&version=2} status=0 QTime=43 2> 483212 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1_shard1_replica_n5] 2> 483215 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null} 2> 483215 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard1_replica_n5] as recovering, leader is [http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/] and I am [http://127.0.0.1:34767/mr/x/collection1_shard1_replica_n5/] 2> 483226 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:33625/mr/x]; [WaitForState: action=PREPRECOVERY&core=collection1_shard1_replica_n1&nodeName=127.0.0.1:34767_mr%252Fx&coreNodeName=core_node7&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 483231 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.ZkController Core needs to recover:collection1_shard2_replica_n6 2> 483233 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:collection1_shard1_replica_n1 t:null-5406] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node7, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 483241 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5406] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:34767_mr%2Fx, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard1_replica_n5", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483241 INFO (updateExecutor-1484-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.DefaultSolrCoreState Running recovery 2> 483242 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5406] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:34767_mr%2Fx, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard1_replica_n5", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483243 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5406] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:34767_mr%2Fx, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard1_replica_n5", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483244 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true 2> 483244 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy startupVersions is empty 2> 483244 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5406] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:34767_mr%2Fx, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard1_replica_n5", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483262 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:null-5404] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node8&collection.configName=conf1&name=collection1_shard2_replica_n6&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1333 2> 483262 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5407] o.a.s.c.S.Request webapp=/mr/x path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=3 2> 483262 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5407] o.a.s.c.S.Request webapp=/mr/x path=/admin/ping params={wt=javabin&version=2} status=0 QTime=3 2> 483264 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1_shard2_replica_n6] 2> 483270 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.UpdateLog Starting to buffer updates. 
FSUpdateLog{state=ACTIVE, tlog=null} 2> 483270 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard2_replica_n6] as recovering, leader is [http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/] and I am [http://127.0.0.1:46725/mr/x/collection1_shard2_replica_n6/] 2> 483278 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:45309/mr/x]; [WaitForState: action=PREPRECOVERY&core=collection1_shard2_replica_n2&nodeName=127.0.0.1:46725_mr%252Fx&coreNodeName=core_node8&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true] 2> 483281 INFO (qtp1977894741-2400) [n:127.0.0.1:33625_mr%2Fx c:collection1 s: r: x: t:null-5397] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:46725_mr%252Fx&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=3614 2> 483283 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:collection1_shard2_replica_n2 t:null-5408] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node8, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true 2> 483283 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 4 active replicas in collection: collection1 2> 483284 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5408] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:46725_mr%2Fx, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard2_replica_n6", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483285 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5408] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:46725_mr%2Fx, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard2_replica_n6", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483286 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5408] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:46725_mr%2Fx, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard2_replica_n6", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483287 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5408] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:46725_mr%2Fx, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard2_replica_n6", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "state":"down", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483389 INFO (zkCallback-1489-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 327] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 483390 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 327] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 483390 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 327] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 483390 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 327] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 483390 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 327] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 483389 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 327] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 483389 INFO (zkCallback-1456-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 327] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 483401 INFO (watches-1478-thread-1) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1_shard1_replica_n1, leaderDoesNotNeedRecovery=false, isLeader? 
true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:34767_mr%2Fx, coreNodeName=core_node7, onlyIfActiveCheckResult=false, nodeProps: core_node7:{ 2> "core":"collection1_shard1_replica_n5", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483401 INFO (watches-1511-thread-1) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n2, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:46725_mr%2Fx, coreNodeName=core_node8, onlyIfActiveCheckResult=false, nodeProps: core_node8:{ 2> "core":"collection1_shard2_replica_n6", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "state":"recovering", 2> "type":"NRT", 2> "force_set_state":"false"} 2> 483402 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5406] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:34767_mr%252Fx&onlyIfLeaderActive=true&core=collection1_shard1_replica_n1&coreNodeName=core_node7&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=172 2> 483402 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5408] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:46725_mr%252Fx&onlyIfLeaderActive=true&core=collection1_shard2_replica_n2&coreNodeName=core_node8&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=120 2> 483749 INFO (OverseerCollectionConfigSetProcessor-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000006 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 483755 INFO (OverseerCollectionConfigSetProcessor-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000007 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 483904 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/] - recoveringAfterStartup=[true] 2> 483904 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/] - recoveringAfterStartup=[true] 2> 483907 WARN (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 483908 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 483908 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 2> 483908 WARN (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.PeerSyncWithLeader no frame of reference to tell if we've missed updates 2> 483908 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/]. 2> 483909 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy PeerSync Recovery was not successful - trying replication. 2> 483910 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Starting Replication Recovery. 
2> 483910 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Attempting to replicate from [http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/]. 2> 483995 INFO (qtp852564526-2474) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5410] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 483996 INFO (qtp852564526-2474) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5410] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 483997 INFO (qtp734457442-2388) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5409] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 483997 INFO (qtp734457442-2388) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5409] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 8 2> 483998 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5410] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 49 2> 483999 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5409] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={update.distrib=TOLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/&commit_end_point=leaders&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 51 2> 484020 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5410] o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 484021 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5410] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:45309/mr/x/collection1_shard2_replica_n2/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 5 2> 484028 INFO (qtp852564526-2469) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5409] 
o.a.s.u.p.DistributedUpdateProcessor Ignoring commit while not ACTIVE - state: BUFFERING replay: false 2> 484029 INFO (qtp852564526-2469) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5409] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=false&commit=true&softCommit=false&distrib.from=http://127.0.0.1:33625/mr/x/collection1_shard1_replica_n1/&commit_end_point=replicas&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 11 2> 484030 INFO (qtp1831257447-2435) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5410] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 116 2> 484030 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5409] o.a.s.u.p.LogUpdateProcessorFactory webapp=/mr/x path=/update params={waitSearcher=true&openSearcher=false&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 118 2> 484041 INFO (qtp1977894741-2400) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:null-5411] o.a.s.c.S.Request webapp=/mr/x path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 484041 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:null-5412] o.a.s.c.S.Request webapp=/mr/x path=/replication params={qt=/replication&wt=javabin&version=2&command=indexversion} status=0 QTime=2 2> 484042 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.h.IndexFetcher Leader's generation: 1 2> 484043 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.h.IndexFetcher Leader's generation: 1 2> 484043 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.h.IndexFetcher Leader's version: 0 2> 484043 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.h.IndexFetcher Leader's version: 0 2> 484043 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.h.IndexFetcher Follower's generation: 1 2> 484043 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.h.IndexFetcher Follower's 
generation: 1 2> 484043 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.h.IndexFetcher Follower's version: 0 2> 484043 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.h.IndexFetcher Follower's version: 0 2> 484043 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy No replay needed. 2> 484044 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy No replay needed. 2> 484050 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 484052 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 484052 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Replication Recovery was successful. 2> 484053 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Registering as Active after recovery. 2> 484057 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 
2> 484057 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735743197184000 2> 484057 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Updating version bucket highest from index after successful recovery. 2> 484058 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735743198232576 2> 484065 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=916.0 2> 484066 INFO (recoveryExecutor-1525-thread-1-processing-collection1_shard1_replica_n5 null-5403 core_node7 127.0.0.1:34767_mr%2Fx collection1 shard1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:null-5403] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=916.0 2> 484066 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Finished recovery process, successful=[true] msTimeTaken=822.0 2> 484067 INFO (recoveryExecutor-1486-thread-1-processing-collection1_shard2_replica_n6 null-5404 core_node8 127.0.0.1:46725_mr%2Fx collection1 shard2) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:null-5404] o.a.s.c.RecoveryStrategy Finished recovery process. recoveringAfterStartup=true msTimeTaken=822.0 2> 484171 INFO (zkCallback-1489-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 335] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 484171 INFO (zkCallback-1456-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 335] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 484172 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 335] for collection [collection1] has occurred - updating... 
(live nodes size: [5]) 2> 484172 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 335] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 484172 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 335] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 484172 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 335] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 484171 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 335] for collection [collection1] has occurred - updating... (live nodes size: [5]) 2> 484194 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Starting test 2> 484209 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 484222 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 484233 INFO (zkConnectionManagerCallback-1571-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 484233 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 484234 WARN (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 484241 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(0) -> (5) 2> 484248 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:41191/solr ready 2> 484277 INFO (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection testcollection 2> 484553 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard2_replica_n1", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "collection":"testcollection", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484571 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard3_replica_n2", 2> "node_name":"127.0.0.1:40873_mr%2Fx", 2> "base_url":"http://127.0.0.1:40873/mr/x", 2> "collection":"testcollection", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484592 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard2_replica_n4", 2> "node_name":"127.0.0.1:45309_mr%2Fx", 2> "base_url":"http://127.0.0.1:45309/mr/x", 2> "collection":"testcollection", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484613 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard3_replica_n6", 2> "node_name":"127.0.0.1:45309_mr%2Fx", 2> "base_url":"http://127.0.0.1:45309/mr/x", 2> "collection":"testcollection", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484634 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard1_replica_n8", 2> "node_name":"127.0.0.1:33625_mr%2Fx", 2> "base_url":"http://127.0.0.1:33625/mr/x", 2> "collection":"testcollection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484655 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard1_replica_n10", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "collection":"testcollection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484677 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard2_replica_n12", 2> "node_name":"127.0.0.1:33625_mr%2Fx", 2> "base_url":"http://127.0.0.1:33625/mr/x", 2> "collection":"testcollection", 2> "shard":"shard2", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484698 
INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard3_replica_n14", 2> "node_name":"127.0.0.1:46725_mr%2Fx", 2> "base_url":"http://127.0.0.1:46725/mr/x", 2> "collection":"testcollection", 2> "shard":"shard3", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484717 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"testcollection_shard1_replica_n16", 2> "node_name":"127.0.0.1:34767_mr%2Fx", 2> "base_url":"http://127.0.0.1:34767/mr/x", 2> "collection":"testcollection", 2> "shard":"shard1", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 484834 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 373] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 484846 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node3&name=testcollection_shard2_replica_n1&action=CREATE&numShards=3&shard=shard2&wt=javabin 2> 484849 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node7&name=testcollection_shard2_replica_n4&action=CREATE&numShards=3&shard=shard2&wt=javabin 2> 484851 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node9&name=testcollection_shard3_replica_n6&action=CREATE&numShards=3&shard=shard3&wt=javabin 2> 484853 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node11&name=testcollection_shard1_replica_n8&action=CREATE&numShards=3&shard=shard1&wt=javabin 2> 484867 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node13&name=testcollection_shard1_replica_n10&action=CREATE&numShards=3&shard=shard1&wt=javabin 2> 484869 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.h.a.CoreAdminOperation core create command 
qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node15&name=testcollection_shard2_replica_n12&action=CREATE&numShards=3&shard=shard2&wt=javabin 2> 503912 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node17&name=testcollection_shard3_replica_n14&action=CREATE&numShards=3&shard=shard3&wt=javabin 2> 504012 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c: s: r: x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node5&name=testcollection_shard3_replica_n2&action=CREATE&numShards=3&shard=shard3&wt=javabin 2> 504130 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504125 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504126 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504131 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504131 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504133 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504129 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504128 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504145 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&collection=testcollection&version=2&replicaType=NRT&coreNodeName=core_node18&name=testcollection_shard1_replica_n16&action=CREATE&numShards=3&shard=shard1&wt=javabin 2> 504059 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:33625/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at 
org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 504170 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.s.IndexSchema Schema name=test 2> 504179 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.s.IndexSchema Schema name=test 2> 504179 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.s.IndexSchema Schema name=test 2> 504179 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.s.IndexSchema Schema name=test 2> 504184 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.s.IndexSchema Schema name=test 2> 504170 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:33625/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 
2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 11 more 2> 504186 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:45309/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 504199 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:45309/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 504218 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.s.IndexSchema Schema name=test 2> 504218 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.s.IndexSchema Schema name=test 2> 504226 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.s.IndexSchema Schema name=test 2> 504218 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:46725/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 504234 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:46725/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 504248 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:34767/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 504278 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 504268 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Error from shard: http://127.0.0.1:40873/mr/x 2> => org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) 2> org.apache.solr.client.solrj.SolrServerException: Total timeout 15000 ms elapsed 2> at org.apache.solr.client.solrj.impl.Http2SolrClient$1.onFailure(Http2SolrClient.java:506) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:192) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.ResponseNotifier.notifyFailure(ResponseNotifier.java:184) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpReceiver.abort(HttpReceiver.java:559) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abortResponse(HttpChannel.java:154) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpChannel.abort(HttpChannel.java:147) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpExchange.abort(HttpExchange.java:273) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConversation.abort(HttpConversation.java:159) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpRequest.abort(HttpRequest.java:925) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:342) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.client.HttpConnection$RequestTimeouts.onExpired(HttpConnection.java:322) ~[jetty-client-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts.onTimeoutExpired(CyclicTimeouts.java:110) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeouts$Timeouts.onTimeoutExpired(CyclicTimeouts.java:197) ~[jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.CyclicTimeout$Wakeup.run(CyclicTimeout.java:294) ~[jetty-io-10.0.19.jar:10.0.19] 2> at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] 2> at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] 2> at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.util.concurrent.TimeoutException: Total timeout 15000 ms elapsed 2> ... 
11 more 2> 504300 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.s.IndexSchema Schema name=test 2> 504603 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504607 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504612 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504612 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504620 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504622 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504622 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504623 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504623 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id 2> 504968 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard1_replica_n8' using configuration from configset conf1, trusted=true 2> 504968 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard3_replica_n2' using configuration from configset conf1, trusted=true 2> 504971 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard1_replica_n16' using configuration from configset conf1, trusted=true 2> 504972 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard3_replica_n14' using configuration from configset conf1, trusted=true 2> 504977 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.SolrCore Opening new SolrCore at 
[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores/testcollection_shard1_replica_n16], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores/testcollection_shard1_replica_n16/data/] 2> 504977 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores/testcollection_shard1_replica_n8], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores/testcollection_shard1_replica_n8/data/] 2> 504977 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001/cores/testcollection_shard3_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/control-001/cores/testcollection_shard3_replica_n2/data/] 2> 504977 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores/testcollection_shard3_replica_n14], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores/testcollection_shard3_replica_n14/data/] 2> 504979 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard2_replica_n4' using configuration from configset conf1, trusted=true 2> 504979 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard1_replica_n10' using configuration from configset conf1, trusted=true 2> 504981 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard2_replica_n12' using configuration from configset conf1, trusted=true 2> 504981 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard2_replica_n1' using configuration from configset conf1, trusted=true 2> 504983 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.SolrCore Opening new SolrCore at 
[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores/testcollection_shard2_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores/testcollection_shard2_replica_n4/data/] 2> 504983 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores/testcollection_shard1_replica_n10], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-1-001/cores/testcollection_shard1_replica_n10/data/] 2> 504984 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.CoreContainer Creating SolrCore 'testcollection_shard3_replica_n6' using configuration from configset conf1, trusted=true 2> 504985 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores/testcollection_shard2_replica_n12], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-2-001/cores/testcollection_shard2_replica_n12/data/] 2> 504986 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores/testcollection_shard2_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-4-001/cores/testcollection_shard2_replica_n1/data/] 2> 504990 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores/testcollection_shard3_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001/shard-3-001/cores/testcollection_shard3_replica_n6/data/] 2> 505044 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 505044 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 
x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=14, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.36167167121278376] 2> 505045 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 505048 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=14, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.36167167121278376] 2> 505050 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 505050 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=14, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.36167167121278376] 2> 505051 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 505054 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 505054 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 
x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=1.0] 2> 505112 WARN (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505113 WARN (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505112 WARN (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505113 WARN (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505113 WARN (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505111 WARN (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505112 WARN (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505112 WARN (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505111 WARN (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.RequestHandlers INVALID paramSet a in 
requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}} 2> 505570 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 505570 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 505570 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 505571 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 505571 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 505571 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 505571 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 505571 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 505572 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 505573 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 505579 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 505579 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 505579 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.u.UpdateHandler Using UpdateLog implementation: 
org.apache.solr.update.UpdateLog 2> 505579 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 505582 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 505583 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 506130 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506130 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506131 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506131 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506132 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506132 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506132 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506132 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506132 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506133 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506133 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 506134 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 506136 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506136 
INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506137 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506137 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506139 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506139 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506153 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=45, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0] 2> 506153 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=45, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0] 2> 506157 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 506157 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=45, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.0] 2> 506160 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, 
deletesPctAllowed=40.30256013139616 2> 506161 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 506163 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.u.CommitTracker Hard AutoCommit: disabled 2> 506163 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 506163 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 506163 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.u.CommitTracker Soft AutoCommit: disabled 2> 506181 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=50, maxMergedSegmentMB=16.045801162719727, floorSegmentMB=0.9055280685424805, forceMergeDeletesPctAllowed=17.089761324479934, segmentsPerTier=27.0, maxCFSSegmentSizeMB=8.796093022208E12, noCFSRatio=0.3629416153254694, deletesPctAllowed=40.30256013139616 2> 506189 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506189 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506189 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506189 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: 
/configs/conf1 2> 506190 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506191 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506193 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506193 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506194 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1 2> 506195 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506195 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506195 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506195 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506196 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506196 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506196 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506197 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506197 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.r.ManagedResourceStorage Loaded null at path 
_rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1 2> 506205 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506205 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506205 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506205 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506205 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506205 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506205 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506206 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506207 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 506207 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506207 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506207 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506207 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506207 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506207 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 
506207 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506207 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506207 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788735766423142400 2> 506325 INFO (searcherExecutor-1579-thread-1-processing-testcollection_shard3_replica_n14 null-5420 core_node17 127.0.0.1:46725_mr%2Fx testcollection shard3) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506326 INFO (searcherExecutor-1575-thread-1-processing-testcollection_shard1_replica_n8 null-5417 core_node11 127.0.0.1:33625_mr%2Fx testcollection shard1) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506327 INFO (searcherExecutor-1577-thread-1-processing-testcollection_shard1_replica_n16 null-5422 core_node18 127.0.0.1:34767_mr%2Fx testcollection shard1) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506327 INFO (searcherExecutor-1587-thread-1-processing-testcollection_shard2_replica_n1 null-5414 core_node3 127.0.0.1:34767_mr%2Fx testcollection shard2) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506328 INFO (searcherExecutor-1573-thread-1-processing-testcollection_shard3_replica_n2 null-5421 core_node5 127.0.0.1:40873_mr%2Fx testcollection shard3) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506328 INFO (searcherExecutor-1581-thread-1-processing-testcollection_shard2_replica_n4 null-5415 core_node7 127.0.0.1:45309_mr%2Fx testcollection shard2) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506331 INFO (searcherExecutor-1583-thread-1-processing-testcollection_shard1_replica_n10 null-5418 core_node13 127.0.0.1:46725_mr%2Fx testcollection shard1) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506332 INFO (searcherExecutor-1585-thread-1-processing-testcollection_shard2_replica_n12 null-5419 core_node15 127.0.0.1:33625_mr%2Fx testcollection shard2) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506333 INFO (searcherExecutor-1589-thread-1-processing-testcollection_shard3_replica_n6 null-5416 
core_node9 127.0.0.1:45309_mr%2Fx testcollection shard3) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 506341 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 506342 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 506342 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 506340 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard3 to Terms{values={core_node5=0}, version=0} for registerTerm 2> 506341 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard1 to Terms{values={core_node11=0}, version=0} for registerTerm 2> 506350 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard3 2> 506350 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard1 2> 506357 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard1 to Terms{values={core_node18=0, core_node11=0}, version=1} for registerTerm 2> 506358 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard1 2> 506358 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard3 to Terms{values={core_node5=0, core_node9=0}, version=1} for registerTerm 2> 506358 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 506362 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard3 2> 506375 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 
t:null-5420] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard3 to Terms{values={core_node17=0, core_node5=0, core_node9=0}, version=2} for registerTerm 2> 506375 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard3 2> 506387 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard2 to Terms{values={core_node15=0}, version=0} for registerTerm 2> 506387 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard2 2> 506395 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 506400 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard2 to Terms{values={core_node7=0, core_node15=0}, version=1} for registerTerm 2> 506401 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard2 2> 506404 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 506409 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard1 to Terms{values={core_node18=0, core_node13=0, core_node11=0}, version=2} for registerTerm 2> 506415 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard1 2> 506421 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard2 to Terms{values={core_node7=0, core_node3=0, core_node15=0}, version=2} for registerTerm 2> 506426 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/testcollection/leaders/shard2 2> 506440 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 506440 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 506441 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40873/mr/x/testcollection_shard3_replica_n2/ 2> 506443 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 506443 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 506443 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:34767/mr/x/testcollection_shard1_replica_n16/ 2> 506450 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.PeerSync PeerSync: core=testcollection_shard3_replica_n2 url=http://127.0.0.1:40873/mr/x START replicas=[http://127.0.0.1:45309/mr/x/testcollection_shard3_replica_n6/, http://127.0.0.1:46725/mr/x/testcollection_shard3_replica_n14/] nUpdates=100 2> 506451 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.PeerSync PeerSync: core=testcollection_shard1_replica_n16 url=http://127.0.0.1:34767/mr/x START replicas=[http://127.0.0.1:33625/mr/x/testcollection_shard1_replica_n8/, http://127.0.0.1:46725/mr/x/testcollection_shard1_replica_n10/] nUpdates=100 2> 506460 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.u.PeerSync PeerSync: core=testcollection_shard3_replica_n2 url=http://127.0.0.1:40873/mr/x DONE. We have no versions. sync failed. 2> 506464 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 506464 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 506464 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:33625/mr/x/testcollection_shard2_replica_n12/ 2> 506466 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.u.PeerSync PeerSync: core=testcollection_shard1_replica_n16 url=http://127.0.0.1:34767/mr/x DONE. We have no versions. sync failed. 
2> 506471 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.PeerSync PeerSync: core=testcollection_shard2_replica_n12 url=http://127.0.0.1:33625/mr/x START replicas=[http://127.0.0.1:34767/mr/x/testcollection_shard2_replica_n1/, http://127.0.0.1:45309/mr/x/testcollection_shard2_replica_n4/] nUpdates=100 2> 506479 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.u.PeerSync PeerSync: core=testcollection_shard2_replica_n12 url=http://127.0.0.1:33625/mr/x DONE. We have no versions. sync failed. 2> 506488 INFO (qtp734457442-2388) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5421] o.a.s.c.S.Request webapp=/mr/x path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=21 2> 506489 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5421] o.a.s.c.S.Request webapp=/mr/x path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=22 2> 506498 INFO (qtp1831257447-2435) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5419] o.a.s.c.S.Request webapp=/mr/x path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=3 2> 506498 INFO (qtp734457442-2387) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5422] o.a.s.c.S.Request webapp=/mr/x path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=3 2> 506499 INFO (qtp1977894741-2397) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5422] o.a.s.c.S.Request webapp=/mr/x path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=3 2> 506499 INFO (qtp852564526-2469) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5419] o.a.s.c.S.Request webapp=/mr/x path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=3 2> 506513 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 506513 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 506514 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 506514 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 506514 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx 
c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 506514 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 506514 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/testcollection/leaders/shard2/leader after winning as /collections/testcollection/leader_elect/shard2/election/72077818027180042-core_node15-n_0000000000 2> 506514 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/testcollection/leaders/shard3/leader after winning as /collections/testcollection/leader_elect/shard3/election/72077818027180036-core_node5-n_0000000000 2> 506515 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/testcollection/leaders/shard1/leader after winning as /collections/testcollection/leader_elect/shard1/election/72077818027180050-core_node18-n_0000000000 2> 506537 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:33625/mr/x/testcollection_shard2_replica_n12/ shard2 2> 506541 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:34767/mr/x/testcollection_shard1_replica_n16/ shard1 2> 506543 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40873/mr/x/testcollection_shard3_replica_n2/ shard3 2> 506665 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 488] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 506665 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 488] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 506665 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 488] for collection [testcollection] has occurred - updating... 
(live nodes size: [5]) 2> 506681 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard1 r:core_node18 x:testcollection_shard1_replica_n16 t:null-5422] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 506681 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard2 r:core_node15 x:testcollection_shard2_replica_n12 t:null-5419] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 506682 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c:testcollection s:shard3 r:core_node5 x:testcollection_shard3_replica_n2 t:null-5421] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 506738 INFO (qtp1977894741-2396) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5419] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node15&collection.configName=conf1&newCollection=true&name=testcollection_shard2_replica_n12&action=CREATE&numShards=3&collection=testcollection&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=21872 2> 506739 INFO (qtp852564526-2471) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:null-5422] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node18&collection.configName=conf1&newCollection=true&name=testcollection_shard1_replica_n16&action=CREATE&numShards=3&collection=testcollection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2618 2> 506742 INFO (qtp1805675424-2333) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:null-5421] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node5&collection.configName=conf1&newCollection=true&name=testcollection_shard3_replica_n2&action=CREATE&numShards=3&collection=testcollection&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=21872 2> 506743 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Cleaning up collection [testcollection]. 
2> 506755 INFO (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Executing Collection Cmd=action=UNLOAD&deleteInstanceDir=true&deleteDataDir=true, asyncId=null 2> 506764 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard1_replica_n8 t:null-5423] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard1.replica_n8 tag=null 2> 506765 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard1_replica_n10 t:null-5424] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard1.replica_n10 tag=null 2> 506767 INFO (qtp852564526-2472) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard1_replica_n16 t:null-5425] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard1.replica_n16 tag=null 2> 506771 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard2_replica_n4 t:null-5427] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard2.replica_n4 tag=null 2> 506777 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c: s: r: x:testcollection_shard3_replica_n2 t:null-5429] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard3.replica_n2 tag=null 2> 506836 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 493] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 506836 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 493] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 506836 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 493] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 506836 INFO (zkCallback-1528-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 493] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 506836 INFO (zkCallback-1432-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 493] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 506836 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/testcollection/state.json zxid: 493] for collection [testcollection] has occurred - updating... 
(live nodes size: [5]) 2> 507214 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard1_replica_n10 t:null-5424] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@e07a1c2 testcollection_shard1_replica_n10 2> 507205 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard3_replica_n14 t:null-5431] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard3.replica_n14 tag=null 2> 507219 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard2_replica_n12 t:null-5428] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard2.replica_n12 tag=null 2> 507223 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard1_replica_n8 t:null-5423] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@3c3f52f1 testcollection_shard1_replica_n8 2> 507230 INFO (qtp852564526-2472) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard1_replica_n16 t:null-5425] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@65e58f11 testcollection_shard1_replica_n16 2> 507227 INFO (qtp852564526-2473) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard2_replica_n1 t:null-5426] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard2.replica_n1 tag=null 2> 507244 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c: s: r: x:testcollection_shard3_replica_n2 t:null-5429] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@6f7d7345 testcollection_shard3_replica_n2 2> 507246 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard3_replica_n6 t:null-5430] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard3.replica_n6 tag=null 2> 507250 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c: s: r: x:testcollection_shard3_replica_n2 t:null-5429] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard3.replica_n2 tag=SolrCore@6f7d7345 2> 507250 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard2_replica_n4 t:null-5427] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5e7f3dd testcollection_shard2_replica_n4 2> 507252 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c: s: r: x:testcollection_shard3_replica_n2 t:null-5429] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard3.leader tag=SolrCore@6f7d7345 2> 507260 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c: s: r: x:testcollection_shard3_replica_n2 t:null-5429] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 507315 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:null-5429] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard3 to Terms{values={core_node17=0, core_node9=0}, version=3} for removeTerm 2> 507339 INFO (qtp1805675424-2336) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:null-5429] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard3_replica_n2&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=567 2> 507425 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard1 to Terms{values={core_node18=0, core_node13=0}, version=3} for removeTerm 2> 507435 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 507436 ERROR (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c:testcollection s:shard1 r:core_node11 x:testcollection_shard1_replica_n8 t:null-5417] o.a.s.c.ZkContainer Exception registering core testcollection_shard1_replica_n8 2> => org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) 2> org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 
2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) 
[jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 507511 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard2 to Terms{values={core_node7=0, core_node15=0}, version=3} for removeTerm 2> 507512 INFO (qtp1977894741-2398) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5417] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node11&collection.configName=conf1&newCollection=true&name=testcollection_shard1_replica_n8&action=CREATE&numShards=3&collection=testcollection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=22661 2> 507512 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard1 to Terms{values={core_node18=0}, version=4} for removeTerm 2> 507514 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 507531 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 507532 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 507535 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard2 to Terms{values={core_node15=0}, version=4} for removeTerm 2> 507547 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 507532 ERROR (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c:testcollection s:shard2 r:core_node3 x:testcollection_shard2_replica_n1 t:null-5414] o.a.s.c.ZkContainer Exception registering core testcollection_shard2_replica_n1 2> => org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) 2> org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 
2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 507531 ERROR (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard1 r:core_node13 x:testcollection_shard1_replica_n10 t:null-5418] o.a.s.c.ZkContainer Exception registering core testcollection_shard1_replica_n10 2> => org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) 2> org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 
2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) 
[jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 507569 INFO (qtp852564526-2470) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:null-5414] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node3&collection.configName=conf1&newCollection=true&name=testcollection_shard2_replica_n1&action=CREATE&numShards=3&collection=testcollection&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=22725 2> 507570 INFO (qtp734457442-2384) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:null-5418] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node13&collection.configName=conf1&newCollection=true&name=testcollection_shard1_replica_n10&action=CREATE&numShards=3&collection=testcollection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=22717 2> 507547 ERROR (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard2 r:core_node7 x:testcollection_shard2_replica_n4 t:null-5415] o.a.s.c.ZkContainer Exception registering core testcollection_shard2_replica_n4 2> => org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) 2> org.apache.solr.common.SolrException: SolrCore is no longer available to register 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1351) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 
2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 507579 INFO (qtp1831257447-2437) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5415] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node7&collection.configName=conf1&newCollection=true&name=testcollection_shard2_replica_n4&action=CREATE&numShards=3&collection=testcollection&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=22732 2> 507698 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard1_replica_n8 t:null-5423] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard1.replica_n8 tag=SolrCore@3c3f52f1 2> 507701 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard2_replica_n12 t:null-5428] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@62a47336 testcollection_shard2_replica_n12 2> 507702 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard1_replica_n8 t:null-5423] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard1.leader tag=SolrCore@3c3f52f1 2> 507702 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard2_replica_n12 t:null-5428] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard2.replica_n12 tag=SolrCore@62a47336 2> 507703 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard2_replica_n12 t:null-5428] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard2.leader tag=SolrCore@62a47336 2> 507707 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard1_replica_n8 t:null-5423] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 507710 INFO (qtp852564526-2473) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard2_replica_n1 t:null-5426] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@6861827e testcollection_shard2_replica_n1 2> 507707 INFO (qtp852564526-2472) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard1_replica_n16 t:null-5425] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard1.replica_n16 tag=SolrCore@65e58f11 2> 507711 INFO (qtp852564526-2472) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard1_replica_n16 t:null-5425] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard1.leader tag=SolrCore@65e58f11 2> 507715 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c: s: r: x:testcollection_shard2_replica_n12 t:null-5428] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 507715 INFO (qtp852564526-2473) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard2_replica_n1 t:null-5426] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard2.replica_n1 tag=SolrCore@6861827e 2> 507715 INFO (qtp852564526-2473) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard2_replica_n1 t:null-5426] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard2.leader tag=SolrCore@6861827e 2> 507744 INFO (qtp852564526-2472) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard1_replica_n16 t:null-5425] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 507765 INFO (qtp852564526-2473) [n:127.0.0.1:34767_mr%2Fx c: s: r: x:testcollection_shard2_replica_n1 t:null-5426] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 507788 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard3_replica_n14 t:null-5431] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@4589ed21 testcollection_shard3_replica_n14 2> 507784 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard1_replica_n10 t:null-5424] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard1.replica_n10 tag=SolrCore@e07a1c2 2> 507790 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard1_replica_n10 t:null-5424] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard1.leader tag=SolrCore@e07a1c2 2> 507789 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard2_replica_n4 t:null-5427] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard2.replica_n4 tag=SolrCore@5e7f3dd 2> 507790 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard3_replica_n14 t:null-5431] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard3.replica_n14 tag=SolrCore@4589ed21 2> 507790 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard2_replica_n4 t:null-5427] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard2.leader tag=SolrCore@5e7f3dd 2> 507790 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard3_replica_n14 t:null-5431] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard3.leader tag=SolrCore@4589ed21 2> 507801 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard2_replica_n4 t:null-5427] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 507798 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard3_replica_n14 t:null-5431] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 507797 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard3_replica_n6 t:null-5430] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5bdb473e testcollection_shard3_replica_n6 2> 507823 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard3_replica_n6 t:null-5430] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.testcollection.shard3.replica_n6 tag=SolrCore@5bdb473e 2> 507841 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c: s: r: x:testcollection_shard1_replica_n10 t:null-5424] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 507841 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard3_replica_n6 t:null-5430] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.testcollection.shard3.leader tag=SolrCore@5bdb473e 2> 507861 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x:testcollection_shard3_replica_n6 t:null-5430] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 507882 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5428] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard2 to Terms{values={}, version=5} for removeTerm 2> 507883 INFO (qtp852564526-2472) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:null-5425] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard1 to Terms{values={}, version=5} for removeTerm 2> 507891 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:null-5431] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard3 to Terms{values={core_node9=0}, version=4} for removeTerm 2> 507899 WARN (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 507899 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:null-5431] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 
2> 507900 WARN (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 507901 WARN (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 507902 WARN (zkCallback-1528-thread-2) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 507904 INFO (qtp852564526-2472) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:null-5425] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard1_replica_n16&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1140 2> 507904 INFO (qtp734457442-2386) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:null-5431] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard3_replica_n14&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1128 2> 507907 INFO (qtp1977894741-2401) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5428] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard2_replica_n12&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1138 2> 507916 INFO (qtp1977894741-2399) [n:127.0.0.1:33625_mr%2Fx c: s: r: x: t:null-5423] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard1_replica_n8&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1155 2> 507917 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5430] o.a.s.c.ZkShardTerms Successful update of terms at /collections/testcollection/terms/shard3 to Terms{values={}, version=5} for removeTerm 2> 507920 INFO (qtp1831257447-2439) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5427] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard2_replica_n4&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1152 2> 507921 INFO (qtp852564526-2473) [n:127.0.0.1:34767_mr%2Fx c: s: r: x: t:null-5426] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard2_replica_n1&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1155 2> 507926 INFO (qtp734457442-2383) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:null-5424] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard1_replica_n10&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1164 2> 507926 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5430] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 
2> 507929 WARN (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 507930 INFO (qtp1831257447-2440) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5430] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=testcollection_shard3_replica_n6&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=1156 2> 508064 INFO (zkCallback-1432-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDeleted path:/collections/testcollection/state.json zxid: 545] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 508064 INFO (zkCallback-1432-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDeleted path:/collections/testcollection/state.json zxid: 545] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 508064 INFO (zkCallback-1432-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDeleted path:/collections/testcollection/state.json zxid: 545] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 508065 INFO (zkCallback-1432-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDeleted path:/collections/testcollection/state.json zxid: 545] for collection [testcollection] has occurred - updating... (live nodes size: [5]) 2> 508186 INFO (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Cleaned up artifacts for failed create collection for [testcollection] 2> 508187 ERROR (OverseerThreadFactory-1438-thread-5) [n: c:testcollection s: r: x: t:] o.a.s.c.a.c.OverseerCollectionMessageHandler Collection testcollection}, operation create failed 2> => org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: testcollection 2> at org.apache.solr.cloud.api.collections.CreateCollectionCmd.call(CreateCollectionCmd.java:447) 2> org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: testcollection 2> at org.apache.solr.cloud.api.collections.CreateCollectionCmd.call(CreateCollectionCmd.java:447) ~[main/:?] 2> at org.apache.solr.cloud.api.collections.OverseerCollectionMessageHandler.processMessage(OverseerCollectionMessageHandler.java:129) [main/:?] 2> at org.apache.solr.cloud.OverseerTaskProcessor$Runner.run(OverseerTaskProcessor.java:564) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:294) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?] 2> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 
2> 508210 ERROR (qtp1831257447-2436) [n:127.0.0.1:45309_mr%2Fx c:testcollection s: r: x: t:null-5413] o.a.s.h.RequestHandlerBase Client exception 2> => org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: testcollection 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) 2> org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: testcollection 2> at org.apache.solr.client.solrj.SolrResponse.getException(SolrResponse.java:56) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.admin.api.CreateCollectionAPI.createCollection(CreateCollectionAPI.java:140) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.lambda$static$0(CollectionsHandler.java:513) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler$CollectionOperation.execute(CollectionsHandler.java:1265) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.invokeAction(CollectionsHandler.java:315) ~[main/:?] 2> at org.apache.solr.handler.admin.CollectionsHandler.handleRequestBody(CollectionsHandler.java:293) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:822) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.SelectableChannelEndPoint$1.run(SelectableChannelEndPoint.java:53) [jetty-io-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 508224 INFO (qtp1831257447-2436) [n:127.0.0.1:45309_mr%2Fx c:testcollection s: r: x: t:null-5413] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={pullReplicas=0&collection.configName=conf1&nrtReplicas=3&name=testcollection&action=CREATE&numShards=3&tlogReplicas=0&wt=javabin&version=2} status=400 QTime=23967 2> 508354 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 ###Ending test 2> 508487 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@635ed4f2{STOPPING}[10.0.19,sto=0] 2> 508488 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@2aba3825{STOPPING}[10.0.19,sto=0] 2> 508489 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@3ea779b4{STOPPING}[10.0.19,sto=0] 2> 508490 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@3fb8e994{STOPPING}[10.0.19,sto=0] 2> 508490 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@283b567e{STOPPING}[10.0.19,sto=0] 2> 508497 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@239f2e18{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 508501 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@2ff96fc2{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 508501 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@71dbcd31{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 508501 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@6454ebad{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 508503 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@58c53952{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 508503 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@32f83b4e{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 508503 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@2462f35e{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 508503 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@e11bc2c{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 508505 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@69194c1b{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:0} 2> 508507 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@47807e89{/mr/x,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 508537 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] 
o.a.s.c.CoreContainer Shutting down CoreContainer instance=1818086018 2> 508537 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1254902714 2> 508537 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:33625_mr%2Fx 2> 508537 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:34767_mr%2Fx 2> 508542 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 508542 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:33625_mr%2Fx as DOWN 2> 508543 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 508543 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:34767_mr%2Fx as DOWN 2> 508545 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (3) 2> 508545 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (3) 2> 508546 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (3) 2> 508546 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (3) 2> 508546 INFO (zkCallback-1432-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (3) 2> 508565 INFO (coreCloseExecutor-1632-thread-1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@53cbed62 collection1_shard1_replica_n5 2> 508566 INFO (coreCloseExecutor-1632-thread-1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n5 tag=SolrCore@53cbed62 2> 508566 INFO (coreCloseExecutor-1633-thread-1) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@173a3912 collection1_shard1_replica_n1 2> 508567 INFO (coreCloseExecutor-1633-thread-1) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@173a3912 2> 508680 INFO (zkCallback-1476-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 573] for collection [collection1] has occurred - updating... (live nodes size: [3]) 2> 508681 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 573] for collection [collection1] has occurred - updating... 
(live nodes size: [3]) 2> 508681 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 573] for collection [collection1] has occurred - updating... (live nodes size: [3]) 2> 508681 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 573] for collection [collection1] has occurred - updating... (live nodes size: [3]) 2> 508680 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 573] for collection [collection1] has occurred - updating... (live nodes size: [3]) 2> 508680 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 573] for collection [collection1] has occurred - updating... (live nodes size: [3]) 2> 508771 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1680609994 2> 508772 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:40873_mr%2Fx 2> 508777 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 508777 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:40873_mr%2Fx as DOWN 2> 508780 INFO (zkCallback-1476-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (2) 2> 508780 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (2) 2> 508781 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (2) 2> 508781 INFO (zkCallback-1528-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (2) 2> 508781 INFO (zkCallback-1432-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (2) 2> 508799 INFO (coreCloseExecutor-1636-thread-1) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@5c31184 control_collection_shard1_replica_n1 2> 508800 INFO (coreCloseExecutor-1636-thread-1) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@5c31184 2> 508908 INFO (zkCallback-1432-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 577] for collection [control_collection] has occurred - updating... 
(live nodes size: [2]) 2> 508908 INFO (zkCallback-1432-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json zxid: 577] for collection [control_collection] has occurred - updating... (live nodes size: [2]) 2> 508911 INFO (coreCloseExecutor-1633-thread-1) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@173a3912 2> 508918 INFO (coreCloseExecutor-1633-thread-1) [n:127.0.0.1:33625_mr%2Fx c:collection1 s:shard1 r:core_node3 x:collection1_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 508934 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 508951 INFO (coreCloseExecutor-1632-thread-1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@53cbed62 2> 508959 INFO (coreCloseExecutor-1632-thread-1) [n:127.0.0.1:34767_mr%2Fx c:collection1 s:shard1 r:core_node7 x:collection1_shard1_replica_n5 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 508972 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 509088 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 509165 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 509166 INFO (closeThreadPool-1627-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 509198 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 509223 INFO (coreCloseExecutor-1636-thread-1) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@5c31184 2> 509232 INFO (coreCloseExecutor-1636-thread-1) [n:127.0.0.1:40873_mr%2Fx c:control_collection s:shard1 r:core_node2 x:control_collection_shard1_replica_n1 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 509255 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 509297 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 509304 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 509305 INFO (closeThreadPool-1627-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 509315 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 509316 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 509330 INFO (closeThreadPool-1639-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) closing 2> 509332 INFO (OverseerStateUpdate-72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) [n:127.0.0.1:40873_mr%2Fx c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:40873_mr%2Fx 2> 509351 INFO (closeThreadPool-1639-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) closing 2> 509358 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:46725_mr%2Fx 2> 509364 INFO (zkCallback-1489-thread-2) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180044-127.0.0.1:46725_mr%2Fx-n_0000000002) starting 2> 509401 INFO (OverseerStateUpdate-72077818027180044-127.0.0.1:46725_mr%2Fx-n_0000000002) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:46725_mr%2Fx 2> 509458 INFO (closeThreadPool-1627-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000) closing 2> 511005 ERROR (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.ZkController Error getting leader from zk 2> => java.lang.InterruptedException: sleep interrupted 2> at java.base/java.lang.Thread.sleep(Native Method) 2> java.lang.InterruptedException: sleep interrupted 2> at java.base/java.lang.Thread.sleep(Native Method) ~[?:?] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1565) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1531) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1473) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1335) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 
2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) 
[jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 511009 ERROR (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.ZkController Error getting leader from zk 2> => java.lang.InterruptedException: sleep interrupted 2> at java.base/java.lang.Thread.sleep(Native Method) 2> java.lang.InterruptedException: sleep interrupted 2> at java.base/java.lang.Thread.sleep(Native Method) ~[?:?] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1565) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1531) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1473) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1335) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) [main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 
2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 511020 ERROR (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c:testcollection s:shard3 r:core_node17 x:testcollection_shard3_replica_n14 t:null-5420] o.a.s.c.ZkContainer Exception registering core testcollection_shard3_replica_n14 2> => org.apache.solr.common.SolrException: Error getting leader from zk for shard shard3 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1516) 2> org.apache.solr.common.SolrException: Error getting leader from zk for shard shard3 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1516) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1335) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 
2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.lang.InterruptedException: sleep interrupted 2> at java.base/java.lang.Thread.sleep(Native Method) ~[?:?] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1565) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1531) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1473) ~[main/:9.4.1 (not a git checkout) - builduser] 2> ... 51 more 2> 511021 ERROR (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c:testcollection s:shard3 r:core_node9 x:testcollection_shard3_replica_n6 t:null-5416] o.a.s.c.ZkContainer Exception registering core testcollection_shard3_replica_n6 2> => org.apache.solr.common.SolrException: Error getting leader from zk for shard shard3 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1516) 2> org.apache.solr.common.SolrException: Error getting leader from zk for shard shard3 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1516) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1335) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.register(ZkController.java:1246) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.core.ZkContainer.lambda$registerInZk$1(ZkContainer.java:218) [main/:?] 2> at org.apache.solr.core.ZkContainer.registerInZk(ZkContainer.java:247) [main/:?] 2> at org.apache.solr.core.CoreContainer.registerCore(CoreContainer.java:1493) [main/:?] 2> at org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:1716) [main/:?] 2> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1572) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:114) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:414) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:374) [main/:?] 2> at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:231) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:932) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:880) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:541) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> Caused by: java.lang.InterruptedException: sleep interrupted 2> at java.base/java.lang.Thread.sleep(Native Method) ~[?:?] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1565) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1531) ~[main/:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1473) ~[main/:9.4.1 (not a git checkout) - builduser] 2> ... 51 more 2> 511035 INFO (qtp734457442-2385) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:null-5420] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node17&collection.configName=conf1&newCollection=true&name=testcollection_shard3_replica_n14&action=CREATE&numShards=3&collection=testcollection&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=26166 2> 511036 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=177416711 2> 511037 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:46725_mr%2Fx 2> 511039 INFO (qtp1831257447-2438) [n:127.0.0.1:45309_mr%2Fx c: s: r: x: t:null-5416] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node9&collection.configName=conf1&newCollection=true&name=testcollection_shard3_replica_n6&action=CREATE&numShards=3&collection=testcollection&shard=shard3&wt=javabin&version=2&replicaType=NRT} status=0 QTime=26189 2> 511040 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1049579933 2> 511041 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:45309_mr%2Fx 2> 511041 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 511041 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:46725_mr%2Fx as DOWN 2> 511044 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (1) 2> 511045 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 511045 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:45309_mr%2Fx as DOWN 2> 511046 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (0) 2> 511048 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(1) -> (0) 2> 511065 INFO (coreCloseExecutor-1646-thread-1) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@36102fdc collection1_shard2_replica_n6 2> 511065 INFO (coreCloseExecutor-1646-thread-1) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard2.replica_n6 tag=SolrCore@36102fdc 2> 511067 INFO (coreCloseExecutor-1647-thread-1) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7ceedef8 collection1_shard2_replica_n2 2> 511067 INFO (coreCloseExecutor-1647-thread-1) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard2.replica_n2 tag=SolrCore@7ceedef8 2> 511080 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 589] for collection [collection1] has occurred - updating... (live nodes size: [0]) 2> 511080 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 589] for collection [collection1] has occurred - updating... (live nodes size: [0]) 2> 511080 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 589] for collection [collection1] has occurred - updating... (live nodes size: [0]) 2> 511199 INFO (zkCallback-1489-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 591] for collection [collection1] has occurred - updating... (live nodes size: [0]) 2> 511199 INFO (zkCallback-1509-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 591] for collection [collection1] has occurred - updating... (live nodes size: [0]) 2> 511199 INFO (zkCallback-1509-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json zxid: 591] for collection [collection1] has occurred - updating... (live nodes size: [0]) 2> 511389 INFO (coreCloseExecutor-1647-thread-1) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard2.leader tag=SolrCore@7ceedef8 2> 511396 INFO (coreCloseExecutor-1647-thread-1) [n:127.0.0.1:45309_mr%2Fx c:collection1 s:shard2 r:core_node4 x:collection1_shard2_replica_n2 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 511407 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 511415 INFO (coreCloseExecutor-1646-thread-1) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard2.leader tag=SolrCore@36102fdc 2> 511421 INFO (coreCloseExecutor-1646-thread-1) [n:127.0.0.1:46725_mr%2Fx c:collection1 s:shard2 r:core_node8 x:collection1_shard2_replica_n6 t:] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 511429 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 511447 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 511463 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 511464 INFO (closeThreadPool-1627-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 511468 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 511487 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 511488 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 511498 INFO (closeThreadPool-1649-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180044-127.0.0.1:46725_mr%2Fx-n_0000000002) closing 2> 511502 INFO (OverseerStateUpdate-72077818027180044-127.0.0.1:46725_mr%2Fx-n_0000000002) [n:127.0.0.1:46725_mr%2Fx c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:46725_mr%2Fx 2> 511508 INFO (closeThreadPool-1649-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180044-127.0.0.1:46725_mr%2Fx-n_0000000002) closing 2> 511618 INFO (closeThreadPool-1627-thread-5) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077818027180044-127.0.0.1:46725_mr%2Fx-n_0000000002) closing 2> 511629 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-47 after mutting 0 log messages 2> 511629 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-48 for ERROR logs matching regex: ignore_exception 2> 511633 INFO (TEST-TestReplicaProperties.test-seed#[246C98A4C257C021]) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Shutting down ZkTestServer. 
2> 511858 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Watch limit violations:
2> Maximum concurrent create/delete watches above limit:
2>
2> 18 /solr/collections/testcollection/terms/shard1
2> 18 /solr/collections/testcollection/terms/shard2
2> 18 /solr/collections/testcollection/terms/shard3
2> 8 /solr/clusterprops.json
2> 8 /solr/aliases.json
2> 5 /solr/packages.json
2> 5 /solr/security.json
2> 5 /solr/configs/conf1
2> 5 /solr/collections/collection1/terms/shard1
2> 5 /solr/collections/collection1/terms/shard2
2> 4 /solr/collections/collection1/collectionprops.json
2> 4 /solr/collections/testcollection/state.json
2> 2 /solr/collections/control_collection/terms/shard1
2>
2> Maximum concurrent data watches above limit:
2>
2> 52 /solr/collections/collection1/state.json
2> 18 /solr/collections/testcollection/state.json
2> 8 /solr/collections/control_collection/state.json
2> 2 /solr/overseer_elect/election/72077818027180036-127.0.0.1:40873_mr%2Fx-n_0000000000
2> 2 /solr/collections/testcollection/leader_elect/shard2/election/72077818027180042-core_node15-n_0000000000
2> 2 /solr/collections/testcollection/leader_elect/shard1/election/72077818027180050-core_node18-n_0000000000
2>
2> Maximum concurrent children watches above limit:
2>
2> 80 /solr/overseer/queue
2> 40 /solr/live_nodes
2> 22 /solr/collections
2> 21 /solr/overseer/collection-queue-work
2> 9 /solr/collections/collection1/state.json
2> 8 /solr/collections/testcollection/state.json
2> 3 /solr/collections/control_collection/state.json
2> 2 /solr/overseer/queue-work
2>
> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:45309/mr/x: Underlying core creation failed while creating collection: testcollection
> at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:AC38A77E6CABADD9]:0)
> at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747)
> at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263)
> at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
> at app//org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:386)
> at app//org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:352)
> at app//org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1199)
> at app//org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:898)
> at app//org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:826)
> at app//org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1192)
> at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2103)
> at app//org.apache.solr.cloud.AbstractFullDistribZkTestBase.createCollection(AbstractFullDistribZkTestBase.java:2163)
> at app//org.apache.solr.cloud.api.collections.TestReplicaProperties.test(TestReplicaProperties.java:54)
> at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996)
> at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1163)
> at app//org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1134)
> at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80)
> at app//org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at app//org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48)
> at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at app//org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45)
> at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60)
> at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44)
> at app//org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390)
> at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843)
> at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891)
> at app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902)
> at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80)
> at app//org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38)
> at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
> at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
> at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
> at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
> at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44)
> at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60)
> at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47)
> at app//org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
> at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390)
> at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850)
> at java.base@11.0.22/java.lang.Thread.run(Thread.java:829)
2> NOTE: reproduce with: gradlew test --tests TestReplicaProperties.test -Dtests.seed=246C98A4C257C021 -Dtests.locale=en-MP -Dtests.timezone=Africa/Mogadishu -Dtests.asserts=true -Dtests.file.encoding=UTF-8
2> 511956 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-48 after mutting 0 log messages
2> 511957 INFO (SUITE-TestReplicaProperties-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-49 for ERROR logs matching regex: ignore_exception
2> NOTE: leaving temporary files on disk at: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.cloud.api.collections.TestReplicaProperties_246C98A4C257C021-001
2> NOTE: test params are: codec=Asserting(Lucene95): {}, docValues:{}, maxPointsInLeafNode=1682, maxMBSortInHeap=7.771202853985787, sim=Asserting(RandomSimilarity(queryNorm=false): {}), locale=en-MP, timezone=Africa/Mogadishu
2> NOTE: Linux 6.1.61-3-sophgo-08357-g369f7207fe69 riscv64/Oracle Corporation 11.0.22 (64-bit)/cpus=1,threads=1,free=163707232,total=479199232
2> NOTE: All tests run in this JVM: [ConvertedLegacyTest, TestCrossCoreJoin, TestSolrCoreProperties, JerseyResourceTest, TestEmbeddedSolrServerSchemaAPI, ChaosMonkeySafeLeaderWithPullReplicasTest, ConnectionManagerTest, DistributedApiAsyncTrackerTest, LeaderElectionTest, NodeMutatorTest, ParallelCommitExecutionTest, SSLMigrationTest, TestAuthenticationFramework, TestConfigSetsAPIExclusivity, TestLRUStatsCacheCloud, TestQueryingOnDownCollection, TestStressInPlaceUpdates, ZkControllerTest, CollectionsAPIDistributedZkTest, TestReplicaProperties]
WARNING: Test org.apache.solr.schema.DocValuesTest wrote 11,269,540 bytes of output.
WARNING: Test org.apache.solr.schema.TestCloudSchemaless wrote 12,686,171 bytes of output.
WARNING: Test org.apache.solr.search.facet.TestCloudJSONFacetSKGEquiv wrote 14,775,900 bytes of output.
> Task :solr:solrj:test
:solr:solrj:test (SUCCESS): 1143 test(s), 40 skipped
> Task :solr:solrj-streaming:wipeTaskTemp
> Task :solr:modules:ltr:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
> Task :solr:modules:ltr:testClasses
> Task :solr:solrj-zookeeper:wipeTaskTemp
> Task :solr:modules:opentelemetry:classes UP-TO-DATE
> Task :solr:modules:ltr:test
> Task :solr:modules:opentelemetry:compileTestJava
> Task :solr:modules:clustering:test
:solr:modules:clustering:test (SUCCESS): 15 test(s)
> Task :solr:modules:opentelemetry:testClasses
> Task :solr:modules:opentelemetry:test
> Task :solr:modules:hadoop-auth:testClasses
> Task :solr:modules:hadoop-auth:test
> Task :solr:core:test
org.apache.solr.search.join.ShardJoinImplicitTest > classMethod FAILED
    org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:44957/solr/children_a_replica_n6: Task queue processing has stalled for 25115 ms with 0 remaining elements to process.
        at __randomizedtesting.SeedInfo.seed([246C98A4C257C021]:0)
        at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747)
        at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263)
        at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244)
        at app//org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:386)
        at app//org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:352)
        at app//org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1199)
        at app//org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:898)
        at app//org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:826)
        at app//org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:234)
        at app//org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:228)
        at app//org.apache.solr.search.join.ShardToShardJoinAbstract.setupCluster(ShardToShardJoinAbstract.java:142)
        at app//org.apache.solr.search.join.ShardJoinImplicitTest.setupCluster(ShardJoinImplicitTest.java:37)
        at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566)
        at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758)
        at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:886)
        at app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902)
        at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
        at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
        at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80)
        at app//org.junit.rules.RunRules.evaluate(RunRules.java:20)
        at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
        at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38)
        at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
        at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
        at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43)
        at app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44)
        at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60)
        at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47)
        at app//org.junit.rules.RunRules.evaluate(RunRules.java:20)
        at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
        at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390)
        at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850)
        at java.base@11.0.22/java.lang.Thread.run(Thread.java:829)
org.apache.solr.search.join.ShardJoinImplicitTest > test suite's output saved to /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.search.join.ShardJoinImplicitTest.txt, copied below:
2> 1706221 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/build/solr/src/solr-9.4.1/solr/server/solr/configsets/_default/conf'
2> 1706222 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
2> 1706226 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-226 after mutting 0 log messages
2> 1706227 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-227 for ERROR logs matching regex: ignore_exception
2> 1706231 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Created dataDir: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/data-dir-65-001
2> 1706232 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=16 numCloses=16
2> 1706234 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Using TrieFields (NUMERIC_POINTS_SYSPROP=false) w/NUMERIC_DOCVALUES_SYSPROP=true
2> 1706238 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0,
clientAuth=0.0/0.0) 2> 1706248 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.t.SimplePropagator Always-on trace id generation enabled. 2> 1706248 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster Starting cluster of 5 servers in /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001 2> 1706250 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER 2> 1706251 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer client port: 0.0.0.0/0.0.0.0:0 2> 1706251 INFO (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Starting server 2> 1706263 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.z.s.ServerCnxnFactory maxCnxns is not configured, using default value 0. 2> 1706351 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer start zk server on port: 33915 2> 1706359 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706382 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 1706407 INFO (zkConnectionManagerCallback-3604-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706408 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706408 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706417 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706424 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 1706432 INFO (zkConnectionManagerCallback-3606-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706432 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706433 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706433 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. 
ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706442 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 1706451 INFO (zkConnectionManagerCallback-3608-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706452 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706452 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706607 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 1706607 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 1706607 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 1706607 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 1706607 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 1706607 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 1706607 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0) 2> 1706607 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 1706607 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 2> 1706608 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ... 
2> 1706608 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 1706611 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 1706612 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 1706612 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 1706612 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server jetty-10.0.19; built: 2023-12-15T20:38:32.477Z; git: 8492d1c78f122bb30cce20aecfa07e7283facd47; jvm 11.0.22+7 2> 1706678 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 1706680 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 1706682 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 1706692 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@10e9a287{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 1706692 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@359fff57{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 1706692 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@6ed9f03c{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 1706703 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@344d64e7{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:36867} 2> 1706703 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@45eb814e{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:36209} 2> 1706703 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@6c105610{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:35379} 2> 1706705 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@2a789ec4{STARTING}[10.0.19,sto=0] @1706882ms 2> 1706705 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@1346323a{STARTING}[10.0.19,sto=0] @1706882ms 2> 1706705 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@725d94a3{STARTING}[10.0.19,sto=0] @1706882ms 2> 1706707 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 1706709 ERROR (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 1706709 ERROR (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 1706709 ERROR (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. 
Logging may be missing or incomplete. 2> 1706709 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 1706709 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 1706709 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 1706710 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.s.s.DefaultSessionIdManager Session workerName=node0 2> 1706710 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 1706710 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 1706711 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 1706711 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 1706711 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 1706711 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 1706711 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 1706712 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 1706712 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 1706713 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@96ac04c{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 1706712 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:29:19.780827Z 2> 1706712 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:29:19.780788Z 2> 1706712 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:29:19.780777Z 2> 1706716 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4 (source: servlet config: solr.solr.home) 2> 1706716 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1 (source: servlet config: solr.solr.home) 2> 1706716 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@29213bd9{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,AVAILABLE} 2> 1706716 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] 
o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5 (source: servlet config: solr.solr.home) 2> 1706719 WARN (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706719 WARN (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706719 WARN (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706722 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@4dcb8271{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:44957} 2> 1706723 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@37dd8df4{STARTING}[10.0.19,sto=0] @1706900ms 2> 1706726 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Started ServerConnector@35bbdf47{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:34035} 2> 1706727 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Started Server@328df163{STARTING}[10.0.19,sto=0] @1706904ms 2> 1706727 ERROR (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> 1706728 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 1706728 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 1706729 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 1706729 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 1706729 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:29:19.797889Z 2> 1706730 ERROR (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
2> 1706730 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory 2> 1706730 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider ___ _ Welcome to Apache Solr™ version 9.4.1 2> 1706731 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider / __| ___| |_ _ Starting in cloud mode on port null 2> 1706731 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider \__ \/ _ \ | '_| Install dir: /build/solr/src/solr-9.4.1/solr 2> 1706731 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3 (source: servlet config: solr.solr.home) 2> 1706732 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider |___/\___/_|_| Start time: 2024-01-21T21:29:19.800032Z 2> 1706734 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.s.CoreContainerProvider Solr Home: /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2 (source: servlet config: solr.solr.home) 2> 1706736 WARN (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706738 WARN (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706749 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 1706754 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 1706759 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 1706763 INFO (zkConnectionManagerCallback-3615-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706763 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706764 WARN (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706766 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 1706767 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 
2> 1706771 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 30000ms for client to connect to ZooKeeper 2> 1706775 INFO (zkConnectionManagerCallback-3611-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706776 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706776 WARN (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706777 INFO (zkConnectionManagerCallback-3613-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706777 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706777 WARN (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706780 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 1706780 INFO (zkConnectionManagerCallback-3619-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706780 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 1706781 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706781 WARN (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706784 INFO (zkConnectionManagerCallback-3617-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1706784 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1706785 WARN (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1706785 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 1706788 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.NodeConfig solr.xml found in ZooKeeper. Loading... 2> 1706789 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Metrics collection is disabled. 2> 1706798 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Metrics collection is disabled. 2> 1706798 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Metrics collection is disabled. 2> 1706803 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Metrics collection is disabled. 2> 1706806 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.SolrXmlConfig Metrics collection is disabled. 
2> 1706930 WARN (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@737cb037[provider=null,keyStore=null,trustStore=null] 2> 1706930 WARN (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@737cb037[provider=null,keyStore=null,trustStore=null] 2> 1706930 WARN (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@42c27e55[provider=null,keyStore=null,trustStore=null] 2> 1706931 WARN (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@66102353[provider=null,keyStore=null,trustStore=null] 2> 1706931 WARN (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@4ce202ba[provider=null,keyStore=null,trustStore=null] 2> 1706938 WARN (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4ce202ba[provider=null,keyStore=null,trustStore=null] 2> 1706938 WARN (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@42c27e55[provider=null,keyStore=null,trustStore=null] 2> 1706938 WARN (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@66102353[provider=null,keyStore=null,trustStore=null] 2> 1706943 WARN (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@74945c46[provider=null,keyStore=null,trustStore=null] 2> 1706944 WARN (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@74945c46[provider=null,keyStore=null,trustStore=null] 2> 1706966 WARN (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@109514d7[provider=null,keyStore=null,trustStore=null] 2> 1706967 WARN (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@109514d7[provider=null,keyStore=null,trustStore=null] 2> 1706969 WARN (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@66da8d19[provider=null,keyStore=null,trustStore=null] 2> 1706969 WARN (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@66da8d19[provider=null,keyStore=null,trustStore=null] 2> 1706971 WARN (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@a19a813[provider=null,keyStore=null,trustStore=null] 2> 1706971 WARN (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@a19a813[provider=null,keyStore=null,trustStore=null] 2> 1706975 WARN (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@69b0909d[provider=null,keyStore=null,trustStore=null] 2> 1706975 WARN (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@69b0909d[provider=null,keyStore=null,trustStore=null] 2> 1706977 WARN 
(jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.u.s.S.config Trusting all certificates configured for Client@566749e6[provider=null,keyStore=null,trustStore=null] 2> 1706977 WARN (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@566749e6[provider=null,keyStore=null,trustStore=null] 2> 1706985 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:33915/solr 2> 1706990 INFO (jetty-launcher-3609-thread-1) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1706992 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:33915/solr 2> 1706993 INFO (jetty-launcher-3609-thread-5) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1706998 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:33915/solr 2> 1707000 INFO (jetty-launcher-3609-thread-2) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1707001 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:33915/solr 2> 1707002 INFO (jetty-launcher-3609-thread-3) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1707004 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:33915/solr 2> 1707005 INFO (jetty-launcher-3609-thread-4) [n: c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 
2> 1707029 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 1707041 INFO (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 1707047 INFO (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 1707049 INFO (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 1707053 INFO (zkConnectionManagerCallback-3661-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1707053 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1707054 INFO (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 1707058 INFO (zkConnectionManagerCallback-3667-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1707058 INFO (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1707061 INFO (zkConnectionManagerCallback-3669-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1707061 INFO (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1707064 INFO (zkConnectionManagerCallback-3659-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1707065 INFO (zkConnectionManagerCallback-3665-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1707065 INFO (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1707065 INFO (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1707219 WARN (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 1707219 WARN (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 1707219 WARN (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 1707220 WARN (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 1707220 WARN (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.ZkController Contents of 
zookeeper /security.json are world-readable; consider setting up ACLs as described in https://solr.apache.org/guide/solr/latest/deployment-guide/zookeeper-access-control.html 2> 1707244 INFO (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1707244 INFO (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1707244 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1707245 INFO (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1707246 INFO (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.c.DistributedClusterStateUpdater Creating DistributedClusterStateUpdater with useDistributedStateUpdate=true. Solr will be using distributed cluster state updates. 2> 1707266 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:35379_solr 2> 1707272 INFO (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:36209_solr 2> 1707272 INFO (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:34035_solr 2> 1707273 INFO (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:44957_solr 2> 1707274 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077900700319752-127.0.0.1:35379_solr-n_0000000000) starting 2> 1707274 INFO (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:36867_solr 2> 1707288 INFO (zkCallback-3660-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 1707289 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 1707289 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 1707291 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 1707293 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4) 2> 1707310 WARN (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 1707311 WARN (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 1707311 WARN (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 1707311 WARN (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. See https://s.apache.org/solrsecurity for more info 2> 1707340 INFO (OverseerStateUpdate-72077900700319752-127.0.0.1:35379_solr-n_0000000000) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:35379_solr 2> 1707340 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:35379_solr as DOWN 2> 1707344 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:35379_solr 2> 1707351 INFO (OverseerStateUpdate-72077900700319752-127.0.0.1:35379_solr-n_0000000000) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 1707355 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 1707356 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 1707356 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 1707356 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5) 2> 1707371 WARN (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.CoreContainer Not all security plugins configured! authentication=disabled authorization=disabled. Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external. 
See https://s.apache.org/solrsecurity for more info 2> 1707569 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5 2> 1707744 INFO (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4 2> 1708528 INFO (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1 2> 1708530 INFO (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2 2> 1708550 INFO (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3 2> 1708561 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1708719 INFO (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1708732 INFO (jetty-launcher-3609-thread-5) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=35379, zkHost=127.0.0.1:33915/solr} 2> 1708888 INFO (jetty-launcher-3609-thread-4) [n:127.0.0.1:36209_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=36209, zkHost=127.0.0.1:33915/solr} 2> 1708927 INFO (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1708957 INFO (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1709025 INFO (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1709107 INFO (jetty-launcher-3609-thread-2) [n:127.0.0.1:36867_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=36867, zkHost=127.0.0.1:33915/solr} 2> 1709159 INFO (jetty-launcher-3609-thread-3) [n:127.0.0.1:44957_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=44957, zkHost=127.0.0.1:33915/solr} 2> 1709216 INFO (jetty-launcher-3609-thread-1) [n:127.0.0.1:34035_solr c: s: r: x: t:] o.a.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=34035, zkHost=127.0.0.1:33915/solr} 2> 1709230 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForAllNodes: numServers=5 2> 1709230 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:35379_solr 2> 
1709232 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkCredentialsInjector. ZkCredentialsInjector is not secure, it creates an empty list of credentials which leads to 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1709241 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Waiting up to 15000ms for client to connect to ZooKeeper 2> 1709250 INFO (zkConnectionManagerCallback-3702-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1709250 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper 2> 1709251 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.SolrZkClient Using default ZkACLProvider. DefaultZkACLProvider is not secure, it creates 'OPEN_ACL_UNSAFE' ACLs to Zookeeper nodes 2> 1709256 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (5) 2> 1709263 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:33915/solr ready 2> 1709264 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:36209_solr 2> 1709264 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:36867_solr 2> 1709264 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:44957_solr 2> 1709264 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.MiniSolrCloudCluster waitForNode: 127.0.0.1:34035_solr 2> 1709574 INFO (qtp1283346820-5578) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9680] o.a.s.s.HttpSolrCall [admin] webapp=null path=/cluster/plugin params={wt=javabin&version=2} status=0 QTime=127 2> 1709719 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection children 2> 1710053 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_d_replica_n1", 2> "node_name":"127.0.0.1:34035_solr", 2> "base_url":"https://127.0.0.1:34035/solr", 2> "collection":"children", 2> "shard":"d", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710062 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_b_replica_n2", 2> "node_name":"127.0.0.1:36209_solr", 2> "base_url":"https://127.0.0.1:36209/solr", 2> "collection":"children", 2> "shard":"b", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710072 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_c_replica_n3", 2> "node_name":"127.0.0.1:35379_solr", 2> "base_url":"https://127.0.0.1:35379/solr", 2> "collection":"children", 2> "shard":"c", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710082 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] 
o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_a_replica_n4", 2> "node_name":"127.0.0.1:35379_solr", 2> "base_url":"https://127.0.0.1:35379/solr", 2> "collection":"children", 2> "shard":"a", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710092 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_e_replica_n5", 2> "node_name":"127.0.0.1:36209_solr", 2> "base_url":"https://127.0.0.1:36209/solr", 2> "collection":"children", 2> "shard":"e", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710101 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_a_replica_n6", 2> "node_name":"127.0.0.1:44957_solr", 2> "base_url":"https://127.0.0.1:44957/solr", 2> "collection":"children", 2> "shard":"a", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710111 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_e_replica_n7", 2> "node_name":"127.0.0.1:36867_solr", 2> "base_url":"https://127.0.0.1:36867/solr", 2> "collection":"children", 2> "shard":"e", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710121 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_d_replica_n8", 2> "node_name":"127.0.0.1:44957_solr", 2> "base_url":"https://127.0.0.1:44957/solr", 2> "collection":"children", 2> "shard":"d", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710132 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_c_replica_n9", 2> "node_name":"127.0.0.1:36867_solr", 2> "base_url":"https://127.0.0.1:36867/solr", 2> "collection":"children", 2> "shard":"c", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710142 INFO (OverseerThreadFactory-3682-thread-1) [n: c:children s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"children_b_replica_n10", 2> "node_name":"127.0.0.1:34035_solr", 2> "base_url":"https://127.0.0.1:34035/solr", 2> "collection":"children", 2> "shard":"b", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1710386 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c: s: r: x:children_b_replica_n10 t:null-9684] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node20&name=children_b_replica_n10&action=CREATE&numShards=5&shard=b&wt=javabin 2> 1710386 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c: s: r: x:children_d_replica_n1 t:null-9683] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node11&name=children_d_replica_n1&action=CREATE&numShards=5&shard=d&wt=javabin 2> 1710386 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x:children_b_replica_n2 t:null-9682] o.a.s.h.a.CoreAdminOperation core create command 
qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node12&name=children_b_replica_n2&action=CREATE&numShards=5&shard=b&wt=javabin 2> 1710386 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x:children_e_replica_n5 t:null-9685] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node15&name=children_e_replica_n5&action=CREATE&numShards=5&shard=e&wt=javabin 2> 1710428 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x:children_a_replica_n6 t:null-9687] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node16&name=children_a_replica_n6&action=CREATE&numShards=5&shard=a&wt=javabin 2> 1710428 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c: s: r: x:children_e_replica_n7 t:null-9688] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node17&name=children_e_replica_n7&action=CREATE&numShards=5&shard=e&wt=javabin 2> 1710429 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c: s: r: x:children_d_replica_n8 t:null-9686] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node18&name=children_d_replica_n8&action=CREATE&numShards=5&shard=d&wt=javabin 2> 1710429 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x:children_c_replica_n9 t:null-9689] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node19&name=children_c_replica_n9&action=CREATE&numShards=5&shard=c&wt=javabin 2> 1710444 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c: s: r: x:children_c_replica_n3 t:null-9690] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node13&name=children_c_replica_n3&action=CREATE&numShards=5&shard=c&wt=javabin 2> 1710445 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c: s: r: x:children_a_replica_n4 t:null-9691] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=children&version=2&replicaType=NRT&coreNodeName=core_node14&name=children_a_replica_n4&action=CREATE&numShards=5&shard=a&wt=javabin 2> 1710569 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1710570 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1710570 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1710575 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1711338 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c 
r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1711338 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1711349 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1711356 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1711357 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1711382 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.s.IndexSchema Schema name=default-config 2> 1711382 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.s.IndexSchema Schema name=default-config 2> 1711382 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.s.IndexSchema Schema name=default-config 2> 1711382 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.s.IndexSchema Schema name=default-config 2> 1711383 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.s.IndexSchema Schema name=default-config 2> 1711386 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1711394 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.s.IndexSchema Schema name=default-config 2> 1711394 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.s.IndexSchema Schema name=default-config 2> 1711402 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.s.IndexSchema Schema name=default-config 2> 1711405 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.s.IndexSchema Schema name=default-config 2> 1711405 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.s.IndexSchema Schema name=default-config 2> 1712211 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712211 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712211 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712211 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712211 
INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712211 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712211 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.CoreContainer Creating SolrCore 'children_b_replica_n10' using configuration from configset _default, trusted=true 2> 1712212 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.CoreContainer Creating SolrCore 'children_e_replica_n7' using configuration from configset _default, trusted=true 2> 1712212 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.CoreContainer Creating SolrCore 'children_e_replica_n5' using configuration from configset _default, trusted=true 2> 1712212 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.CoreContainer Creating SolrCore 'children_a_replica_n6' using configuration from configset _default, trusted=true 2> 1712212 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712212 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.CoreContainer Creating SolrCore 'children_d_replica_n1' using configuration from configset _default, trusted=true 2> 1712211 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712212 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.c.CoreContainer Creating SolrCore 'children_b_replica_n2' using configuration from configset _default, trusted=true 2> 1712212 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.CoreContainer Creating SolrCore 'children_d_replica_n8' using configuration from configset _default, trusted=true 2> 1712211 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712211 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1712213 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.CoreContainer Creating SolrCore 'children_c_replica_n9' using configuration from configset _default, trusted=true 2> 1712213 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.CoreContainer Creating SolrCore 'children_c_replica_n3' using configuration from configset _default, trusted=true 2> 1712213 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 
t:null-9691] o.a.s.c.CoreContainer Creating SolrCore 'children_a_replica_n4' using configuration from configset _default, trusted=true 2> 1712214 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/children_b_replica_n10], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/children_b_replica_n10/data/] 2> 1712214 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/children_d_replica_n8], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/children_d_replica_n8/data/] 2> 1712214 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/children_e_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/children_e_replica_n5/data/] 2> 1712214 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/children_e_replica_n7], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/children_e_replica_n7/data/] 2> 1712215 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/children_a_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/children_a_replica_n4/data/] 2> 1712215 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/children_a_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/children_a_replica_n6/data/] 2> 1712215 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/children_b_replica_n2], 
dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/children_b_replica_n2/data/] 2> 1712215 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/children_c_replica_n3], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/children_c_replica_n3/data/] 2> 1712215 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/children_d_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/children_d_replica_n1/data/] 2> 1712215 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/children_c_replica_n9], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/children_c_replica_n9/data/] 2> 1733567 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker-SendThread(127.0.0.1:33915)) [n: c: s: r: x: t:] o.a.z.ClientCnxn Client session timed out, have not heard from server in 23788ms for session id 0x10012780366000d 2> 1733609 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker-SendThread(127.0.0.1:33915)) [n: c: s: r: x: t:] o.a.z.ClientCnxn Session 0x10012780366000d for server localhost/127.0.0.1:33915, Closing socket connection. Attempting reconnect except it is a SessionExpiredException. 
2> => org.apache.zookeeper.ClientCnxn$SessionTimeoutException: Client session timed out, have not heard from server in 23788ms for session id 0x10012780366000d 2> at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1242) 2> org.apache.zookeeper.ClientCnxn$SessionTimeoutException: Client session timed out, have not heard from server in 23788ms for session id 0x10012780366000d 2> at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1242) [zookeeper-3.9.1.jar:3.9.1] 2> 1733744 WARN (zkConnectionManagerCallback-3702-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Watcher org.apache.solr.common.cloud.ConnectionManager@6974560f name: ZooKeeperConnection Watcher:127.0.0.1:33915/solr got event WatchedEvent state:Disconnected type:None path:null zxid: -1 path: null type: None 2> 1733746 WARN (zkConnectionManagerCallback-3702-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has disconnected 2> 1734145 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1734146 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1734160 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1734181 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1734238 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734238 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734240 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734240 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734258 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734258 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734259 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734260 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= 
defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734260 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734260 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734261 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734261 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734264 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.j.SolrRequestAuthorizer Creating a new SolrRequestAuthorizer 2> 1734290 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734291 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734291 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734291 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734326 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734326 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734327 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1734328 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1734270 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734307 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734277 
INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734331 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734275 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734266 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734308 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734331 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734266 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734274 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734332 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734331 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734332 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734331 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734331 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734333 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734340 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734340 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734341 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1734341 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1734382 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 
x:children_b_replica_n2 t:null-9682] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734384 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734386 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734386 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734388 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734388 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734389 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734390 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734390 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734391 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734391 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734391 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734392 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734392 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734393 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734393 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.r.ManagedResourceStorage Configured 
ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1734393 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734394 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734394 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734394 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734394 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734395 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734396 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734396 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734397 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1734397 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734397 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734398 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734400 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734400 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1734401 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 
x:children_b_replica_n2 t:null-9682] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734401 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734403 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734403 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734403 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734403 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734403 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734404 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734404 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734405 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734405 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734405 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734405 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734406 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 
x:children_b_replica_n10 t:null-9684] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734406 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734406 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734407 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1734409 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734409 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734409 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1734444 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734444 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734444 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734446 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734446 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054325080064 2> 1734446 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734446 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054325080064 2> 1734446 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.u.UpdateLog Could not find max version in index or 
recent updates, using new clock 1788737054325080064 2> 1734447 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054326128640 2> 1734448 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054327177216 2> 1734448 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734449 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734449 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054328225792 2> 1734450 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734450 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054329274368 2> 1734453 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054332420096 2> 1734456 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734457 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054336614400 2> 1734460 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1734461 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737054340808704 2> 1734462 INFO (searcherExecutor-3726-thread-1-processing-children_a_replica_n4 null-9691 core_node14 127.0.0.1:35379_solr children a) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1734463 INFO (searcherExecutor-3712-thread-1-processing-children_b_replica_n10 null-9684 core_node20 127.0.0.1:34035_solr children b) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1734463 INFO (searcherExecutor-3724-thread-1-processing-children_d_replica_n1 null-9683 core_node11 127.0.0.1:34035_solr children d) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.QuerySenderListener QuerySenderListener done. 
2> 1734463 INFO (searcherExecutor-3726-thread-1-processing-children_a_replica_n4 null-9691 core_node14 127.0.0.1:35379_solr children a) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734463 INFO (searcherExecutor-3712-thread-1-processing-children_b_replica_n10 null-9684 core_node20 127.0.0.1:34035_solr children b) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734463 INFO (searcherExecutor-3724-thread-1-processing-children_d_replica_n1 null-9683 core_node11 127.0.0.1:34035_solr children d) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734464 INFO (searcherExecutor-3720-thread-1-processing-children_c_replica_n9 null-9689 core_node19 127.0.0.1:36867_solr children c) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1734482 INFO (searcherExecutor-3718-thread-1-processing-children_d_replica_n8 null-9686 core_node18 127.0.0.1:44957_solr children d) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1734483 INFO (searcherExecutor-3720-thread-1-processing-children_c_replica_n9 null-9689 core_node19 127.0.0.1:36867_solr children c) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734483 INFO (searcherExecutor-3714-thread-1-processing-children_a_replica_n6 null-9687 core_node16 127.0.0.1:44957_solr children a) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1734483 INFO (searcherExecutor-3718-thread-1-processing-children_d_replica_n8 null-9686 core_node18 127.0.0.1:44957_solr children d) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734483 INFO (searcherExecutor-3714-thread-1-processing-children_a_replica_n6 null-9687 core_node16 127.0.0.1:44957_solr children a) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734489 INFO (searcherExecutor-3708-thread-1-processing-children_e_replica_n7 null-9688 core_node17 127.0.0.1:36867_solr children e) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.QuerySenderListener QuerySenderListener done. 
2> 1734492 INFO (searcherExecutor-3708-thread-1-processing-children_e_replica_n7 null-9688 core_node17 127.0.0.1:36867_solr children e) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734541 INFO (searcherExecutor-3712-thread-1-processing-children_b_replica_n10 null-9684 core_node20 127.0.0.1:34035_solr children b) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734542 INFO (searcherExecutor-3708-thread-1-processing-children_e_replica_n7 null-9688 core_node17 127.0.0.1:36867_solr children e) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734542 INFO (searcherExecutor-3724-thread-1-processing-children_d_replica_n1 null-9683 core_node11 127.0.0.1:34035_solr children d) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734542 INFO (searcherExecutor-3718-thread-1-processing-children_d_replica_n8 null-9686 core_node18 127.0.0.1:44957_solr children d) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734543 INFO (searcherExecutor-3726-thread-1-processing-children_a_replica_n4 null-9691 core_node14 127.0.0.1:35379_solr children a) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734543 INFO (searcherExecutor-3714-thread-1-processing-children_a_replica_n6 null-9687 core_node16 127.0.0.1:44957_solr children a) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734543 INFO (searcherExecutor-3720-thread-1-processing-children_c_replica_n9 null-9689 core_node19 127.0.0.1:36867_solr children c) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734548 INFO (searcherExecutor-3710-thread-1-processing-children_e_replica_n5 null-9685 core_node15 127.0.0.1:36209_solr children e) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1734548 INFO (searcherExecutor-3716-thread-1-processing-children_b_replica_n2 null-9682 core_node12 127.0.0.1:36209_solr children b) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.c.QuerySenderListener QuerySenderListener done. 
2> 1734548 INFO (searcherExecutor-3710-thread-1-processing-children_e_replica_n5 null-9685 core_node15 127.0.0.1:36209_solr children e) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734549 INFO (searcherExecutor-3716-thread-1-processing-children_b_replica_n2 null-9682 core_node12 127.0.0.1:36209_solr children b) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734557 INFO (searcherExecutor-3722-thread-1-processing-children_c_replica_n3 null-9690 core_node13 127.0.0.1:35379_solr children c) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1734557 INFO (searcherExecutor-3710-thread-1-processing-children_e_replica_n5 null-9685 core_node15 127.0.0.1:36209_solr children e) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734557 INFO (searcherExecutor-3722-thread-1-processing-children_c_replica_n3 null-9690 core_node13 127.0.0.1:35379_solr children c) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1734557 INFO (searcherExecutor-3716-thread-1-processing-children_b_replica_n2 null-9682 core_node12 127.0.0.1:36209_solr children b) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734563 INFO (searcherExecutor-3722-thread-1-processing-children_c_replica_n3 null-9690 core_node13 127.0.0.1:35379_solr children c) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1734635 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1734635 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1734635 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/a to Terms{values={core_node16=0}, version=0} for registerTerm 2> 1734635 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/b to Terms{values={core_node20=0}, version=0} for registerTerm 2> 1734635 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/e to Terms{values={core_node15=0}, version=0} for registerTerm 2> 1734641 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/e 2> 1734641 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children 
s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/a 2> 1734641 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/b 2> 1734647 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/e to Terms{values={core_node17=0, core_node15=0}, version=1} for registerTerm 2> 1734648 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9688] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/e 2> 1734649 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/a to Terms{values={core_node14=0, core_node16=0}, version=1} for registerTerm 2> 1734650 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9691] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/a 2> 1734678 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/d to Terms{values={core_node11=0}, version=0} for registerTerm 2> 1734685 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/d 2> 1734685 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/b to Terms{values={core_node20=0, core_node12=0}, version=1} for registerTerm 2> 1734686 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1734693 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9682] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/b 2> 1734705 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/c to Terms{values={core_node19=0}, version=0} for registerTerm 2> 1734710 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/c 2> 1734712 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1734713 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/d to Terms{values={core_node18=0, core_node11=0}, 
version=1} for registerTerm 2> 1734717 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9686] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/d 2> 1734729 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/c to Terms{values={core_node13=0, core_node19=0}, version=1} for registerTerm 2> 1734733 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9690] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/children/leaders/c 2> 1734771 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 1734771 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1734772 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:44957/solr/children_a_replica_n6/ 2> 1734774 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 1734774 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1734774 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:36867/solr/children_c_replica_n9/ 2> 1735345 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.u.PeerSync PeerSync: core=children_a_replica_n6 url=https://127.0.0.1:44957/solr START replicas=[https://127.0.0.1:35379/solr/children_a_replica_n4/] nUpdates=100 2> 1735345 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.u.PeerSync PeerSync: core=children_c_replica_n9 url=https://127.0.0.1:36867/solr START replicas=[https://127.0.0.1:35379/solr/children_c_replica_n3/] nUpdates=100 2> 1735345 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 1735346 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1735346 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:34035/solr/children_d_replica_n1/ 2> 1735349 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 1735349 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1735349 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:34035/solr/children_b_replica_n10/ 2> 1735354 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.u.PeerSync PeerSync: core=children_d_replica_n1 url=https://127.0.0.1:34035/solr START replicas=[https://127.0.0.1:44957/solr/children_d_replica_n8/] nUpdates=100 2> 1735356 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 1735356 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1735357 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:36209/solr/children_e_replica_n5/ 2> 1735360 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.u.PeerSync PeerSync: core=children_b_replica_n10 url=https://127.0.0.1:34035/solr START replicas=[https://127.0.0.1:36209/solr/children_b_replica_n2/] nUpdates=100 2> 1735360 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.u.PeerSync PeerSync: core=children_e_replica_n5 url=https://127.0.0.1:36209/solr START replicas=[https://127.0.0.1:36867/solr/children_e_replica_n7/] nUpdates=100 2> 1735372 INFO (zkConnectionManagerCallback-3702-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1735373 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.u.PeerSync PeerSync: core=children_b_replica_n10 url=https://127.0.0.1:34035/solr DONE. We have no versions. sync failed. 2> 1735373 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.u.PeerSync PeerSync: core=children_c_replica_n9 url=https://127.0.0.1:36867/solr DONE. We have no versions. sync failed. 2> 1735376 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.u.PeerSync PeerSync: core=children_a_replica_n6 url=https://127.0.0.1:44957/solr DONE. We have no versions. sync failed. 2> 1735391 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.u.PeerSync PeerSync: core=children_d_replica_n1 url=https://127.0.0.1:34035/solr DONE. We have no versions. sync failed. 2> 1735394 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.u.PeerSync PeerSync: core=children_e_replica_n5 url=https://127.0.0.1:36209/solr DONE. We have no versions. sync failed. 
2> 1735875 INFO (qtp1283346820-5584) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9684] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=48 2> 1735875 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9685] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=48 2> 1735875 INFO (qtp484878969-5565) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9683] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=49 2> 1735875 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9687] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=46 2> 1735876 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9689] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=46 2> 1735931 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1735931 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1735931 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1735932 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1735931 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1735932 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1735932 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1735931 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1735932 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/children/leaders/b/leader after winning as 
/collections/children/leader_elect/b/election/72077900700319754-core_node20-n_0000000000 2> 1735932 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1735932 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/children/leaders/c/leader after winning as /collections/children/leader_elect/c/election/72077900700319753-core_node19-n_0000000000 2> 1735932 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1735932 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/children/leaders/a/leader after winning as /collections/children/leader_elect/a/election/72077900700319756-core_node16-n_0000000000 2> 1735932 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/children/leaders/d/leader after winning as /collections/children/leader_elect/d/election/72077900700319754-core_node11-n_0000000000 2> 1735932 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/children/leaders/e/leader after winning as /collections/children/leader_elect/e/election/72077900700319755-core_node15-n_0000000000 2> 1735960 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:36209/solr/children_e_replica_n5/ e 2> 1735969 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:36867/solr/children_c_replica_n9/ c 2> 1735972 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9685] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1735978 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:34035/solr/children_b_replica_n10/ b 2> 1735983 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9689] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1735988 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:44957/solr/children_a_replica_n6/ a 2> 1735995 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9684] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1736000 INFO 
(qtp484878969-5581) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9687] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1736026 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 777] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736058 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9685] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node15&collection.configName=_default&newCollection=true&name=children_e_replica_n5&action=CREATE&numShards=5&collection=children&shard=e&wt=javabin&version=2&replicaType=NRT} status=0 QTime=25678 2> 1736058 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9684] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node20&collection.configName=_default&newCollection=true&name=children_b_replica_n10&action=CREATE&numShards=5&collection=children&shard=b&wt=javabin&version=2&replicaType=NRT} status=0 QTime=25678 2> 1736058 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9689] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node19&collection.configName=_default&newCollection=true&name=children_c_replica_n9&action=CREATE&numShards=5&collection=children&shard=c&wt=javabin&version=2&replicaType=NRT} status=0 QTime=25631 2> 1736059 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 778] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736059 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:34035/solr/children_d_replica_n1/ d 2> 1736059 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 778] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736059 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 778] for collection [children] has occurred - updating... 
(live nodes size: [5]) 2> 1736069 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9687] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node16&collection.configName=_default&newCollection=true&name=children_a_replica_n6&action=CREATE&numShards=5&collection=children&shard=a&wt=javabin&version=2&replicaType=NRT} status=0 QTime=25644 2> 1736071 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9683] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1736081 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 779] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736081 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 779] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736081 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 779] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736081 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 779] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736092 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9683] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node11&collection.configName=_default&newCollection=true&name=children_d_replica_n1&action=CREATE&numShards=5&collection=children&shard=d&wt=javabin&version=2&replicaType=NRT} status=0 QTime=25714 2> 1736778 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 780] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736778 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 780] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736778 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 780] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736778 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 780] for collection [children] has occurred - updating... 
(live nodes size: [5]) 2> 1736789 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 783] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736789 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 783] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736789 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 783] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736789 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 783] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736790 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9682] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node12&collection.configName=_default&newCollection=true&name=children_b_replica_n2&action=CREATE&numShards=5&collection=children&shard=b&wt=javabin&version=2&replicaType=NRT} status=0 QTime=26412 2> 1736800 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 785] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736800 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9688] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node17&collection.configName=_default&newCollection=true&name=children_e_replica_n7&action=CREATE&numShards=5&collection=children&shard=e&wt=javabin&version=2&replicaType=NRT} status=0 QTime=26375 2> 1736800 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 785] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736800 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 785] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736800 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 785] for collection [children] has occurred - updating... 
(live nodes size: [5]) 2> 1736810 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9686] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node18&collection.configName=_default&newCollection=true&name=children_d_replica_n8&action=CREATE&numShards=5&collection=children&shard=d&wt=javabin&version=2&replicaType=NRT} status=0 QTime=26386 2> 1736814 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 786] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736814 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 786] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736814 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 786] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736815 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 786] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736822 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 788] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736822 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 788] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736822 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 788] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1736822 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 788] for collection [children] has occurred - updating... 
(live nodes size: [5]) 2> 1736837 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9690] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node13&collection.configName=_default&newCollection=true&name=children_c_replica_n3&action=CREATE&numShards=5&collection=children&shard=c&wt=javabin&version=2&replicaType=NRT} status=0 QTime=26406 2> 1736841 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9691] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node14&collection.configName=_default&newCollection=true&name=children_a_replica_n4&action=CREATE&numShards=5&collection=children&shard=a&wt=javabin&version=2&replicaType=NRT} status=0 QTime=26398 2> 1736859 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c:children s: r: x: t:null-9681] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 1736861 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c:children s: r: x: t:null-9681] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={shards=a,b,c,d,e&collection.configName=_default&name=children&router.name=implicit&nrtReplicas=2&action=CREATE&numShards=3&wt=javabin&version=2} status=0 QTime=27173 2> 1736923 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.a.c.CreateCollectionCmd Create collection parent 2> 1736925 INFO (OverseerCollectionConfigSetProcessor-72077900700319752-127.0.0.1:35379_solr-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 1737210 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_a_replica_n1", 2> "node_name":"127.0.0.1:35379_solr", 2> "base_url":"https://127.0.0.1:35379/solr", 2> "collection":"parent", 2> "shard":"a", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737220 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_e_replica_n2", 2> "node_name":"127.0.0.1:36867_solr", 2> "base_url":"https://127.0.0.1:36867/solr", 2> "collection":"parent", 2> "shard":"e", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737230 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_e_replica_n3", 2> "node_name":"127.0.0.1:36209_solr", 2> "base_url":"https://127.0.0.1:36209/solr", 2> "collection":"parent", 2> "shard":"e", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737241 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_b_replica_n4", 2> "node_name":"127.0.0.1:34035_solr", 2> "base_url":"https://127.0.0.1:34035/solr", 2> "collection":"parent", 2> "shard":"b", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737251 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_b_replica_n5", 2> "node_name":"127.0.0.1:36209_solr", 2> "base_url":"https://127.0.0.1:36209/solr", 2> "collection":"parent", 2> "shard":"b", 2> 
"state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737261 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_d_replica_n6", 2> "node_name":"127.0.0.1:44957_solr", 2> "base_url":"https://127.0.0.1:44957/solr", 2> "collection":"parent", 2> "shard":"d", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737269 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_d_replica_n7", 2> "node_name":"127.0.0.1:34035_solr", 2> "base_url":"https://127.0.0.1:34035/solr", 2> "collection":"parent", 2> "shard":"d", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737278 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_a_replica_n8", 2> "node_name":"127.0.0.1:44957_solr", 2> "base_url":"https://127.0.0.1:44957/solr", 2> "collection":"parent", 2> "shard":"a", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737288 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_c_replica_n9", 2> "node_name":"127.0.0.1:35379_solr", 2> "base_url":"https://127.0.0.1:35379/solr", 2> "collection":"parent", 2> "shard":"c", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737297 INFO (OverseerThreadFactory-3682-thread-2) [n: c:parent s: r: x: t:] o.a.s.c.o.SliceMutator createReplica() { 2> "core":"parent_c_replica_n10", 2> "node_name":"127.0.0.1:36867_solr", 2> "base_url":"https://127.0.0.1:36867/solr", 2> "collection":"parent", 2> "shard":"c", 2> "state":"down", 2> "type":"NRT", 2> "operation":"ADDREPLICA", 2> "waitForFinalState":"false"} 2> 1737329 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x:parent_a_replica_n1 t:null-9693] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node11&name=parent_a_replica_n1&action=CREATE&numShards=5&shard=a&wt=javabin 2> 1737329 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x:parent_b_replica_n4 t:null-9696] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node14&name=parent_b_replica_n4&action=CREATE&numShards=5&shard=b&wt=javabin 2> 1737329 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x:parent_b_replica_n5 t:null-9697] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node15&name=parent_b_replica_n5&action=CREATE&numShards=5&shard=b&wt=javabin 2> 1737330 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x:parent_e_replica_n2 t:null-9694] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node12&name=parent_e_replica_n2&action=CREATE&numShards=5&shard=e&wt=javabin 2> 1737330 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x:parent_e_replica_n3 t:null-9695] o.a.s.h.a.CoreAdminOperation 
core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node13&name=parent_e_replica_n3&action=CREATE&numShards=5&shard=e&wt=javabin 2> 1737331 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x:parent_d_replica_n6 t:null-9698] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node16&name=parent_d_replica_n6&action=CREATE&numShards=5&shard=d&wt=javabin 2> 1737332 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x:parent_d_replica_n7 t:null-9699] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node17&name=parent_d_replica_n7&action=CREATE&numShards=5&shard=d&wt=javabin 2> 1737335 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x:parent_a_replica_n8 t:null-9700] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node18&name=parent_a_replica_n8&action=CREATE&numShards=5&shard=a&wt=javabin 2> 1737355 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x:parent_c_replica_n9 t:null-9701] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node19&name=parent_c_replica_n9&action=CREATE&numShards=5&shard=c&wt=javabin 2> 1737355 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c: s: r: x:parent_c_replica_n10 t:null-9702] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=_default&newCollection=true&collection=parent&version=2&replicaType=NRT&coreNodeName=core_node20&name=parent_c_replica_n10&action=CREATE&numShards=5&shard=c&wt=javabin 2> 1737431 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737436 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737439 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737445 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737458 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737460 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.s.IndexSchema Schema name=default-config 2> 1737461 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.s.IndexSchema Schema name=default-config 2> 1737462 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737462 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.s.IndexSchema Schema 
name=default-config 2> 1737472 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737476 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737480 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737497 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.s.IndexSchema Schema name=default-config 2> 1737497 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.s.IndexSchema Schema name=default-config 2> 1737498 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.s.IndexSchema Schema name=default-config 2> 1737498 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.s.IndexSchema Schema name=default-config 2> 1737499 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.s.IndexSchema Schema name=default-config 2> 1737499 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.8.0 2> 1737499 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.s.IndexSchema Schema name=default-config 2> 1737514 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.s.IndexSchema Schema name=default-config 2> 1738981 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738981 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738982 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.CoreContainer Creating SolrCore 'parent_d_replica_n7' using configuration from configset _default, trusted=true 2> 1738982 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.c.CoreContainer Creating SolrCore 'parent_c_replica_n10' using configuration from configset _default, trusted=true 2> 1738984 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738984 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.CoreContainer Creating SolrCore 'parent_d_replica_n6' using configuration from configset _default, trusted=true 2> 1738985 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/parent_d_replica_n7], 
dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/parent_d_replica_n7/data/] 2> 1738985 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/parent_c_replica_n10], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/parent_c_replica_n10/data/] 2> 1738986 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/parent_d_replica_n6], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/parent_d_replica_n6/data/] 2> 1738989 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738989 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.CoreContainer Creating SolrCore 'parent_a_replica_n8' using configuration from configset _default, trusted=true 2> 1738992 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738992 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.CoreContainer Creating SolrCore 'parent_c_replica_n9' using configuration from configset _default, trusted=true 2> 1738992 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/parent_a_replica_n8], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node3/parent_a_replica_n8/data/] 2> 1738994 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738994 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/parent_c_replica_n9], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/parent_c_replica_n9/data/] 2> 1738994 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.CoreContainer Creating SolrCore 'parent_b_replica_n5' using configuration from configset _default, trusted=true 2> 1738994 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 
x:parent_a_replica_n1 t:null-9693] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738995 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.c.CoreContainer Creating SolrCore 'parent_a_replica_n1' using configuration from configset _default, trusted=true 2> 1738995 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738996 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.CoreContainer Creating SolrCore 'parent_b_replica_n4' using configuration from configset _default, trusted=true 2> 1738996 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/parent_a_replica_n1], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node5/parent_a_replica_n1/data/] 2> 1738996 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/parent_b_replica_n5], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/parent_b_replica_n5/data/] 2> 1738997 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/parent_b_replica_n4], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node2/parent_b_replica_n4/data/] 2> 1738998 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738998 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1738998 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.c.CoreContainer Creating SolrCore 'parent_e_replica_n3' using configuration from configset _default, trusted=true 2> 1738998 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.CoreContainer Creating SolrCore 'parent_e_replica_n2' using configuration from configset _default, trusted=true 2> 1738999 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/parent_e_replica_n3], 
dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node4/parent_e_replica_n3/data/] 2> 1739000 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.SolrCore Opening new SolrCore at [/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/parent_e_replica_n2], dataDir=[/build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-tmp/solr.search.join.ShardJoinImplicitTest_246C98A4C257C021-001/tempDir-001/node1/parent_e_replica_n2/data/] 2> 1739314 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739314 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739314 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739315 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739315 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739315 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739315 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739315 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739316 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739316 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739316 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739316 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739317 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr 
c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739317 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739317 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739318 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739318 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739318 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739321 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog 2> 1739321 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536 2> 1739348 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739348 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739348 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739348 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739348 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739349 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739349 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739349 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739350 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739350 INFO 
(qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739350 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739350 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739351 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739351 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739351 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739351 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739352 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739352 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739355 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.u.CommitTracker Hard AutoCommit: if uncommitted for 15000ms; 2> 1739356 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.u.CommitTracker Soft AutoCommit: if uncommitted for 3000ms; 2> 1739362 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739366 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739366 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739367 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739367 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739369 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739369 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.r.ManagedResourceStorage Configured 
ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739370 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739370 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739370 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739371 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739371 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739372 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739372 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739372 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739373 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739373 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739373 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739374 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/_default 2> 1739375 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739375 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739375 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using 
ZooKeeperStorageIO:path=/configs/_default 2> 1739376 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739376 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739377 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739379 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739379 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/_default 2> 1739379 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739380 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739381 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the managed schema at /configs/_default/managed-schema.xml 2> 1739382 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739383 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739384 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739384 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739384 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739384 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.s.ZkIndexSchemaReader Creating ZooKeeper watch for the 
managed schema at /configs/_default/managed-schema.xml 2> 1739385 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739386 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739386 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739387 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739387 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739387 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739387 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739388 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739388 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739388 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739389 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739389 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739390 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739390 INFO 
(qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739390 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739390 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.s.ZkIndexSchemaReader Current schema version 3 is already the latest 2> 1739391 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739391 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.s.DirectSolrSpellChecker init: {accuracy=0.5, maxQueryFrequency=0.01, maxEdits=2, minPrefix=1, maxInspections=5, minQueryLength=4, name=default, field=_text_, classname=solr.DirectSolrSpellChecker, distanceMeasure=internal} 2> 1739391 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059510288384 2> 1739391 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059510288384 2> 1739391 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739391 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059510288384 2> 1739392 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059511336960 2> 1739393 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059512385536 2> 1739396 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739396 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739397 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739397 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739397 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059516579840 2> 1739397 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] 
o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059516579840 2> 1739397 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms 2> 1739398 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059517628416 2> 1739398 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059517628416 2> 1739399 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1788737059518676992 2> 1739402 INFO (searcherExecutor-3778-thread-1-processing-parent_b_replica_n5 null-9697 core_node15 127.0.0.1:36209_solr parent b) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1739402 INFO (searcherExecutor-3776-thread-1-processing-parent_c_replica_n9 null-9701 core_node19 127.0.0.1:35379_solr parent c) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1739402 INFO (searcherExecutor-3778-thread-1-processing-parent_b_replica_n5 null-9697 core_node15 127.0.0.1:36209_solr parent b) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739402 INFO (searcherExecutor-3786-thread-1-processing-parent_e_replica_n2 null-9694 core_node12 127.0.0.1:36867_solr parent e) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.QuerySenderListener QuerySenderListener done. 
2> 1739403 INFO (searcherExecutor-3776-thread-1-processing-parent_c_replica_n9 null-9701 core_node19 127.0.0.1:35379_solr parent c) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739403 INFO (searcherExecutor-3786-thread-1-processing-parent_e_replica_n2 null-9694 core_node12 127.0.0.1:36867_solr parent e) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739420 INFO (searcherExecutor-3776-thread-1-processing-parent_c_replica_n9 null-9701 core_node19 127.0.0.1:35379_solr parent c) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739420 INFO (searcherExecutor-3778-thread-1-processing-parent_b_replica_n5 null-9697 core_node15 127.0.0.1:36209_solr parent b) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739421 INFO (searcherExecutor-3786-thread-1-processing-parent_e_replica_n2 null-9694 core_node12 127.0.0.1:36867_solr parent e) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739421 INFO (searcherExecutor-3782-thread-1-processing-parent_b_replica_n4 null-9696 core_node14 127.0.0.1:34035_solr parent b) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1739421 INFO (searcherExecutor-3782-thread-1-processing-parent_b_replica_n4 null-9696 core_node14 127.0.0.1:34035_solr parent b) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739425 INFO (searcherExecutor-3769-thread-1-processing-parent_c_replica_n10 null-9702 core_node20 127.0.0.1:36867_solr parent c) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1739425 INFO (searcherExecutor-3769-thread-1-processing-parent_c_replica_n10 null-9702 core_node20 127.0.0.1:36867_solr parent c) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739425 INFO (searcherExecutor-3782-thread-1-processing-parent_b_replica_n4 null-9696 core_node14 127.0.0.1:34035_solr parent b) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739435 INFO (searcherExecutor-3769-thread-1-processing-parent_c_replica_n10 null-9702 core_node20 127.0.0.1:36867_solr parent c) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739437 INFO (searcherExecutor-3774-thread-1-processing-parent_a_replica_n8 null-9700 core_node18 127.0.0.1:44957_solr parent a) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.QuerySenderListener QuerySenderListener done. 
2> 1739437 INFO (searcherExecutor-3774-thread-1-processing-parent_a_replica_n8 null-9700 core_node18 127.0.0.1:44957_solr parent a) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739438 INFO (searcherExecutor-3784-thread-1-processing-parent_e_replica_n3 null-9695 core_node13 127.0.0.1:36209_solr parent e) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1739438 INFO (searcherExecutor-3774-thread-1-processing-parent_a_replica_n8 null-9700 core_node18 127.0.0.1:44957_solr parent a) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739438 INFO (searcherExecutor-3784-thread-1-processing-parent_e_replica_n3 null-9695 core_node13 127.0.0.1:36209_solr parent e) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739439 INFO (searcherExecutor-3772-thread-1-processing-parent_d_replica_n6 null-9698 core_node16 127.0.0.1:44957_solr parent d) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1739440 INFO (searcherExecutor-3772-thread-1-processing-parent_d_replica_n6 null-9698 core_node16 127.0.0.1:44957_solr parent d) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739440 INFO (searcherExecutor-3780-thread-1-processing-parent_a_replica_n1 null-9693 core_node11 127.0.0.1:35379_solr parent a) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1739440 INFO (searcherExecutor-3784-thread-1-processing-parent_e_replica_n3 null-9695 core_node13 127.0.0.1:36209_solr parent e) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739440 INFO (searcherExecutor-3780-thread-1-processing-parent_a_replica_n1 null-9693 core_node11 127.0.0.1:35379_solr parent a) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739440 INFO (searcherExecutor-3770-thread-1-processing-parent_d_replica_n7 null-9699 core_node17 127.0.0.1:34035_solr parent d) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.QuerySenderListener QuerySenderListener done. 
2> 1739440 INFO (searcherExecutor-3770-thread-1-processing-parent_d_replica_n7 null-9699 core_node17 127.0.0.1:34035_solr parent d) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.h.c.SpellCheckComponent Loading spell index for spellchecker: default 2> 1739441 INFO (searcherExecutor-3772-thread-1-processing-parent_d_replica_n6 null-9698 core_node16 127.0.0.1:44957_solr parent d) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739442 INFO (searcherExecutor-3780-thread-1-processing-parent_a_replica_n1 null-9693 core_node11 127.0.0.1:35379_solr parent a) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739442 INFO (searcherExecutor-3770-thread-1-processing-parent_d_replica_n7 null-9699 core_node17 127.0.0.1:34035_solr parent d) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.SolrCore Registered new searcher autowarm time: 0 ms 2> 1739476 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/b to Terms{values={core_node15=0}, version=0} for registerTerm 2> 1739476 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/b 2> 1739477 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/c to Terms{values={core_node19=0}, version=0} for registerTerm 2> 1739477 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1739477 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/c 2> 1739479 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/e to Terms{values={core_node12=0}, version=0} for registerTerm 2> 1739479 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/e 2> 1739484 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/a to Terms{values={core_node18=0}, version=0} for registerTerm 2> 1739484 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/a 2> 1739486 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/b to Terms{values={core_node14=0, core_node15=0}, version=1} for registerTerm 2> 1739486 INFO 
(qtp266123491-5592) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9696] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/b 2> 1739506 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/e to Terms{values={core_node13=0, core_node12=0}, version=1} for registerTerm 2> 1739507 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9695] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/e 2> 1739507 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/a to Terms{values={core_node18=0, core_node11=0}, version=1} for registerTerm 2> 1739507 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9693] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/a 2> 1739508 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/c to Terms{values={core_node20=0, core_node19=0}, version=1} for registerTerm 2> 1739508 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9702] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/c 2> 1739516 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/d to Terms{values={core_node16=0}, version=0} for registerTerm 2> 1739517 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/d 2> 1739521 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1740686 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/d to Terms{values={core_node17=0, core_node16=0}, version=1} for registerTerm 2> 1740687 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9699] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/parent/leaders/d 2> 1740699 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 1740699 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1740699 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:36209/solr/parent_b_replica_n5/ 2> 1740705 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 1740706 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.u.PeerSync PeerSync: core=parent_b_replica_n5 url=https://127.0.0.1:36209/solr START replicas=[https://127.0.0.1:34035/solr/parent_b_replica_n4/] nUpdates=100 2> 1740706 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1740706 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:44957/solr/parent_a_replica_n8/ 2> 1740708 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.u.PeerSync PeerSync: core=parent_b_replica_n5 url=https://127.0.0.1:36209/solr DONE. We have no versions. sync failed. 2> 1740712 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.u.PeerSync PeerSync: core=parent_a_replica_n8 url=https://127.0.0.1:44957/solr START replicas=[https://127.0.0.1:35379/solr/parent_a_replica_n1/] nUpdates=100 2> 1740715 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 1740715 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1740715 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:36867/solr/parent_e_replica_n2/ 2> 1740716 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.u.PeerSync PeerSync: core=parent_a_replica_n8 url=https://127.0.0.1:44957/solr DONE. We have no versions. sync failed. 
2> 1740719 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:parent s:a r:core_node11 x:parent_a_replica_n1 t:null-9700] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 1740721 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.u.PeerSync PeerSync: core=parent_e_replica_n2 url=https://127.0.0.1:36867/solr START replicas=[https://127.0.0.1:36209/solr/parent_e_replica_n3/] nUpdates=100 2> 1740721 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1740722 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 2> 1740722 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1740722 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1740722 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:35379/solr/parent_c_replica_n9/ 2> 1740722 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.u.PeerSync PeerSync: core=parent_e_replica_n2 url=https://127.0.0.1:36867/solr DONE. We have no versions. sync failed. 2> 1740723 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/parent/leaders/a/leader after winning as /collections/parent/leader_elect/a/election/72077900700319756-core_node18-n_0000000000 2> 1740737 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.u.PeerSync PeerSync: core=parent_c_replica_n9 url=https://127.0.0.1:35379/solr START replicas=[https://127.0.0.1:36867/solr/parent_c_replica_n10/] nUpdates=100 2> 1740739 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.u.PeerSync PeerSync: core=parent_c_replica_n9 url=https://127.0.0.1:35379/solr DONE. We have no versions. sync failed. 2> 1740741 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue. 
2> 1740741 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync 2> 1740741 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:44957/solr/parent_d_replica_n6/ 2> 1740744 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c:parent s:c r:core_node20 x:parent_c_replica_n10 t:null-9701] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=2 2> 1740763 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.u.PeerSync PeerSync: core=parent_d_replica_n6 url=https://127.0.0.1:44957/solr START replicas=[https://127.0.0.1:34035/solr/parent_d_replica_n7/] nUpdates=100 2> 1740763 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1740763 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1740763 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/parent/leaders/c/leader after winning as /collections/parent/leader_elect/c/election/72077900700319752-core_node19-n_0000000000 2> 1740764 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.u.PeerSync PeerSync: core=parent_d_replica_n6 url=https://127.0.0.1:44957/solr DONE. We have no versions. sync failed. 
2> 1740768 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:44957/solr/parent_a_replica_n8/ a 2> 1740778 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c:parent s:a r:core_node18 x:parent_a_replica_n8 t:null-9700] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1740781 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:35379/solr/parent_c_replica_n9/ c 2> 1740795 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c:parent s:c r:core_node19 x:parent_c_replica_n9 t:null-9701] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1740821 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9700] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node18&collection.configName=_default&newCollection=true&name=parent_a_replica_n8&action=CREATE&numShards=5&collection=parent&shard=a&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3489 2> 1740835 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9701] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node19&collection.configName=_default&newCollection=true&name=parent_c_replica_n9&action=CREATE&numShards=5&collection=parent&shard=c&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3501 2> 1740869 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c:parent s:b r:core_node14 x:parent_b_replica_n4 t:null-9697] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=5 2> 1740874 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1740879 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1740879 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/parent/leaders/b/leader after winning as /collections/parent/leader_elect/b/election/72077900700319755-core_node15-n_0000000000 2> 1740894 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c:parent s:e r:core_node13 x:parent_e_replica_n3 t:null-9694] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=1 2> 1740899 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1740899 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1740899 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 
t:null-9694] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/parent/leaders/e/leader after winning as /collections/parent/leader_elect/e/election/72077900700319753-core_node12-n_0000000000 2> 1740911 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:36209/solr/parent_b_replica_n5/ b 2> 1740914 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 966] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740914 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 966] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740930 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:parent s:d r:core_node17 x:parent_d_replica_n7 t:null-9698] o.a.s.c.S.Request webapp=/solr path=/get params={distrib=false&qt=/get&fingerprint=false&getVersions=100&wt=javabin&version=2} status=0 QTime=1 2> 1740932 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.SyncStrategy Leader's attempt to sync with shard failed, moving to the next candidate 2> 1740932 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ShardLeaderElectionContext We failed sync, but we have no versions - we can't sync in that case - we were active before, so become leader anyway 2> 1740932 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c:parent s:b r:core_node15 x:parent_b_replica_n5 t:null-9697] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1740932 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/parent/leaders/d/leader after winning as /collections/parent/leader_elect/d/election/72077900700319756-core_node16-n_0000000000 2> 1740934 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 967] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740934 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:36867/solr/parent_e_replica_n2/ e 2> 1740934 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 967] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740943 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 969] for collection [parent] has occurred - updating... 
(live nodes size: [5]) 2> 1740943 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 969] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740943 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:44957/solr/parent_d_replica_n6/ d 2> 1740947 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c:parent s:d r:core_node16 x:parent_d_replica_n6 t:null-9698] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1740948 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c:parent s:e r:core_node12 x:parent_e_replica_n2 t:null-9694] o.a.s.c.ZkController I am the leader, no recovery necessary 2> 1740951 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 971] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740951 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 971] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740961 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 973] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740961 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 973] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740974 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9697] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node15&collection.configName=_default&newCollection=true&name=parent_b_replica_n5&action=CREATE&numShards=5&collection=parent&shard=b&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3647 2> 1740975 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9698] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node16&collection.configName=_default&newCollection=true&name=parent_d_replica_n6&action=CREATE&numShards=5&collection=parent&shard=d&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3646 2> 1740975 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 975] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740975 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 975] for collection [parent] has occurred - updating... 
(live nodes size: [5]) 2> 1740975 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 975] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1740995 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9694] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node12&collection.configName=_default&newCollection=true&name=parent_e_replica_n2&action=CREATE&numShards=5&collection=parent&shard=e&wt=javabin&version=2&replicaType=NRT} status=0 QTime=3673 2> 1741703 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 976] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741703 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 976] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741703 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 976] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741703 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 976] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741714 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 980] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741715 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 980] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741715 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 980] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741715 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 980] for collection [parent] has occurred - updating... 
(live nodes size: [5]) 2> 1741716 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9693] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node11&collection.configName=_default&newCollection=true&name=parent_a_replica_n1&action=CREATE&numShards=5&collection=parent&shard=a&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4395 2> 1741725 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9702] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node20&collection.configName=_default&newCollection=true&name=parent_c_replica_n10&action=CREATE&numShards=5&collection=parent&shard=c&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4390 2> 1741727 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 983] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741727 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 983] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741727 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 983] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741727 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 983] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741735 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 985] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741735 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 985] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741736 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 985] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741736 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 985] for collection [parent] has occurred - updating... 
(live nodes size: [5]) 2> 1741746 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9695] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node13&collection.configName=_default&newCollection=true&name=parent_e_replica_n3&action=CREATE&numShards=5&collection=parent&shard=e&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4422 2> 1741754 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 986] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741754 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 986] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741754 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 986] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741754 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 986] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741754 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 986] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1741758 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9696] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node14&collection.configName=_default&newCollection=true&name=parent_b_replica_n4&action=CREATE&numShards=5&collection=parent&shard=b&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4432 2> 1741780 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9699] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node17&collection.configName=_default&newCollection=true&name=parent_d_replica_n7&action=CREATE&numShards=5&collection=parent&shard=d&wt=javabin&version=2&replicaType=NRT} status=0 QTime=4449 2> 1741792 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c:parent s: r: x: t:null-9692] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas 2> 1741794 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c:parent s: r: x: t:null-9692] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={shards=a,b,c,d,e&collection.configName=_default&name=parent&router.name=implicit&nrtReplicas=2&action=CREATE&numShards=3&wt=javabin&version=2} status=0 QTime=4900 2> 1742775 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 
2> 1742775 INFO (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (zkCallback-3660-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742775 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9703] o.a.s.s.ManagedIndexSchema Persisted managed schema version 4 at /configs/_default/managed-schema.xml 2> 1742776 INFO (zkCallback-3658-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742777 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 
2> 1742778 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742778 INFO (zkCallback-3660-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742779 INFO (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742780 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742780 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742780 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742781 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742781 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742782 INFO (zkCallback-3658-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742780 INFO (zkCallback-3668-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742782 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742783 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742783 INFO (zkCallback-3660-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742784 INFO (zkCallback-3664-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742785 INFO (zkCallback-3668-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742787 INFO (zkCallback-3660-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742787 INFO (zkCallback-3664-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742787 INFO (zkCallback-3660-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742790 INFO (zkCallback-3668-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 
2> 1742791 INFO (zkCallback-3660-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742793 INFO (zkCallback-3666-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742793 INFO (zkCallback-3668-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742795 INFO (zkCallback-3658-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742796 INFO (zkCallback-3666-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742798 INFO (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742799 INFO (zkCallback-3658-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742799 INFO (zkCallback-3664-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader A schema change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/configs/_default/managed-schema.xml zxid: 990, has occurred - updating schema from ZooKeeper ... 2> 1742802 INFO (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742802 INFO (zkCallback-3664-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Retrieved schema version 4 from Zookeeper 2> 1742930 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742944 INFO (zkCallback-3658-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742945 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742945 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742945 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742945 INFO (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742945 INFO (zkCallback-3668-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742945 INFO (zkCallback-3668-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742953 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742953 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742954 INFO (zkCallback-3664-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742954 INFO (zkCallback-3666-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742954 INFO (zkCallback-3660-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742955 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742955 INFO (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742955 INFO (zkCallback-3666-thread-1) [n: 
c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742957 INFO (zkCallback-3664-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742957 INFO (zkCallback-3658-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742958 INFO (OverseerCollectionConfigSetProcessor-72077900700319752-127.0.0.1:35379_solr-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper 2> 1742959 INFO (zkCallback-3660-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1742960 INFO (zkCallback-3660-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Schema name=default-config 2> 1747186 INFO (searcherExecutor-3714-thread-1-processing-127.0.0.1:44957_solr children_a_replica_n6 children a core_node16) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:] o.a.s.c.QuerySenderListener QuerySenderListener done. 2> 1747189 INFO (searcherExecutor-3714-thread-1-processing-127.0.0.1:44957_solr children_a_replica_n6 children a core_node16) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:] o.a.s.c.SolrCore Registered new searcher autowarm time: 7 ms 2> 1747239 INFO (zkCallback-3668-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1747240 INFO (zkCallback-3668-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 4446 ms 2> 1747247 INFO (zkCallback-3666-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1747248 INFO (zkCallback-3666-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 4451 ms 2> 1747250 INFO (zkCallback-3658-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1747251 INFO (zkCallback-3658-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 4468 ms 2> 1747253 INFO (zkCallback-3660-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1747254 INFO (zkCallback-3660-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 4475 ms 2> 1747255 INFO (zkCallback-3658-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1747255 INFO (zkCallback-3658-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 4455 ms 2> 1747255 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1747255 INFO (zkCallback-3658-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 4474 ms 2> 1747259 INFO (zkCallback-3660-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748274 INFO (zkCallback-3660-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 4468 ms 2> 1748277 INFO (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748278 INFO (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5475 ms 2> 1748278 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema 
default-config/1.6 with uniqueid field id 2> 1748278 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5496 ms 2> 1748281 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748282 INFO (zkCallback-3666-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5501 ms 2> 1748283 INFO (zkCallback-3660-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748283 INFO (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748283 INFO (zkCallback-3660-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5496 ms 2> 1748283 INFO (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5504 ms 2> 1748285 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748285 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5502 ms 2> 1748287 INFO (zkCallback-3668-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748287 INFO (zkCallback-3668-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5501 ms 2> 1748289 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748289 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5508 ms 2> 1748291 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748292 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5508 ms 2> 1748309 INFO (zkCallback-3664-thread-3) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748310 INFO (zkCallback-3664-thread-3) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5522 ms 2> 1748321 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748321 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5540 ms 2> 1748327 INFO (zkCallback-3664-thread-4) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748327 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.s.IndexSchema Loaded schema default-config/1.6 with uniqueid field id 2> 1748327 INFO (zkCallback-3664-thread-4) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5524 ms 2> 1748327 INFO (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.s.ZkIndexSchemaReader Finished refreshing schema in 5549 ms 2> 1769441 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker-SendThread(127.0.0.1:33915)) [n: c: s: r: x: t:] o.a.z.ClientCnxn Client session timed out, have not heard from server in 27491ms for session id 0x10012780366000d 2> 1769485 WARN (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker-SendThread(127.0.0.1:33915)) [n: c: s: r: x: t:] o.a.z.ClientCnxn Session 0x10012780366000d for server 
localhost/127.0.0.1:33915, Closing socket connection. Attempting reconnect except it is a SessionExpiredException. 2> => org.apache.zookeeper.ClientCnxn$SessionTimeoutException: Client session timed out, have not heard from server in 27491ms for session id 0x10012780366000d 2> at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1242) 2> org.apache.zookeeper.ClientCnxn$SessionTimeoutException: Client session timed out, have not heard from server in 27491ms for session id 0x10012780366000d 2> at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1242) [zookeeper-3.9.1.jar:3.9.1] 2> 1769570 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9703] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/a to Terms{values={core_node14=1, core_node16=1}, version=2} for ensureHighestTermsAreNotZero 2> 1769602 WARN (zkConnectionManagerCallback-3702-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager Watcher org.apache.solr.common.cloud.ConnectionManager@6974560f name: ZooKeeperConnection Watcher:127.0.0.1:33915/solr got event WatchedEvent state:Disconnected type:None path:null zxid: -1 path: null type: None 2> 1769603 WARN (zkConnectionManagerCallback-3702-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has disconnected 2> 1769709 ERROR (qtp1283346820-5584) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9703] o.a.s.s.ManagedIndexSchema Bad version when trying to persist schema using 3 due to: 2> => org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) 2> org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:53) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.ZooKeeper.setData(ZooKeeper.java:2181) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.solr.common.cloud.SolrZkClient.lambda$setData$7(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:70) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.SolrZkClient.setData(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchemaToZooKeeper(ManagedIndexSchema.java:194) [main/:?] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchema(ManagedIndexSchema.java:131) [main/:?] 2> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:515) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:123) [main/:?] 
2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:342) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readIterator(JavaBinUpdateRequestCodec.java:286) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:340) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readNamedList(JavaBinUpdateRequestCodec.java:236) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:305) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:194) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:126) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:135) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:74) [main/:?] 2> at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:102) [main/:?] 2> at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:100) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 1769709 ERROR (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9703] o.a.s.s.ManagedIndexSchema Bad version when trying to persist schema using 3 due to: 2> => org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) 2> org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:53) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.ZooKeeper.setData(ZooKeeper.java:2181) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.solr.common.cloud.SolrZkClient.lambda$setData$7(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:70) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.SolrZkClient.setData(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchemaToZooKeeper(ManagedIndexSchema.java:194) [main/:?] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchema(ManagedIndexSchema.java:131) [main/:?] 2> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:515) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:123) [main/:?] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:342) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readIterator(JavaBinUpdateRequestCodec.java:286) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:340) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readNamedList(JavaBinUpdateRequestCodec.java:236) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:305) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:194) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:126) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:135) [main/:?] 
2> at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:74) [main/:?] 2> at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:102) [main/:?] 2> at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:100) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 1769709 ERROR (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9703] o.a.s.s.ManagedIndexSchema Bad version when trying to persist schema using 3 due to: 2> => org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) 2> org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:53) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.ZooKeeper.setData(ZooKeeper.java:2181) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.solr.common.cloud.SolrZkClient.lambda$setData$7(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:70) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.SolrZkClient.setData(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchemaToZooKeeper(ManagedIndexSchema.java:194) [main/:?] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchema(ManagedIndexSchema.java:131) [main/:?] 2> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:515) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:123) [main/:?] 
2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:342) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readIterator(JavaBinUpdateRequestCodec.java:286) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:340) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readNamedList(JavaBinUpdateRequestCodec.java:236) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:305) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:194) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:126) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:135) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:74) [main/:?] 2> at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:102) [main/:?] 2> at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:100) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 1769736 INFO (qtp1283346820-5584) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9703] o.a.s.s.ManagedIndexSchema Failed to persist managed schema at /configs/_default/managed-schema.xml - version mismatch 2> 1769738 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9703] o.a.s.s.ManagedIndexSchema Failed to persist managed schema at /configs/_default/managed-schema.xml - version mismatch 2> 1769739 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9703] o.a.s.s.ManagedIndexSchema Failed to persist managed schema at /configs/_default/managed-schema.xml - version mismatch 2> 1769888 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9703] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/c to Terms{values={core_node13=1, core_node19=1}, version=2} for ensureHighestTermsAreNotZero 2> 1769891 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s:a r:core_node14 x:children_a_replica_n4 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:44957/solr/children_a_replica_n6/&wt=javabin&version=2}{add=[3031 (1788737063160381440), 3032 (1788737064239366144), 3033 (1788737064241463296), 3034 (1788737064243560448), 3035 (1788737064244609024), 3036 (1788737064246706176), 3037 (1788737064247754752), 3038 (1788737064249851904), 3039 (1788737064250900480), 3040 (1788737064251949056), ... (22 adds)]} 0 25737 2> 1769899 INFO (qtp1283346820-5584) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9703] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/e to Terms{values={core_node17=1, core_node15=1}, version=2} for ensureHighestTermsAreNotZero 2> 1769877 ERROR (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9703] o.a.s.s.ManagedIndexSchema Bad version when trying to persist schema using 3 due to: 2> => org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) 2> org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:53) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.ZooKeeper.setData(ZooKeeper.java:2181) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.solr.common.cloud.SolrZkClient.lambda$setData$7(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:70) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.SolrZkClient.setData(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchemaToZooKeeper(ManagedIndexSchema.java:194) [main/:?] 
2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchema(ManagedIndexSchema.java:131) [main/:?] 2> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:515) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:123) [main/:?] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:342) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readIterator(JavaBinUpdateRequestCodec.java:286) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:340) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readNamedList(JavaBinUpdateRequestCodec.java:236) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:305) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:194) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:126) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:135) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:74) [main/:?] 2> at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:102) [main/:?] 2> at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:100) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at 
java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 1769886 ERROR (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9703] o.a.s.s.ManagedIndexSchema Bad version when trying to persist schema using 3 due to: 2> => org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) 2> org.apache.zookeeper.KeeperException$BadVersionException: KeeperErrorCode = BadVersion for /configs/_default/managed-schema.xml 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:121) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.KeeperException.create(KeeperException.java:53) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.zookeeper.ZooKeeper.setData(ZooKeeper.java:2181) ~[zookeeper-3.9.1.jar:3.9.1] 2> at org.apache.solr.common.cloud.SolrZkClient.lambda$setData$7(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:70) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.cloud.SolrZkClient.setData(SolrZkClient.java:479) ~[solr-solrj-zookeeper-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchemaToZooKeeper(ManagedIndexSchema.java:194) [main/:?] 2> at org.apache.solr.schema.ManagedIndexSchema.persistManagedSchema(ManagedIndexSchema.java:131) [main/:?] 2> at org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:515) [main/:?] 2> at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:123) [main/:?] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:342) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readIterator(JavaBinUpdateRequestCodec.java:286) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:340) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$StreamingCodec.readNamedList(JavaBinUpdateRequestCodec.java:236) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:305) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:285) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:194) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:126) [solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:135) [main/:?] 
2> at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:74) [main/:?] 2> at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:102) [main/:?] 2> at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:100) [main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:461) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 1769902 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9703] o.a.s.s.ManagedIndexSchema Failed to persist managed schema at /configs/_default/managed-schema.xml - version mismatch 2> 1769905 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9703] o.a.s.s.ManagedIndexSchema Failed to persist managed schema at /configs/_default/managed-schema.xml - version mismatch 2> 1770048 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9703] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/d to Terms{values={core_node18=1, core_node11=1}, version=2} for ensureHighestTermsAreNotZero 2> 1770059 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9703] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/b to Terms{values={core_node20=1, core_node12=1}, version=2} for ensureHighestTermsAreNotZero 2> 1770308 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c:children s:c r:core_node13 x:children_c_replica_n3 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:36867/solr/children_c_replica_n9/&wt=javabin&version=2}{add=[5051 (1788737091366027264), 5052 (1788737091456204800), 5053 (1788737091457253376), 5054 (1788737091459350528), 5055 (1788737091462496256), 5056 (1788737091465641984), 5057 (1788737091467739136), 5058 (1788737091469836288), 5059 (1788737091471933440), 5060 (1788737091475079168), ... 
(11 adds)]} 0 81 2> 1770311 INFO (qtp1167887541-5571) [n:127.0.0.1:36867_solr c:children s:c r:core_node19 x:children_c_replica_n9 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&distrib.from=https://127.0.0.1:44957/solr/children_a_replica_n6/&wt=javabin&version=2}{add=[5051 (1788737091366027264), 5052 (1788737091456204800), 5053 (1788737091457253376), 5054 (1788737091459350528), 5055 (1788737091462496256), 5056 (1788737091465641984), 5057 (1788737091467739136), 5058 (1788737091469836288), 5059 (1788737091471933440), 5060 (1788737091475079168), ... (11 adds)]} 0 26132 2> 1770313 INFO (qtp1167887541-5579) [n:127.0.0.1:36867_solr c:children s:e r:core_node17 x:children_e_replica_n7 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:36209/solr/children_e_replica_n5/&wt=javabin&version=2}{add=[2021 (1788737091367075840), 2022 (1788737091456204800), 2023 (1788737091457253376), 2024 (1788737091459350528), 2025 (1788737091462496256), 2026 (1788737091465641984), 2027 (1788737091467739136), 2028 (1788737091469836288), 2029 (1788737091471933440), 2030 (1788737091476127744), ... (22 adds)]} 0 85 2> 1770315 INFO (qtp1283346820-5584) [n:127.0.0.1:36209_solr c:children s:e r:core_node15 x:children_e_replica_n5 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&distrib.from=https://127.0.0.1:44957/solr/children_a_replica_n6/&wt=javabin&version=2}{add=[2021 (1788737091367075840), 2022 (1788737091456204800), 2023 (1788737091457253376), 2024 (1788737091459350528), 2025 (1788737091462496256), 2026 (1788737091465641984), 2027 (1788737091467739136), 2028 (1788737091469836288), 2029 (1788737091471933440), 2030 (1788737091476127744), ... (22 adds)]} 0 26269 2> 1770316 INFO (qtp484878969-5565) [n:127.0.0.1:44957_solr c:children s:d r:core_node18 x:children_d_replica_n8 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:34035/solr/children_d_replica_n1/&wt=javabin&version=2}{add=[1011 (1788737091507585024), 1012 (1788737091591471104), 1013 (1788737091593568256), 1014 (1788737091596713984), 1015 (1788737091598811136), 1016 (1788737091600908288), 1017 (1788737091603005440), 1018 (1788737091605102592), 1019 (1788737091622928384), 1020 (1788737091623976960), ... (22 adds)]} 0 88 2> 1770318 INFO (qtp266123491-5585) [n:127.0.0.1:34035_solr c:children s:d r:core_node11 x:children_d_replica_n1 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&distrib.from=https://127.0.0.1:44957/solr/children_a_replica_n6/&wt=javabin&version=2}{add=[1011 (1788737091507585024), 1012 (1788737091591471104), 1013 (1788737091593568256), 1014 (1788737091596713984), 1015 (1788737091598811136), 1016 (1788737091600908288), 1017 (1788737091603005440), 1018 (1788737091605102592), 1019 (1788737091622928384), 1020 (1788737091623976960), ... 
(22 adds)]} 0 26272 2> 1770322 INFO (qtp1283346820-5578) [n:127.0.0.1:36209_solr c:children s:b r:core_node12 x:children_b_replica_n2 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:34035/solr/children_b_replica_n10/&wt=javabin&version=2}{add=[4041 (1788737091507585024), 4042 (1788737091589373952), 4043 (1788737091591471104), 4044 (1788737091593568256), 4045 (1788737091595665408), 4046 (1788737091596713984), 4047 (1788737091598811136), 4048 (1788737091600908288), 4049 (1788737091603005440), 4050 (1788737091604054016), ... (33 adds)]} 0 94 2> 1770324 INFO (qtp266123491-5590) [n:127.0.0.1:34035_solr c:children s:b r:core_node20 x:children_b_replica_n10 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={update.distrib=TOLEADER&distrib.from=https://127.0.0.1:44957/solr/children_a_replica_n6/&wt=javabin&version=2}{add=[4041 (1788737091507585024), 4042 (1788737091589373952), 4043 (1788737091591471104), 4044 (1788737091593568256), 4045 (1788737091595665408), 4046 (1788737091596713984), 4047 (1788737091598811136), 4048 (1788737091600908288), 4049 (1788737091603005440), 4050 (1788737091604054016), ... (33 adds)]} 0 26277 2> 1770328 INFO (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9703] o.a.s.u.p.LogUpdateProcessorFactory webapp=/solr path=/update params={_stateVer_=children:31&commit=true&wt=javabin&version=2}{add=[1011, 1012, 1013, 1014, 1015, 1016, 1017, 1018, 1019, 1020, ... (110 adds)]} 0 28397 2> 1770330 ERROR (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9703] o.a.s.h.RequestHandlerBase Server exception 2> => java.io.IOException: Task queue processing has stalled for 25115 ms with 0 remaining elements to process. 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.blockUntilFinished(ConcurrentUpdateHttp2SolrClient.java:520) 2> java.io.IOException: Task queue processing has stalled for 25115 ms with 0 remaining elements to process. 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.blockUntilFinished(ConcurrentUpdateHttp2SolrClient.java:520) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.update.StreamingSolrClients.blockUntilFinished(StreamingSolrClients.java:91) ~[main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.blockAndDoRetries(SolrCmdDistributor.java:282) ~[main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.distribCommit(SolrCmdDistributor.java:269) ~[main/:?] 2> at org.apache.solr.update.processor.DistributedZkUpdateProcessor.processCommit(DistributedZkUpdateProcessor.java:219) ~[main/:?] 2> at org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processCommit(LogUpdateProcessorFactory.java:173) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 
2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:68) ~[main/:?] 2> at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:104) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) [main/:?] 2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:132) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 1770347 ERROR (qtp484878969-5589) [n:127.0.0.1:44957_solr c:children s:a r:core_node16 x:children_a_replica_n6 t:null-9703] o.a.s.s.HttpSolrCall 500 Exception 2> => java.io.IOException: Task queue processing has stalled for 25115 ms with 0 remaining elements to process. 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.blockUntilFinished(ConcurrentUpdateHttp2SolrClient.java:520) 2> java.io.IOException: Task queue processing has stalled for 25115 ms with 0 remaining elements to process. 2> at org.apache.solr.client.solrj.impl.ConcurrentUpdateHttp2SolrClient.blockUntilFinished(ConcurrentUpdateHttp2SolrClient.java:520) ~[solr-solrj-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.apache.solr.update.StreamingSolrClients.blockUntilFinished(StreamingSolrClients.java:91) ~[main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.blockAndDoRetries(SolrCmdDistributor.java:282) ~[main/:?] 2> at org.apache.solr.update.SolrCmdDistributor.distribCommit(SolrCmdDistributor.java:269) ~[main/:?] 2> at org.apache.solr.update.processor.DistributedZkUpdateProcessor.processCommit(DistributedZkUpdateProcessor.java:219) ~[main/:?] 2> at org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processCommit(LogUpdateProcessorFactory.java:173) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 
2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.update.processor.UpdateRequestProcessor.processCommit(UpdateRequestProcessor.java:66) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerUtils.handleCommit(RequestHandlerUtils.java:68) ~[main/:?] 2> at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:104) ~[main/:?] 2> at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:226) ~[main/:?] 2> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2908) ~[main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.executeCoreRequest(HttpSolrCall.java:875) [main/:?] 2> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:561) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.dispatch(SolrDispatchFilter.java:262) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.lambda$doFilter$0(SolrDispatchFilter.java:219) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.traceHttpRequestExecution2(ServletUtils.java:246) [main/:?] 2> at org.apache.solr.servlet.ServletUtils.rateLimitRequest(ServletUtils.java:215) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:213) [main/:?] 2> at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:195) [main/:?] 
2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.apache.solr.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:187) [solr-test-framework-9.4.1.jar:9.4.1 (not a git checkout) - builduser] 2> at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:202) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1635) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:527) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1580) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:221) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1384) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:176) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:484) [jetty-servlet-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1553) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:174) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1306) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:129) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:301) [jetty-rewrite-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:790) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:122) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.Server.handle(Server.java:563) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel$RequestDispatchable.dispatch(HttpChannel.java:1598) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:753) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:501) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:287) [jetty-server-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:314) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:100) [jetty-io-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:132) [jetty-io-10.0.19.jar:10.0.19] 2> at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:969) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.doRunJob(QueuedThreadPool.java:1194) [jetty-util-10.0.19.jar:10.0.19] 2> at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1149) [jetty-util-10.0.19.jar:10.0.19] 2> at java.base/java.lang.Thread.run(Thread.java:829) [?:?] 2> 1770372 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.s.i.CloudSolrClient request was not communication error it seems 2> 1770372 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.s.i.CloudSolrClient Request to collection [children] failed due to (500) org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:44957/solr/children_a_replica_n6: Task queue processing has stalled for 25115 ms with 0 remaining elements to process., retry=0 maxRetries=5 commError=false errorCode=500 2> 1770377 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.s.j.ShardToShardJoinAbstract logic complete ... deleting the parent and children collections 2> 1770485 INFO (OverseerThreadFactory-3682-thread-3) [n: c:parent s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Executing Collection Cmd=action=UNLOAD&deleteInstanceDir=true&deleteDataDir=true, asyncId=null 2> 1770500 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x:parent_a_replica_n8 t:null-9706] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.a.replica_n8 tag=null 2> 1770500 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x:parent_a_replica_n1 t:null-9705] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.a.replica_n1 tag=null 2> 1770500 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x:parent_b_replica_n4 t:null-9707] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.b.replica_n4 tag=null 2> 1770516 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x:parent_a_replica_n8 t:null-9706] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@34eae49f parent_a_replica_n8 2> 1770516 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x:parent_a_replica_n1 t:null-9705] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7613c8b parent_a_replica_n1 2> 1770516 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x:parent_b_replica_n4 t:null-9707] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2f90a086 parent_b_replica_n4 2> 1770516 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x:parent_b_replica_n5 t:null-9708] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.b.replica_n5 tag=null 2> 1770516 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x:parent_b_replica_n5 t:null-9708] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@53e4f9b4 parent_b_replica_n5 2> 1770517 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x:parent_b_replica_n5 t:null-9708] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.b.replica_n5 tag=SolrCore@53e4f9b4 2> 1770517 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x:parent_a_replica_n8 t:null-9706] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.a.replica_n8 tag=SolrCore@34eae49f 2> 1770517 INFO 
(qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x:parent_b_replica_n5 t:null-9708] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.b.leader tag=SolrCore@53e4f9b4 2> 1770517 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x:parent_a_replica_n8 t:null-9706] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.a.leader tag=SolrCore@34eae49f 2> 1770517 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x:parent_a_replica_n1 t:null-9705] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.a.replica_n1 tag=SolrCore@7613c8b 2> 1770517 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x:parent_b_replica_n4 t:null-9707] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.b.replica_n4 tag=SolrCore@2f90a086 2> 1770518 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x:parent_a_replica_n1 t:null-9705] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.a.leader tag=SolrCore@7613c8b 2> 1770518 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x:parent_b_replica_n4 t:null-9707] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.b.leader tag=SolrCore@2f90a086 2> 1770518 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x:parent_c_replica_n9 t:null-9709] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.c.replica_n9 tag=null 2> 1770518 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x:parent_c_replica_n9 t:null-9709] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7f5c219d parent_c_replica_n9 2> 1770519 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x:parent_c_replica_n9 t:null-9709] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.c.replica_n9 tag=SolrCore@7f5c219d 2> 1770519 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x:parent_c_replica_n9 t:null-9709] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.c.leader tag=SolrCore@7f5c219d 2> 1770523 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x:parent_b_replica_n5 t:null-9708] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770523 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x:parent_c_replica_n10 t:null-9710] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.c.replica_n10 tag=null 2> 1770526 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x:parent_c_replica_n9 t:null-9709] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770526 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x:parent_a_replica_n1 t:null-9705] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770523 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x:parent_b_replica_n4 t:null-9707] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770523 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x:parent_a_replica_n8 t:null-9706] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 1770530 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x:parent_c_replica_n10 t:null-9710] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7abbac49 parent_c_replica_n10 2> 1770560 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x:parent_c_replica_n10 t:null-9710] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.c.replica_n10 tag=SolrCore@7abbac49 2> 1770561 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x:parent_c_replica_n10 t:null-9710] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.c.leader tag=SolrCore@7abbac49 2> 1770565 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x:parent_d_replica_n6 t:null-9711] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.d.replica_n6 tag=null 2> 1770569 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x:parent_d_replica_n6 t:null-9711] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@450ea3c4 parent_d_replica_n6 2> 1770569 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x:parent_d_replica_n6 t:null-9711] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.d.replica_n6 tag=SolrCore@450ea3c4 2> 1770569 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x:parent_d_replica_n6 t:null-9711] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.d.leader tag=SolrCore@450ea3c4 2> 1770572 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x:parent_c_replica_n10 t:null-9710] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770580 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x:parent_d_replica_n6 t:null-9711] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770583 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x:parent_d_replica_n7 t:null-9712] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.d.replica_n7 tag=null 2> 1770584 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x:parent_d_replica_n7 t:null-9712] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@1ee52e95 parent_d_replica_n7 2> 1770584 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x:parent_d_replica_n7 t:null-9712] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.d.replica_n7 tag=SolrCore@1ee52e95 2> 1770584 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x:parent_d_replica_n7 t:null-9712] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.d.leader tag=SolrCore@1ee52e95 2> 1770585 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x:parent_e_replica_n3 t:null-9713] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.e.replica_n3 tag=null 2> 1770585 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x:parent_d_replica_n7 t:null-9712] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 
2> 1770588 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x:parent_e_replica_n3 t:null-9713] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@279b23b9 parent_e_replica_n3 2> 1770589 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x:parent_e_replica_n3 t:null-9713] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.e.replica_n3 tag=SolrCore@279b23b9 2> 1770589 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x:parent_e_replica_n3 t:null-9713] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.e.leader tag=SolrCore@279b23b9 2> 1770589 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x:parent_e_replica_n2 t:null-9714] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.e.replica_n2 tag=null 2> 1770593 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x:parent_e_replica_n2 t:null-9714] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@31138d20 parent_e_replica_n2 2> 1770593 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x:parent_e_replica_n2 t:null-9714] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.parent.e.replica_n2 tag=SolrCore@31138d20 2> 1770593 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x:parent_e_replica_n2 t:null-9714] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.parent.e.leader tag=SolrCore@31138d20 2> 1770597 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x:parent_e_replica_n2 t:null-9714] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770597 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x:parent_e_replica_n3 t:null-9713] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770623 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9707] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/b to Terms{values={core_node15=0}, version=2} for removeTerm 2> 1770623 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9708] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1770624 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9711] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/d to Terms{values={core_node17=0}, version=2} for removeTerm 2> 1770625 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9705] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/a to Terms{values={core_node18=0}, version=2} for removeTerm 2> 1770626 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9710] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/c to Terms{values={core_node19=0}, version=2} for removeTerm 2> 1770628 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9712] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1770629 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9707] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1770630 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9705] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 
2> 1770631 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9706] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1770632 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9709] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1770633 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9710] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1770635 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9708] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/b to Terms{values={}, version=3} for removeTerm 2> 1770640 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9714] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/e to Terms{values={core_node13=0}, version=2} for removeTerm 2> 1770645 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9713] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1770645 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9712] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/d to Terms{values={}, version=3} for removeTerm 2> 1770649 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9706] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/a to Terms{values={}, version=3} for removeTerm 2> 1770649 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9709] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/c to Terms{values={}, version=3} for removeTerm 2> 1770650 WARN (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 1770652 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9712] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1770659 INFO (zkCallback-3664-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1031] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1770660 INFO (qtp780434683-5574) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9705] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_a_replica_n1&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=164 2> 1770659 INFO (zkCallback-3660-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1031] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1770659 INFO (zkCallback-3658-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1031] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1770659 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1031] for collection [parent] has occurred - updating... 
(live nodes size: [5]) 2> 1770660 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9713] o.a.s.c.ZkShardTerms Successful update of terms at /collections/parent/terms/e to Terms{values={}, version=3} for removeTerm 2> 1770659 INFO (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1031] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1770663 WARN (zkCallback-3664-thread-2) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 1770665 WARN (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 1770671 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9713] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1770673 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1040] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1770674 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1040] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1770674 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1040] for collection [parent] has occurred - updating... (live nodes size: [5]) 2> 1770674 INFO (qtp266123491-5594) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9712] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_d_replica_n7&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=113 2> 1770674 INFO (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/parent/state.json zxid: 1040] for collection [parent] has occurred - updating... 
(live nodes size: [5]) 2> 1770693 INFO (qtp1283346820-5576) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9708] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_b_replica_n5&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=194 2> 1770699 INFO (qtp1167887541-5563) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9710] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_c_replica_n10&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=181 2> 1770706 INFO (qtp1283346820-5570) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9713] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_e_replica_n3&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=125 2> 1770713 INFO (qtp266123491-5592) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9707] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_b_replica_n4&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=215 2> 1770717 INFO (qtp484878969-5581) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9706] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_a_replica_n8&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=220 2> 1770724 INFO (qtp780434683-5580) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9709] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_c_replica_n9&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=224 2> 1770730 INFO (qtp484878969-5591) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9711] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_d_replica_n6&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=210 2> 1770734 INFO (qtp1167887541-5575) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9714] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=parent_e_replica_n2&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=153 2> 1770749 INFO (OverseerThreadFactory-3682-thread-3) [n: c:parent s: r: x: t:] o.a.s.c.c.ZkStateReader /collections/parent/state.json is deleted, stop watching children 2> 1770872 INFO (qtp780434683-5572) [n:127.0.0.1:35379_solr c:parent s: r: x: t:null-9704] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={name=parent&action=DELETE&wt=javabin&version=2} status=0 QTime=470 2> 1770904 INFO (OverseerCollectionConfigSetProcessor-72077900700319752-127.0.0.1:35379_solr-n_0000000000) [n: c: s: r: x: t:] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. 
Requestor may have disconnected from ZooKeeper 2> 1770907 INFO (OverseerThreadFactory-3682-thread-4) [n: c:children s: r: x: t:] o.a.s.c.a.c.CollectionHandlingUtils Executing Collection Cmd=action=UNLOAD&deleteInstanceDir=true&deleteDataDir=true, asyncId=null 2> 1770913 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x:children_a_replica_n4 t:null-9716] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.a.replica_n4 tag=null 2> 1770918 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x:children_a_replica_n4 t:null-9716] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7d6ab1e8 children_a_replica_n4 2> 1770918 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x:children_a_replica_n4 t:null-9716] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.a.replica_n4 tag=SolrCore@7d6ab1e8 2> 1770918 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x:children_a_replica_n4 t:null-9716] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.a.leader tag=SolrCore@7d6ab1e8 2> 1770923 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x:children_a_replica_n6 t:null-9717] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.a.replica_n6 tag=null 2> 1770924 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x:children_a_replica_n6 t:null-9717] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@db3e41c children_a_replica_n6 2> 1770924 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x:children_a_replica_n6 t:null-9717] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.a.replica_n6 tag=SolrCore@db3e41c 2> 1770924 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x:children_a_replica_n6 t:null-9717] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.a.leader tag=SolrCore@db3e41c 2> 1770926 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x:children_a_replica_n6 t:null-9717] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() ... SKIPPED (unnecessary). 2> 1770927 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x:children_b_replica_n2 t:null-9718] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.b.replica_n2 tag=null 2> 1770931 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x:children_a_replica_n4 t:null-9716] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 
2> 1770932 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x:children_b_replica_n10 t:null-9719] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.b.replica_n10 tag=null 2> 1770938 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x:children_b_replica_n10 t:null-9719] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@3b3239c0 children_b_replica_n10 2> 1770935 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x:children_b_replica_n2 t:null-9718] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7f78b936 children_b_replica_n2 2> 1770938 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x:children_b_replica_n10 t:null-9719] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.b.replica_n10 tag=SolrCore@3b3239c0 2> 1770939 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x:children_b_replica_n10 t:null-9719] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.b.leader tag=SolrCore@3b3239c0 2> 1770939 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x:children_b_replica_n2 t:null-9718] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.b.replica_n2 tag=SolrCore@7f78b936 2> 1770939 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x:children_c_replica_n3 t:null-9720] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.c.replica_n3 tag=null 2> 1770939 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x:children_b_replica_n2 t:null-9718] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.b.leader tag=SolrCore@7f78b936 2> 1770942 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x:children_c_replica_n3 t:null-9720] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@55660309 children_c_replica_n3 2> 1770946 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x:children_c_replica_n3 t:null-9720] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.c.replica_n3 tag=SolrCore@55660309 2> 1770949 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x:children_c_replica_n3 t:null-9720] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.c.leader tag=SolrCore@55660309 2> 1770954 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c: s: r: x:children_c_replica_n9 t:null-9721] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.c.replica_n9 tag=null 2> 1770955 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x:children_b_replica_n2 t:null-9718] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 1770955 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x:children_c_replica_n3 t:null-9720] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 1770960 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c: s: r: x:children_c_replica_n9 t:null-9721] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@2739b353 children_c_replica_n9 2> 1770960 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x:children_b_replica_n10 t:null-9719] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 
2> 1770960 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c: s: r: x:children_c_replica_n9 t:null-9721] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.c.replica_n9 tag=SolrCore@2739b353 2> 1770960 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c: s: r: x:children_d_replica_n1 t:null-9722] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.d.replica_n1 tag=null 2> 1770960 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c: s: r: x:children_c_replica_n9 t:null-9721] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.c.leader tag=SolrCore@2739b353 2> 1770963 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c: s: r: x:children_d_replica_n1 t:null-9722] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7c3f4698 children_d_replica_n1 2> 1770964 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c: s: r: x:children_d_replica_n1 t:null-9722] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.d.replica_n1 tag=SolrCore@7c3f4698 2> 1770964 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c: s: r: x:children_d_replica_n1 t:null-9722] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.d.leader tag=SolrCore@7c3f4698 2> 1770970 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c: s: r: x:children_c_replica_n9 t:null-9721] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 1770984 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c: s: r: x:children_d_replica_n1 t:null-9722] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 1770990 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x:children_d_replica_n8 t:null-9723] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.d.replica_n8 tag=null 2> 1770992 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x:children_d_replica_n8 t:null-9723] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@533e4f60 children_d_replica_n8 2> 1770993 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x:children_d_replica_n8 t:null-9723] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.d.replica_n8 tag=SolrCore@533e4f60 2> 1770993 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x:children_d_replica_n8 t:null-9723] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.d.leader tag=SolrCore@533e4f60 2> 1771000 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x:children_d_replica_n8 t:null-9723] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 
2> 1771001 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x:children_e_replica_n5 t:null-9724] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.e.replica_n5 tag=null 2> 1771012 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x:children_e_replica_n5 t:null-9724] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@56dc2ec3 children_e_replica_n5 2> 1771013 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x:children_e_replica_n5 t:null-9724] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.e.replica_n5 tag=SolrCore@56dc2ec3 2> 1771013 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x:children_e_replica_n5 t:null-9724] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.e.leader tag=SolrCore@56dc2ec3 2> 1771020 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x:children_e_replica_n5 t:null-9724] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 1771020 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x:children_e_replica_n7 t:null-9725] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.e.replica_n7 tag=null 2> 1771021 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x:children_e_replica_n7 t:null-9725] o.a.s.c.SolrCore CLOSING SolrCore org.apache.solr.core.SolrCore@7c2ea7bb children_e_replica_n7 2> 1771021 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x:children_e_replica_n7 t:null-9725] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.children.e.replica_n7 tag=SolrCore@7c2ea7bb 2> 1771022 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x:children_e_replica_n7 t:null-9725] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.children.e.leader tag=SolrCore@7c2ea7bb 2> 1771030 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x:children_e_replica_n7 t:null-9725] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close() . 2> 1771042 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9717] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/a to Terms{values={core_node14=1}, version=3} for removeTerm 2> 1771058 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 1106] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1771058 INFO (zkCallback-3668-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 1106] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1771058 INFO (zkCallback-3666-thread-3) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 1106] for collection [children] has occurred - updating... 
(live nodes size: [5]) 2> 1771058 INFO (qtp484878969-5573) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9717] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_a_replica_n6&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=146 2> 1771058 INFO (zkCallback-3658-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 1106] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1771058 INFO (zkCallback-3660-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/children/state.json zxid: 1106] for collection [children] has occurred - updating... (live nodes size: [5]) 2> 1771611 INFO (zkConnectionManagerCallback-3702-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ConnectionManager zkClient has connected 2> 1771699 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9721] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/c to Terms{values={core_node13=1}, version=3} for removeTerm 2> 1771700 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9716] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/a to Terms{values={}, version=4} for removeTerm 2> 1771706 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9722] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/d to Terms{values={core_node18=1}, version=3} for removeTerm 2> 1771706 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9723] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1771712 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9718] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/b to Terms{values={core_node20=1}, version=3} for removeTerm 2> 1771713 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9716] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1771716 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9725] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/e to Terms{values={core_node15=1}, version=3} for removeTerm 2> 1771729 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9723] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/d to Terms{values={}, version=4} for removeTerm 2> 1771730 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9724] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1771731 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9720] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/c to Terms{values={}, version=4} for removeTerm 2> 1771732 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9719] o.a.s.c.ZkShardTerms Failed to save terms, version is not a match, retrying 2> 1771740 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9718] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 
2> 1771744 INFO (qtp780434683-5582) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9716] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_a_replica_n4&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=834 2> 1771746 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9725] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1771748 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9724] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/e to Terms{values={}, version=4} for removeTerm 2> 1771748 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9723] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1771754 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9719] o.a.s.c.ZkShardTerms Successful update of terms at /collections/children/terms/b to Terms{values={}, version=4} for removeTerm 2> 1771759 INFO (qtp266123491-5569) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9722] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_d_replica_n1&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=806 2> 1771763 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9720] o.a.s.c.ShardLeaderElectionContextBase No version found for ephemeral leader parent node, won't remove previous leader registration. 2> 1771766 INFO (qtp484878969-5593) [n:127.0.0.1:44957_solr c: s: r: x: t:null-9723] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_d_replica_n8&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=785 2> 1771770 INFO (qtp780434683-5564) [n:127.0.0.1:35379_solr c: s: r: x: t:null-9720] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_c_replica_n3&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=844 2> 1771773 WARN (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 1771773 WARN (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.c.LeaderElector Our node is no longer in line to be leader 2> 1771775 INFO (qtp1283346820-5562) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9718] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_b_replica_n2&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=855 2> 1771778 INFO (qtp1283346820-5568) [n:127.0.0.1:36209_solr c: s: r: x: t:null-9724] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_e_replica_n5&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=788 2> 1771784 INFO (qtp1167887541-5583) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9721] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_c_replica_n9&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=845 2> 1771790 INFO (qtp266123491-5577) [n:127.0.0.1:34035_solr c: s: r: x: t:null-9719] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores 
params={deleteInstanceDir=true&core=children_b_replica_n10&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=865 2> 1771796 INFO (qtp1167887541-5567) [n:127.0.0.1:36867_solr c: s: r: x: t:null-9725] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={deleteInstanceDir=true&core=children_e_replica_n7&qt=/admin/cores&deleteDataDir=true&action=UNLOAD&wt=javabin&version=2} status=0 QTime=799 2> 1771812 INFO (OverseerThreadFactory-3682-thread-4) [n: c:children s: r: x: t:] o.a.s.c.c.ZkStateReader /collections/children/state.json is deleted, stop watching children 2> 1771931 INFO (qtp780434683-5566) [n:127.0.0.1:35379_solr c:children s: r: x: t:null-9715] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={name=children&action=DELETE&wt=javabin&version=2} status=0 QTime=1047 2> 1771934 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.s.j.ShardToShardJoinAbstract succeeded ... shutting down now! 2> 1772051 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@2a789ec4{STOPPING}[10.0.19,sto=0] 2> 1772052 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@1346323a{STOPPING}[10.0.19,sto=0] 2> 1772052 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@725d94a3{STOPPING}[10.0.19,sto=0] 2> 1772053 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@37dd8df4{STOPPING}[10.0.19,sto=0] 2> 1772054 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.e.j.s.Server Stopped Server@328df163{STOPPING}[10.0.19,sto=0] 2> 1772064 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@4dcb8271{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:0} 2> 1772065 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@344d64e7{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:0} 2> 1772066 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@6c105610{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:0} 2> 1772066 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@35bbdf47{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:0} 2> 1772068 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@6ed9f03c{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 1772068 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@96ac04c{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 1772068 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@10e9a287{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 1772068 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@29213bd9{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 1772070 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.e.j.s.AbstractConnector Stopped ServerConnector@45eb814e{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:0} 2> 1772071 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.e.j.s.h.ContextHandler Stopped 
o.e.j.s.ServletContextHandler@359fff57{/solr,file:///build/solr/src/solr-9.4.1/solr/core/build/tmp/tests-cwd/,STOPPED} 2> 1772094 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1641641563 2> 1772094 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=552792440 2> 1772094 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1831194361 2> 1772094 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=258595260 2> 1772095 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:36209_solr 2> 1772095 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:36867_solr 2> 1772095 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:35379_solr 2> 1772094 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1440030786 2> 1772095 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:44957_solr 2> 1772096 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:34035_solr 2> 1772101 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 1772101 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 1772101 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:35379_solr as DOWN 2> 1772101 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:36209_solr as DOWN 2> 1772103 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 1772103 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:36867_solr as DOWN 2> 1772104 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 1772104 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish this node as DOWN... 2> 1772104 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:44957_solr as DOWN 2> 1772104 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.c.ZkController Publish node=127.0.0.1:34035_solr as DOWN 2> 1772104 INFO (zkCallback-3668-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (0) 2> 1772104 INFO (zkCallback-3660-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (0) 2> 1772105 INFO (zkCallback-3666-thread-2) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (0) 2> 1772105 INFO (zkCallback-3658-thread-4) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (5) -> (0) 2> 1772105 INFO (zkCallback-3664-thread-1) [n: c: s: r: x: t:] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... 
(5) -> (0) 2> 1772127 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 1772127 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 1772127 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 1772134 INFO (jetty-closer-3827-thread-2) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 1772135 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 1772135 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 1772135 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 1772136 INFO (jetty-closer-3827-thread-3) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 1772136 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 1772136 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 1772136 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 1772136 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 1772136 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 1772136 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 1772141 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null 2> 1772142 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null 2> 1772142 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null 2> 1772142 INFO (jetty-closer-3827-thread-4) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 1772142 INFO (jetty-closer-3827-thread-1) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 1772142 INFO (jetty-closer-3827-thread-5) [n: c: s: r: x: t:] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null 2> 1772219 INFO (closeThreadPool-3842-thread-1) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077900700319752-127.0.0.1:35379_solr-n_0000000000) closing 2> 1772224 INFO (OverseerStateUpdate-72077900700319752-127.0.0.1:35379_solr-n_0000000000) [n:127.0.0.1:35379_solr c: s: r: x: t:] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:35379_solr 2> 1772248 INFO (closeThreadPool-3842-thread-2) [n: c: s: r: x: t:] o.a.s.c.Overseer Overseer (id=72077900700319752-127.0.0.1:35379_solr-n_0000000000) closing 2> 1772359 INFO (jetty-closer-3827-thread-1) [n: c: s: r: 
x: t:] o.a.s.c.Overseer Overseer (id=72077900700319752-127.0.0.1:35379_solr-n_0000000000) closing
2> 1772366 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Shutting down ZkTestServer.
2> 1772583 WARN (ZkTestServer Run Thread) [n: c: s: r: x: t:] o.a.s.c.ZkTestServer Watch limit violations:
2> Maximum concurrent create/delete watches above limit:
2>
2>  20  /solr/configs/_default/managed-schema.xml
2>  11  /solr/collections/children/terms/c
2>  11  /solr/collections/children/terms/a
2>  11  /solr/collections/children/terms/d
2>  11  /solr/collections/children/terms/e
2>  10  /solr/collections/children/terms/b
2>   9  /solr/collections/parent/terms/b
2>   9  /solr/collections/parent/terms/d
2>   8  /solr/collections/parent/terms/a
2>   8  /solr/collections/parent/terms/c
2>   8  /solr/collections/parent/terms/e
2>   6  /solr/aliases.json
2>   5  /solr/packages.json
2>   5  /solr/security.json
2>   5  /solr/configs/_default
2>   5  /solr/collections/children/collectionprops.json
2>
2> Maximum concurrent data watches above limit:
2>
2>  48  /solr/collections/parent/state.json
2>  38  /solr/collections/children/state.json
2>  20  /solr/configs/_default/managed-schema.xml
2>  12  /solr/clusterprops.json
2>
2> Maximum concurrent children watches above limit:
2>
2>  30  /solr/collections
2>  21  /solr/live_nodes
2>  17  /solr/overseer/collection-queue-work
2>   5  /solr/collections/parent/state.json
2>   5  /solr/collections/children/state.json
2>
2> 1772657 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Closing ErrorLogMuter-regex-227 after mutting 0 log messages
2> 1772658 INFO (SUITE-ShardJoinImplicitTest-seed#[246C98A4C257C021]-worker) [n: c: s: r: x: t:] o.a.s.u.ErrorLogMuter Creating ErrorLogMuter-regex-228 for ERROR logs matching regex: ignore_exception
> org.apache.solr.client.solrj.impl.BaseHttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:44957/solr/children_a_replica_n6: Task queue processing has stalled for 25115 ms with 0 remaining elements to process.
> at __randomizedtesting.SeedInfo.seed([246C98A4C257C021]:0) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:747) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:263) > at app//org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:244) > at app//org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:386) > at app//org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:352) > at app//org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1199) > at app//org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:898) > at app//org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:826) > at app//org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:234) > at app//org.apache.solr.client.solrj.request.UpdateRequest.commit(UpdateRequest.java:228) > at app//org.apache.solr.search.join.ShardToShardJoinAbstract.setupCluster(ShardToShardJoinAbstract.java:142) > at app//org.apache.solr.search.join.ShardJoinImplicitTest.setupCluster(ShardJoinImplicitTest.java:37) > at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at java.base@11.0.22/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base@11.0.22/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base@11.0.22/java.lang.reflect.Method.invoke(Method.java:566) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:886) > at app//com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at app//com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at app//org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at 
app//org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at app//org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at app//org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at app//org.junit.rules.RunRules.evaluate(RunRules.java:20) > at app//com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at app//com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base@11.0.22/java.lang.Thread.run(Thread.java:829) 2> NOTE: test params are: codec=CheapBastard, sim=Asserting(RandomSimilarity(queryNorm=true): {}), locale=en-SE, timezone=ACT 2> NOTE: Linux 6.1.61-3-sophgo-08357-g369f7207fe69 riscv64/Oracle Corporation 11.0.22 (64-bit)/cpus=1,threads=2,free=254943656,total=477626368 2> NOTE: All tests run in this JVM: [ConvertedLegacyTest, TestCrossCoreJoin, TestSolrCoreProperties, JerseyResourceTest, TestEmbeddedSolrServerSchemaAPI, ChaosMonkeySafeLeaderWithPullReplicasTest, ConnectionManagerTest, DistributedApiAsyncTrackerTest, LeaderElectionTest, NodeMutatorTest, ParallelCommitExecutionTest, SSLMigrationTest, TestAuthenticationFramework, TestConfigSetsAPIExclusivity, TestLRUStatsCacheCloud, TestQueryingOnDownCollection, TestStressInPlaceUpdates, ZkControllerTest, CollectionsAPIDistributedZkTest, TestReplicaProperties, ZkDynamicConfigTest, PluginBagTest, TestCodecSupport, TestFileSystemConfigSetService, TestShardHandlerFactory, TestSolrCloudSnapshots, RequestLoggingTest, TestReqParamsAPI, V2StandaloneTest, IndexSizeEstimatorTest, SplitHandlerTest, ZookeeperStatusHandlerFailureTest, DeleteAliasAPITest, ListAliasesAPITest, V2NodeAPIMappingTest, DistributedFacetPivotLongTailTest, FacetPivotSmallTest, SpellCheckComponentTest, UpdateLogCloudTest, TaggingAttributeTest, CSVPrinterTest, TestMultiValuedNumericRangeQuery, SolrMetricsIntegrationTest, TestIntervalFaceting, TestCustomDocTransformer, TestExplainDocTransformer, TestFieldCollectionResource, TestManagedSynonymGraphFilterFactory, DocValuesTest, PrimitiveFieldTypeTest, TestCloudSchemaless, TestSortableTextField, MergeStrategyTest, TestBlockCollapse, TestExtendedDismaxParser, TestMaxScoreQueryParser, TestRTGBase, TestSolr4Spatial, TestStressUserVersions, SpatialHeatmapFacetsTest, TestDenseVectorFunctionQuery, ShardJoinImplicitTest] 2> NOTE: reproduce with: gradlew test --tests ShardJoinImplicitTest -Dtests.seed=246C98A4C257C021 -Dtests.locale=en-SE -Dtests.timezone=ACT -Dtests.asserts=true -Dtests.file.encoding=UTF-8 > Task :solr:modules:extraction:test :solr:modules:extraction:test (SUCCESS): 20 test(s), 3 skipped > Task :solr:modules:clustering:wipeTaskTemp > Task :solr:modules:s3-repository:classes UP-TO-DATE > Task :solr:modules:s3-repository:compileTestJava > Task :solr:core:test WARNING: Test org.apache.solr.update.TestInPlaceUpdatesStandalone wrote 11,406,034 bytes of output. > Task :solr:modules:jaegertracer-configurator:test :solr:modules:jaegertracer-configurator:test (SUCCESS): 1 test(s) > Task :solr:modules:jaegertracer-configurator:wipeTaskTemp > Task :solr:modules:scripting:compileTestJava Note: Some input files use or override a deprecated API. Note: Recompile with -Xlint:deprecation for details. 
> Task :solr:core:test :solr:core:test (FAILURE): 4900 test(s), 6 failure(s), 208 skipped 4900 tests completed, 6 failed, 208 skipped > Task :solr:modules:langid:testClasses > Task :solr:solrj:wipeTaskTemp > Task :solr:modules:sql:compileTestJava Note: /build/solr/src/solr-9.4.1/solr/modules/sql/src/test/org/apache/solr/handler/sql/TestSQLHandlerNonCloud.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. > Task :solr:modules:ltr:test org.apache.solr.ltr.TestLTRQParserExplain > interleavingModels_shouldReturnExplainForTheModelPicked FAILED java.lang.AssertionError: mismatch: ' 8.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of: 0.0 = prod of: 2.0 = weight on feature 0.0 = SolrFeature [name=featureB1, params={fq=[{!terms f=popularity}5]}] 0.0 = prod of: 4.0 = weight on feature 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms f=title}different]}] 8.0 = prod of: 8.0 = weight on feature 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}] '!=' 8.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of: 0.0 = prod of: 2.0 = weight on feature 0.0 = The feature has no value 0.0 = prod of: 4.0 = weight on feature 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms f=title}different]}] 8.0 = prod of: 8.0 = weight on feature 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}] ' @ debug/explain/7 at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:4E323A507246CE23]:0) at org.junit.Assert.fail(Assert.java:89) at org.apache.solr.util.RestTestBase.assertJsonMatches(RestTestBase.java:268) at org.apache.solr.util.RestTestBase.assertJQ(RestTestBase.java:210) at org.apache.solr.util.RestTestBase.assertJQ(RestTestBase.java:182) at org.apache.solr.ltr.TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked(TestLTRQParserExplain.java:253) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) at java.base/java.lang.Thread.run(Thread.java:829) > Task :solr:modules:langid:test > Task :solr:modules:jwt-auth:compileTestJava Note: /build/solr/src/solr-9.4.1/solr/modules/jwt-auth/src/test/org/apache/solr/security/jwt/JWTAuthPluginIntegrationTest.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. 
> Task :solr:modules:ltr:test org.apache.solr.ltr.TestLTRQParserExplain > test suite's output saved to /build/solr/src/solr-9.4.1/solr/modules/ltr/build/test-results/test/outputs/OUTPUT-org.apache.solr.ltr.TestLTRQParserExplain.txt, copied below: 1> 16:34:25.471 [TEST-TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked-seed#[246C98A4C257C021]] ERROR org.apache.solr.util.StartupLoggingUtils - Missing Java Option solr.log.dir. Logging may be missing or incomplete. 2> gen 22, 2024 7:04:45 AM org.glassfish.jersey.message.internal.MessagingBinders$EnabledProvidersBinder bindToBinder 2> AVVERTENZA: A class javax.activation.DataSource for a default provider MessageBodyWriter was not found. The provider is not available. 1> 16:35:07.211 [TEST-TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked-seed#[246C98A4C257C021]] ERROR org.apache.solr.util.RestTestBase - query failed JSON validation. error: mismatch: ' 1> 8.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of: 1> 0.0 = prod of: 1> 2.0 = weight on feature 1> 0.0 = SolrFeature [name=featureB1, params={fq=[{!terms f=popularity}5]}] 1> 0.0 = prod of: 1> 4.0 = weight on feature 1> 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms f=title}different]}] 1> 8.0 = prod of: 1> 8.0 = weight on feature 1> 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}] 1> '!=' 1> 8.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of: 1> 0.0 = prod of: 1> 2.0 = weight on feature 1> 0.0 = The feature has no value 1> 0.0 = prod of: 1> 4.0 = weight on feature 1> 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms f=title}different]}] 1> 8.0 = prod of: 1> 8.0 = weight on feature 1> 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}] 1> ' @ debug/explain/7 1> expected: /debug/explain/7==' 1> 8.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of: 1> 0.0 = prod of: 1> 2.0 = weight on feature 1> 0.0 = SolrFeature [name=featureB1, params={fq=[{!terms f=popularity}5]}] 1> 0.0 = prod of: 1> 4.0 = weight on feature 1> 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms f=title}different]}] 1> 8.0 = prod of: 1> 8.0 = weight on feature 1> 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}] 1> '} 1> response: { 1> "responseHeader":{ 1> "status":0, 1> "QTime":1378, 1> "params":{ 1> "q":"title:bloomberg", 1> "indent":"on", 1> "fl":"*,score", 1> "rows":"10", 1> "wt":"json", 1> "debugQuery":"on", 1> "rq":"{!ltr reRankDocs=10 model=modelA model=modelB}" 1> } 1> }, 1> "response":{ 1> "numFound":4, 1> "start":0, 1> "maxScore":0.07662583, 1> "numFoundExact":true, 1> "docs":[{ 1> "id":"7", 1> "popularity":2, 1> "_version_":1788737383034781696, 1> "score":8.0 1> },{ 1> "id":"6", 1> "popularity":1, 1> "_version_":1788737382142443520, 1> "score":12.0 1> },{ 1> "id":"8", 1> "popularity":3, 1> "_version_":1788737383894614016, 1> "score":9.0 1> },{ 1> "id":"9", 1> "popularity":5, 1> "_version_":1788737384063434752, 1> "score":2.0 1> }] 1> }, 1> "debug":{ 1> "rawquerystring":"title:bloomberg", 1> "querystring":"title:bloomberg", 1> "parsedquery":"LTRInterleavingQuery({!ltr mainQuery='title:bloomberg' rerankingQueries='[, ]' reRankDocs=10})", 1> "parsedquery_toString":"{!ltr mainQuery='title:bloomberg' rerankingQueries='[, ]' reRankDocs=10}", 1> "explain":{ 1> "7":"\n8.0 
= LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of:\n 0.0 = prod of:\n 2.0 = weight on feature\n 0.0 = The feature has no value\n 0.0 = prod of:\n 4.0 = weight on feature\n 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms f=title}different]}]\n 8.0 = prod of:\n 8.0 = weight on feature\n 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}]\n", 1> "6":"\n12.0 = LinearModel(name=modelA,featureWeights=[featureA1=3.0,featureA2=9.0,featureAB=27.0]) model applied to features, sum of:\n 3.0 = prod of:\n 3.0 = weight on feature\n 1.0 = SolrFeature [name=featureA1, params={fq=[{!terms f=popularity}1]}]\n 9.0 = prod of:\n 9.0 = weight on feature\n 1.0 = SolrFeature [name=featureA2, params={fq=[{!terms f=description}bloomberg]}]\n 0.0 = prod of:\n 27.0 = weight on feature\n 0.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}]\n", 1> "8":"\n9.0 = LinearModel(name=modelA,featureWeights=[featureA1=3.0,featureA2=9.0,featureAB=27.0]) model applied to features, sum of:\n 0.0 = prod of:\n 3.0 = weight on feature\n 0.0 = The feature has no value\n 9.0 = prod of:\n 9.0 = weight on feature\n 1.0 = SolrFeature [name=featureA2, params={fq=[{!terms f=description}bloomberg]}]\n 0.0 = prod of:\n 27.0 = weight on feature\n 0.0 = The feature has no value\n", 1> "9":"\n2.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of:\n 2.0 = prod of:\n 2.0 = weight on feature\n 1.0 = SolrFeature [name=featureB1, params={fq=[{!terms f=popularity}5]}]\n 0.0 = prod of:\n 4.0 = weight on feature\n 0.0 = The feature has no value\n 0.0 = prod of:\n 8.0 = weight on feature\n 0.0 = The feature has no value\n" 1> }, 1> "QParser":"LuceneQParser", 1> "timing":{ 1> "time":1372.0, 1> "prepare":{ 1> "time":252.0, 1> "query":{ 1> "time":250.0 1> }, 1> "facet":{ 1> "time":0.0 1> }, 1> "facet_module":{ 1> "time":0.0 1> }, 1> "mlt":{ 1> "time":0.0 1> }, 1> "highlight":{ 1> "time":0.0 1> }, 1> "stats":{ 1> "time":0.0 1> }, 1> "expand":{ 1> "time":0.0 1> }, 1> "terms":{ 1> "time":0.0 1> }, 1> "debug":{ 1> "time":0.0 1> } 1> }, 1> "process":{ 1> "time":1070.0, 1> "query":{ 1> "time":674.0 1> }, 1> "facet":{ 1> "time":0.0 1> }, 1> "facet_module":{ 1> "time":1.0 1> }, 1> "mlt":{ 1> "time":0.0 1> }, 1> "highlight":{ 1> "time":0.0 1> }, 1> "stats":{ 1> "time":0.0 1> }, 1> "expand":{ 1> "time":0.0 1> }, 1> "terms":{ 1> "time":0.0 1> }, 1> "debug":{ 1> "time":393.0 1> } 1> } 1> } 1> } 1> } 1> request: /query?q=title%3Abloomberg&debugQuery=on&rows=10&rq=%7B%21ltr+reRankDocs%3D10+model%3DmodelA+model%3DmodelB%7D&fl=*%2Cscore&wt=json&indent=on 1> > java.lang.AssertionError: mismatch: ' > 8.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of: > 0.0 = prod of: > 2.0 = weight on feature > 0.0 = SolrFeature [name=featureB1, params={fq=[{!terms f=popularity}5]}] > 0.0 = prod of: > 4.0 = weight on feature > 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms f=title}different]}] > 8.0 = prod of: > 8.0 = weight on feature > 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}] > '!=' > 8.0 = LinearModel(name=modelB,featureWeights=[featureB1=2.0,featureB2=4.0,featureAB=8.0]) model applied to features, sum of: > 0.0 = prod of: > 2.0 = weight on feature > 0.0 = The feature has no value > 0.0 = prod of: > 4.0 = weight on feature > 0.0 = SolrFeature [name=featureB2, params={fq=[{!terms 
f=title}different]}] > 8.0 = prod of: > 8.0 = weight on feature > 1.0 = SolrFeature [name=featureAB, params={fq=[{!terms f=popularity}2]}] > ' @ debug/explain/7 > at __randomizedtesting.SeedInfo.seed([246C98A4C257C021:4E323A507246CE23]:0) > at org.junit.Assert.fail(Assert.java:89) > at org.apache.solr.util.RestTestBase.assertJsonMatches(RestTestBase.java:268) > at org.apache.solr.util.RestTestBase.assertJQ(RestTestBase.java:210) > at org.apache.solr.util.RestTestBase.assertJQ(RestTestBase.java:182) > at org.apache.solr.ltr.TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked(TestLTRQParserExplain.java:253) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.base/java.lang.reflect.Method.invoke(Method.java:566) > at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) > at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) > at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) > at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:843) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:490) > at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) > at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) > at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) > at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) > at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:80) > at 
org.junit.rules.RunRules.evaluate(RunRules.java:20) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) > at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) > at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) > at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) > at org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) > at org.junit.rules.RunRules.evaluate(RunRules.java:20) > at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) > at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:390) > at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:850) > at java.base/java.lang.Thread.run(Thread.java:829) 2> NOTE: reproduce with: gradlew test --tests TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked -Dtests.seed=246C98A4C257C021 -Dtests.locale=it-CH -Dtests.timezone=ACT -Dtests.asserts=true -Dtests.file.encoding=UTF-8 1> 16:35:09.210 [TEST-TestLTRQParserExplain.LinearScoreExplainMissingEfiFeatureShouldReturnDefaultScore-seed#[246C98A4C257C021]] ERROR org.apache.solr.util.StartupLoggingUtils - Missing Java Option solr.log.dir. Logging may be missing or incomplete. 1> 16:35:13.930 [TEST-TestLTRQParserExplain.testRerankedExplainSameBetweenDifferentDocsWithSameFeatures-seed#[246C98A4C257C021]] ERROR org.apache.solr.util.StartupLoggingUtils - Missing Java Option solr.log.dir. Logging may be missing or incomplete. 1> 16:35:20.239 [TEST-TestLTRQParserExplain.interleavingModelsWithOriginalRanking_shouldReturnExplainForTheModelPicked-seed#[246C98A4C257C021]] ERROR org.apache.solr.util.StartupLoggingUtils - Missing Java Option solr.log.dir. Logging may be missing or incomplete. 1> 16:35:30.839 [TEST-TestLTRQParserExplain.testRerankedExplain-seed#[246C98A4C257C021]] ERROR org.apache.solr.util.StartupLoggingUtils - Missing Java Option solr.log.dir. Logging may be missing or incomplete. 1> 16:35:36.395 [TEST-TestLTRQParserExplain.multipleAdditiveTreesScoreExplainMissingEfiFeatureShouldReturnDefaultScore-seed#[246C98A4C257C021]] ERROR org.apache.solr.util.StartupLoggingUtils - Missing Java Option solr.log.dir. Logging may be missing or incomplete. 
2> NOTE: leaving temporary files on disk at: /build/solr/src/solr-9.4.1/solr/modules/ltr/build/tmp/tests-tmp/solr.ltr.TestLTRQParserExplain_246C98A4C257C021-001
2> NOTE: test params are: codec=Asserting(Lucene95): {description=PostingsFormat(name=MockRandom), id=Lucene90, text=PostingsFormat(name=LuceneVarGapFixedInterval), title=PostingsFormat(name=MockRandom)}, docValues:{dvLongPopularity=DocValuesFormat(name=Asserting), dvIntField=DocValuesFormat(name=Lucene90), dvLongField=DocValuesFormat(name=Lucene90), dvFloatPopularity=DocValuesFormat(name=Lucene90), dvIntPopularity=DocValuesFormat(name=Lucene90), dvDoublePopularity=DocValuesFormat(name=Asserting), dvFloatField=DocValuesFormat(name=Lucene90)}, maxPointsInLeafNode=498, maxMBSortInHeap=6.219141331860774, sim=Asserting(RandomSimilarity(queryNorm=true): {}), locale=it-CH, timezone=ACT
2> NOTE: Linux 6.1.61-3-sophgo-08357-g369f7207fe69 riscv64/Oracle Corporation 11.0.22 (64-bit)/cpus=1,threads=1,free=251802688,total=338690048
2> NOTE: All tests run in this JVM: [TestLTRQParserExplain]
> Task :solr:modules:jwt-auth:testClasses
> Task :solr:modules:gcs-repository:test
:solr:modules:gcs-repository:test (SUCCESS): 20 test(s), 5 skipped
> Task :solr:modules:langid:test
:solr:modules:langid:test (SUCCESS): 43 test(s)
> Task :solr:modules:langid:wipeTaskTemp
> Task :solr:modules:extraction:wipeTaskTemp
> Task :solr:modules:scripting:testClasses
> Task :solr:modules:jwt-auth:test
> Task :solr:modules:scripting:test
> Task :solr:modules:ltr:test
:solr:modules:ltr:test (FAILURE): 205 test(s), 1 failure(s)
205 tests completed, 1 failed
> Task :solr:modules:gcs-repository:wipeTaskTemp
> Task :solr:modules:opentelemetry:test
:solr:modules:opentelemetry:test (SUCCESS): 5 test(s)
> Task :solr:modules:sql:testClasses
> Task :solr:modules:ltr:test FAILED
> Task :solr:modules:ltr:wipeTaskTemp
> Task :solr:modules:opentelemetry:wipeTaskTemp
> Task :solr:core:test FAILED
> Task :solr:core:wipeTaskTemp
> Task :solr:modules:hadoop-auth:test
:solr:modules:hadoop-auth:test (SUCCESS): 55 test(s), 1 skipped
> Task :solr:modules:hadoop-auth:wipeTaskTemp
> Task :solr:modules:sql:test
> Task :solr:modules:scripting:test
:solr:modules:scripting:test (SUCCESS): 22 test(s), 1 skipped
> Task :solr:modules:scripting:wipeTaskTemp
> Task :solr:modules:hdfs:test
:solr:modules:hdfs:test (SUCCESS): 103 test(s), 45 skipped
> Task :solr:modules:hdfs:wipeTaskTemp
> Task :solr:modules:jwt-auth:test
:solr:modules:jwt-auth:test (SUCCESS): 63 test(s)
> Task :solr:modules:jwt-auth:wipeTaskTemp
> Task :solr:modules:sql:test
:solr:modules:sql:test (SUCCESS): 33 test(s)
> Task :solr:modules:sql:wipeTaskTemp

ERROR: The following test(s) have failed:
- org.apache.solr.ltr.TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked (:solr:modules:ltr)
  Test output: /build/solr/src/solr-9.4.1/solr/modules/ltr/build/test-results/test/outputs/OUTPUT-org.apache.solr.ltr.TestLTRQParserExplain.txt
  Reproduce with: gradlew :solr:modules:ltr:test --tests "org.apache.solr.ltr.TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked" -Ptests.jvms=12 "-Ptests.jvmargs=-XX:TieredStopAtLevel=1 -XX:+UseParallelGC -XX:ActiveProcessorCount=1 -XX:ReservedCodeCacheSize=120m" -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1
- org.apache.solr.cloud.BasicDistributedZkTest.test (:solr:core)
  Test output: /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.BasicDistributedZkTest.txt
  Reproduce with: gradlew :solr:core:test --tests "org.apache.solr.cloud.BasicDistributedZkTest.test" -Ptests.jvms=12 "-Ptests.jvmargs=-XX:TieredStopAtLevel=1 -XX:+UseParallelGC -XX:ActiveProcessorCount=1 -XX:ReservedCodeCacheSize=120m" -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1
- org.apache.solr.cloud.BasicDistributedZkTest.classMethod (:solr:core)
  Test output: /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.BasicDistributedZkTest.txt
  Reproduce with: gradlew :solr:core:test --tests "org.apache.solr.cloud.BasicDistributedZkTest.classMethod" -Ptests.jvms=12 "-Ptests.jvmargs=-XX:TieredStopAtLevel=1 -XX:+UseParallelGC -XX:ActiveProcessorCount=1 -XX:ReservedCodeCacheSize=120m" -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1
- org.apache.solr.cloud.ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget (:solr:core)
  Test output: /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.ReplaceNodeTest.txt
  Reproduce with: gradlew :solr:core:test --tests "org.apache.solr.cloud.ReplaceNodeTest.testGoodSpreadDuringAssignWithNoTarget" -Ptests.jvms=12 "-Ptests.jvmargs=-XX:TieredStopAtLevel=1 -XX:+UseParallelGC -XX:ActiveProcessorCount=1 -XX:ReservedCodeCacheSize=120m" -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1
- org.apache.solr.cloud.ReplicationFactorTest.test (:solr:core)
  Test output: /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.ReplicationFactorTest.txt
  Reproduce with: gradlew :solr:core:test --tests "org.apache.solr.cloud.ReplicationFactorTest.test" -Ptests.jvms=12 "-Ptests.jvmargs=-XX:TieredStopAtLevel=1 -XX:+UseParallelGC -XX:ActiveProcessorCount=1 -XX:ReservedCodeCacheSize=120m" -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1
- org.apache.solr.cloud.api.collections.TestReplicaProperties.test (:solr:core)
  Test output: /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.cloud.api.collections.TestReplicaProperties.txt
  Reproduce with: gradlew :solr:core:test --tests "org.apache.solr.cloud.api.collections.TestReplicaProperties.test" -Ptests.jvms=12 "-Ptests.jvmargs=-XX:TieredStopAtLevel=1 -XX:+UseParallelGC -XX:ActiveProcessorCount=1 -XX:ReservedCodeCacheSize=120m" -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1
- org.apache.solr.search.join.ShardJoinImplicitTest.classMethod (:solr:core)
  Test output: /build/solr/src/solr-9.4.1/solr/core/build/test-results/test/outputs/OUTPUT-org.apache.solr.search.join.ShardJoinImplicitTest.txt
  Reproduce with: gradlew :solr:core:test --tests "org.apache.solr.search.join.ShardJoinImplicitTest.classMethod" -Ptests.jvms=12 "-Ptests.jvmargs=-XX:TieredStopAtLevel=1 -XX:+UseParallelGC -XX:ActiveProcessorCount=1 -XX:ReservedCodeCacheSize=120m" -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':solr:modules:ltr:test'.
> There were failing tests. See the results at: file:///build/solr/src/solr-9.4.1/solr/modules/ltr/build/test-results/test/

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':solr:core:test'.
> There were failing tests. See the results at: file:///build/solr/src/solr-9.4.1/solr/core/build/test-results/test/

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 55m 38s
149 actionable tasks: 108 executed, 41 up-to-date
==> ERROR: A failure occurred in check().
    Aborting...
==> ERROR: Build failed, check /var/lib/archbuild/extra-riscv64/root7/build
receiving incremental file list
solr-9.4.1-1-riscv64-build.log
solr-9.4.1-1-riscv64-check.log
solr-9.4.1-1-riscv64-prepare.log

sent 81 bytes  received 240,575 bytes  481,312.00 bytes/sec
total size is 3,323,099  speedup is 13.81
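
Note: the failed suites can be retried in isolation using the reproduce lines the test harness printed above. A minimal sketch, assuming the build chroot referenced at /var/lib/archbuild/extra-riscv64/root7/build is still available, that the commands are run from the solr-9.4.1 source root inside it, and that the project's gradlew wrapper is used (only --tests, -Ptests.seed and -Ptests.file.encoding are taken from the log; other -P options from the reproduce lines may also be needed):

    # Re-run only the failed LTR explain test with the seed the suite used
    cd /build/solr/src/solr-9.4.1
    ./gradlew :solr:modules:ltr:test \
        --tests "org.apache.solr.ltr.TestLTRQParserExplain.interleavingModels_shouldReturnExplainForTheModelPicked" \
        -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1

    # Likewise for a core-suite failure, e.g. the ShardJoinImplicitTest setup
    # that stalled with "Task queue processing has stalled for 25115 ms"
    ./gradlew :solr:core:test \
        --tests "org.apache.solr.search.join.ShardJoinImplicitTest.classMethod" \
        -Ptests.seed=246C98A4C257C021 -Ptests.file.encoding=ISO-8859-1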