==> Building on chienpao
==> Checking for remote environment...
==> Syncing package to remote host...
sending incremental file list
created directory packages/paraview-catalyst
./
.SRCINFO
            821 100%    0.00kB/s    0:00:00 (xfr#1, to-chk=7/9)
.nvchecker.toml
            105 100%  102.54kB/s    0:00:00 (xfr#2, to-chk=6/9)
LICENSE
            646 100%  630.86kB/s    0:00:00 (xfr#3, to-chk=5/9)
PKGBUILD
          1,879 100%    1.79MB/s    0:00:00 (xfr#4, to-chk=4/9)
REUSE.toml
            375 100%  366.21kB/s    0:00:00 (xfr#5, to-chk=3/9)
paraview-catalyst-2.1.0-1.log
            851 100%  831.05kB/s    0:00:00 (xfr#6, to-chk=2/9)
LICENSES/
LICENSES/0BSD.txt -> ../LICENSE

sent 3,014 bytes  received 193 bytes  2,138.00 bytes/sec
total size is 4,031  speedup is 1.26
==> Applying RISC-V patches...
sending incremental file list
./
char-signed-test.patch
          1,328 100%    0.00kB/s    0:00:00 (xfr#1, to-chk=1/3)
risc64.patch
          1,239 100%    1.18MB/s    0:00:00 (xfr#2, to-chk=0/3)

sent 1,125 bytes  received 57 bytes  788.00 bytes/sec
total size is 2,567  speedup is 2.17
==> Patching arch to riscv64...
==> Running pkgctl build --arch riscv64 on remote host...
==> WARNING: invalid architecture: riscv64
==> Updating pacman database cache
:: Synchronizing package databases...
 core downloading...
 extra downloading...
 multilib downloading...
==> Building paraview-catalyst
  -> repo: extra
  -> arch: riscv64
  -> worker: felix-0
==> Building paraview-catalyst for [extra] (riscv64)
:: Synchronizing package databases...
 core downloading...
 extra downloading...
:: Starting full system upgrade...
 there is nothing to do
==> Building in chroot for [extra] (riscv64)...
==> Synchronizing chroot copy [/var/lib/archbuild/extra-riscv64/root] -> [felix-0]...done
==> Making package: paraview-catalyst 2.1.0-1 (Fri Apr 24 20:01:51 2026)
==> Retrieving sources...
  -> Downloading catalyst-v2.1.0.tar.gz...
     [curl progress meter omitted: 1.01 MiB received in ~6 s, avg 160.1 kB/s]
==> Validating source files with b2sums...
    catalyst-v2.1.0.tar.gz ... Passed
==> Making package: paraview-catalyst 2.1.0-1 (Fri Apr 24 20:02:18 2026)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (10)        New Version  Net Change  Download Size
extra/hwloc         2.13.0-1       1.51 MiB       0.56 MiB
extra/libfabric     2.5.1-1        7.10 MiB       1.61 MiB
extra/libpciaccess  0.19-1         0.05 MiB
core/mpdecimal      4.0.1-3        0.31 MiB
extra/numactl       2.0.19-1       0.20 MiB
extra/openpmix      5.0.10-1       3.58 MiB       0.95 MiB
extra/openucx       1.20.0-3       6.56 MiB       2.30 MiB
extra/prrte         3.0.13-1       1.89 MiB       0.60 MiB
extra/openmpi       5.0.10-2       9.26 MiB       3.94 MiB
core/python         3.14.3-1     132.79 MiB

Total Download Size:    9.95 MiB
Total Installed Size: 163.27 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 openmpi-5.0.10-2-riscv64 downloading...
 openucx-1.20.0-3-riscv64 downloading...
 libfabric-2.5.1-1-riscv64 downloading...
 openpmix-5.0.10-1-riscv64 downloading...
 prrte-3.0.13-1-riscv64 downloading...
 hwloc-2.13.0-1-riscv64 downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing libpciaccess...
installing hwloc...
Optional dependencies for hwloc
    cairo: PDF, Postscript, and PNG export support
    libxml2: full XML import/export support [installed]
installing numactl...
installing libfabric...
installing openpmix...
Optional dependencies for openpmix
    openpmix-docs: for documentation
installing openucx...
Optional dependencies for openucx
    rdma-core: for InfiniBand and RDMA support
    rocm-language-runtime: for ROCm support
installing prrte...
Optional dependencies for prrte
    openssh: for execution on remote hosts via plm_ssh_agent
    prrte-docs: for documentation
installing openmpi...
Optional dependencies for openmpi
    hip-runtime-amd: ROCm support
    gcc-fortran: fortran support
    openssh: for execution on remote hosts via plm_ssh_agent
installing mpdecimal...
installing python...
Optional dependencies for python
    python-setuptools: for building Python packages using tooling that is usually bundled with Python
    python-pip: for installing Python packages using tooling that is usually bundled with Python
    python-pipx: for installing Python software not packaged on Arch Linux
    sqlite: for a default database integration [installed]
    xz: for lzma [installed]
    tk: for tkinter
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Checking buildtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (14)              New Version                    Net Change  Download Size
extra/blas                3.12.1-2                         0.43 MiB
extra/cblas               3.12.1-2                         0.31 MiB
extra/cppdap              1.58.0-3                         1.57 MiB
extra/hicolor-icon-theme  0.18-1                           0.05 MiB
extra/jsoncpp             1.9.6-3                          3.16 MiB
extra/lapack              3.12.1-2                         9.09 MiB
extra/libuv               1.52.1-1                         0.62 MiB
extra/rhash               1.4.6-1                          0.35 MiB
extra/cmake               4.3.2-1                         85.40 MiB
core/gcc-fortran          15.2.1+r604+g0b99615a8aef-1.1   62.65 MiB      17.12 MiB
extra/gtest               1.17.0-2                         1.55 MiB
extra/ninja               1.13.2-3                         0.36 MiB
extra/python-mpi4py       4.1.1-2                          2.85 MiB       0.79 MiB
extra/python-numpy        2.4.4-1                         41.12 MiB

Total Download Size:   17.91 MiB
Total Installed Size: 209.52 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 gcc-fortran-15.2.1+r604+g0b99615a8aef-1.1-riscv64 downloading...
 python-mpi4py-4.1.1-2-riscv64 downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing cppdap...
installing hicolor-icon-theme...
installing jsoncpp...
Optional dependencies for jsoncpp
    jsoncpp-doc: documentation
installing libuv...
installing rhash...
installing cmake...
Optional dependencies for cmake
    make: for unix Makefile generator [installed]
    ninja: for ninja generator [pending]
    qt6-base: cmake-gui
installing gcc-fortran...
installing gtest...
Optional dependencies for gtest
    python: gmock generator [installed]
installing ninja...
installing blas...
installing cblas...
installing lapack...
installing python-numpy...
Optional dependencies for python-numpy
    blas-openblas: faster linear algebra
installing python-mpi4py...
:: Running post-transaction hooks...
(1/2) Arming ConditionNeedsUpdate...
(2/2) Updating the info directory file...
==> Retrieving sources...
  -> Found catalyst-v2.1.0.tar.gz
==> WARNING: Skipping all source file integrity checks.
==> Extracting sources...
  -> Extracting catalyst-v2.1.0.tar.gz with bsdtar
==> Starting prepare()...
==> Starting build()...
-- The C compiler identification is GNU 15.2.1
-- The CXX compiler identification is GNU 15.2.1
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development NumPy Interpreter Development.Module Development.Embed
-- The Fortran compiler identification is GNU 15.2.1
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /usr/bin/gfortran - skipped
-- Found MPI_C: /lib/libmpi.so (found version "3.1")
-- Found MPI: TRUE (found version "3.1") found components: C
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of char
-- Check size of char - done
-- Check size of short
-- Check size of short - done
-- Check size of int
-- Check size of int - done
-- Check size of long
-- Check size of long - done
-- Check size of float
-- Check size of float - done
-- Check size of double
-- Check size of double - done
-- Check size of long long
-- Check size of long long - done
-- Check size of long float
-- Check size of long float - failed
-- Check size of long double
-- Check size of long double - done
-- Check size of void *
-- Check size of void * - done
-- Bitwidth Mapping: preferring `long` over `long long` for c++11 compatibility
-- Bitwidth Mapping Results:
--  conduit::int8 native type: signed char
--  conduit::int16 native type: signed short
--  conduit::int32 native type: signed int
--  conduit::int64 native type: signed long
--  conduit::uint8 native type: unsigned char
--  conduit::uint16 native type: unsigned short
--  conduit::uint32 native type: unsigned int
--  conduit::uint64 native type: unsigned long
--  conduit::float32 native type: float
--  conduit::float64 native type: double
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Performing Test COMPILER_HAS_HIDDEN_VISIBILITY
-- Performing Test COMPILER_HAS_HIDDEN_VISIBILITY - Success
-- Performing Test COMPILER_HAS_HIDDEN_INLINE_VISIBILITY
-- Performing Test COMPILER_HAS_HIDDEN_INLINE_VISIBILITY - Success
-- Performing Test COMPILER_HAS_DEPRECATED_ATTR
-- Performing Test COMPILER_HAS_DEPRECATED_ATTR - Success
-- Found GTest: /usr/lib/cmake/GTest/GTestConfig.cmake (found version "1.17.0")
-- Adding conduit lib unit tests
-- Configuring done (11.0s)
-- Generating done (0.3s)
-- Build files have been written to: /build/paraview-catalyst/src/build
[1/142] Building Fortran preprocessed thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/fortran/conduit_fortran.F90-pp.f90
[2/142] Generating Fortran dyndep file thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/Fortran.dd
[3/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_endianness.cpp.o
[4/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_error.cpp.o
[5/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_core.cpp.o
[6/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_data_type.cpp.o
[7/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_node_iterator.cpp.o
[8/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_log.cpp.o
[9/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_schema.cpp.o
[10/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_data_accessor.cpp.o
[11/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/c/conduit_c.cpp.o
[12/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/c/conduit_datatype_c.cpp.o
[13/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/c/conduit_cpp_to_c.cpp.o
[14/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/c/conduit_utils_c.cpp.o
[15/142] Building Fortran object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/fortran/conduit_fortran.F90.o
[16/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_generator.cpp.o
[17/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint.cpp.o
[18/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/c/conduit_node_c.cpp.o
[19/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_utils.cpp.o
[20/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_data_array.cpp.o
[21/142] Building CXX object thirdparty/conduit/conduit/CMakeFiles/catalyst_conduit.dir/conduit_node.cpp.o
[22/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_mesh_matset_xforms.cpp.o
[23/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_mesh_topology_metadata.cpp.o
[24/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_ndarray_index.cpp.o
[25/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_mcarray.cpp.o
[26/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_o2mrelation_index.cpp.o
[27/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_o2mrelation_utils.cpp.o
[28/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_o2mrelation.cpp.o
[29/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_o2mrelation_iterator.cpp.o
[30/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_table.cpp.o
[31/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_zfparray.cpp.o
[32/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/c/conduit_blueprint_c.cpp.o
[33/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/c/conduit_blueprint_mcarray_c.cpp.o
[34/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/c/conduit_blueprint_mesh_c.cpp.o
[35/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/c/conduit_blueprint_table_c.cpp.o
[36/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/dumper.c.o
[37/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/api.c.o
[38/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/loader.c.o
[39/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/parser.c.o
[40/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/reader.c.o
[41/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/emitter.c.o
[42/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/writer.c.o
[43/142] Building CXX object thirdparty/conduit/libb64/CMakeFiles/catalyst_conduit_b64.dir/src/cdecode.cpp.o
[44/142] Building CXX object thirdparty/conduit/libb64/CMakeFiles/catalyst_conduit_b64.dir/src/cencode.cpp.o
[45/142] Building C object thirdparty/conduit/libyaml/CMakeFiles/catalyst_conduit_libyaml.dir/src/scanner.c.o
[46/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_mesh.cpp.o
[47/142] Building CXX object src/catalyst/CMakeFiles/catalyst.dir/catalyst_api_default.cpp.o
[48/142] Building C object src/catalyst/CMakeFiles/catalyst.dir/catalyst_api.c.o
[49/142] Building CXX object src/catalyst/CMakeFiles/catalyst.dir/catalyst_stub.cpp.o
[50/142] Building C object src/catalyst/CMakeFiles/catalyst.dir/catalyst_conduit_abi_internal.c.o
[51/142] Building CXX object src/catalyst/CMakeFiles/catalyst.dir/catalyst_python_tools.cpp.o
[52/142] Generating Fortran dyndep file src/catalyst/CMakeFiles/catalyst.dir/Fortran.dd
[53/142] Building Fortran preprocessed src/wrap/fortran/CMakeFiles/catalyst_fortran.dir/catalyst_api.f90-pp.f90
[54/142] Generating Fortran dyndep file src/wrap/fortran/CMakeFiles/catalyst_fortran.dir/Fortran.dd
[55/142] Building Fortran preprocessed tests/catalyst_tests/CMakeFiles/test_double_impl_f.dir/test_double_impl.f90-pp.f90
[56/142] Building Fortran object src/wrap/fortran/CMakeFiles/catalyst_fortran.dir/catalyst_api.f90.o
[57/142] Generating Fortran dyndep file tests/catalyst_tests/CMakeFiles/test_double_impl_f.dir/Fortran.dd
[58/142] Building CXX object src/catalyst/CMakeFiles/catalyst.dir/catalyst_async.cpp.o
[59/142] Building CXX object thirdparty/conduit/conduit/python/CMakeFiles/conduit_utils_python.dir/conduit_utils_python.cpp.o
[60/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_mesh_utils.cpp.o
[61/142] Building CXX object thirdparty/conduit/blueprint/python/CMakeFiles/conduit_blueprint_mcarray_python.dir/conduit_blueprint_mcarray_python.cpp.o
[62/142] Building CXX object thirdparty/conduit/blueprint/python/CMakeFiles/conduit_blueprint_python.dir/conduit_blueprint_python.cpp.o [63/142] Building CXX object thirdparty/conduit/blueprint/python/CMakeFiles/conduit_blueprint_table_python.dir/conduit_blueprint_table_python.cpp.o [64/142] Building CXX object thirdparty/conduit/blueprint/python/CMakeFiles/conduit_blueprint_mesh_python.dir/conduit_blueprint_mesh_python.cpp.o [65/142] Building C object src/impl-stub/CMakeFiles/catalyst_stub.dir/catalyst_impl_stub.c.o [66/142] Building CXX object src/impl-stub/CMakeFiles/catalyst_stub.dir/catalyst.cxx.o [67/142] Building CXX object thirdparty/conduit/conduit/python/CMakeFiles/conduit_python.dir/conduit_python.cpp.o [68/142] Building CXX object src/wrap/python/CMakeFiles/catalyst_python.dir/catalyst_python.cpp.o [69/142] Building CXX object tests/abi_tests/CMakeFiles/test_catalyst_abi.dir/test_catalyst_abi.cpp.o [70/142] Building C object tests/catalyst_tests/CMakeFiles/test_double_impl.dir/test_double_impl.c.o [71/142] Building C object tests/catalyst_tests/CMakeFiles/test_about_impl_path.dir/test_about_impl_path.c.o [72/142] Building C object tests/catalyst_tests/CMakeFiles/test_about_impl_path_double.dir/test_about_impl_path_double.c.o [73/142] Building C object tests/catalyst_tests/CMakeFiles/test_about_impl_path_params.dir/test_about_impl_path_params.c.o [74/142] Building CXX object tests/abi_tests/CMakeFiles/test_conduit_abi.dir/test_conduit_abi.cpp.o [75/142] Building C object tests/catalyst_tests/CMakeFiles/test_about_impl_path_env.dir/test_about_impl_path_env.c.o [76/142] Building C object tests/catalyst_tests/CMakeFiles/test_catalyst_results.dir/catalyst_results/test_catalyst_results.c.o [77/142] Building C object tests/catalyst_tests/CMakeFiles/test_external_conduit_impl.dir/test_external_conduit_impl.c.o [78/142] Building C object tests/catalyst_tests/CMakeFiles/test_internal_conduit_impl.dir/test_internal_conduit_impl.c.o [79/142] Building CXX object 
tests/catalyst_tests/CMakeFiles/test_catalyst_conduit.dir/catalyst_conduit/test_catalyst_conduit.cpp.o [80/142] Building Fortran object tests/catalyst_tests/CMakeFiles/test_double_impl_f.dir/test_double_impl.f90.o [81/142] Building C object tests/catalyst_tests/impls/CMakeFiles/catalyst-double.dir/catalyst_impl_double.c.o [82/142] Building CXX object tests/catalyst_tests/impls/CMakeFiles/catalyst-double.dir/double.cpp.o [83/142] Building C object tests/catalyst_tests/impls/CMakeFiles/catalyst-external-conduit.dir/catalyst_impl_external_conduit.c.o [84/142] Building CXX object tests/catalyst_tests/impls/CMakeFiles/catalyst-external-conduit.dir/external-conduit.cpp.o [85/142] Building C object tests/catalyst_tests/impls/CMakeFiles/catalyst-internal-conduit.dir/catalyst_impl_internal_conduit.c.o [86/142] Building CXX object tests/catalyst_tests/impls/CMakeFiles/catalyst-internal-conduit.dir/internal-conduit.cpp.o [87/142] Building CXX object tests/catalyst_tests/CMakeFiles/test_catalyst_async.dir/test_catalyst_async.cpp.o [88/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_mesh_flatten.cpp.o [89/142] Building CXX object src/tools/replay/CMakeFiles/catalyst_replay.dir/catalyst_replay.cpp.o [90/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_node_info.dir/t_conduit_node_info.cpp.o [91/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_node_parent.dir/t_conduit_node_parent.cpp.o [92/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_node_compare.dir/t_conduit_node_compare.cpp.o [93/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_node_paths.dir/t_conduit_node_paths.cpp.o [94/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_to_string.dir/t_conduit_to_string.cpp.o [95/142] Copying catalyst_conduit/__init__.py to the binary directory [96/142] Copying catalyst_conduit/blueprint/__init__.py to the binary directory [97/142] Copying 
catalyst_conduit/blueprint/mcarray/__init__.py to the binary directory [98/142] Copying catalyst_conduit/blueprint/mesh/__init__.py to the binary directory [99/142] Copying catalyst_conduit/blueprint/table/__init__.py to the binary directory [100/142] Copying catalyst_conduit/utils/__init__.py to the binary directory [101/142] Copying catalyst/__init__.py to the binary directory [102/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_node_to_value.dir/t_conduit_node_to_value.cpp.o [103/142] Building CXX object thirdparty/conduit/blueprint/CMakeFiles/catalyst_blueprint.dir/conduit_blueprint_mesh_partition.cpp.o [104/142] Building CXX object thirdparty/conduit/yyjson/CMakeFiles/catalyst_conduit_yyjson.dir/conduit_yyjson.cpp.o [105/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_node.dir/t_conduit_node.cpp.o [106/142] Linking CXX shared module lib/python3.14/site-packages/catalyst_conduit/utils/conduit_utils_python.so [107/142] Linking CXX shared module lib/python3.14/site-packages/catalyst_conduit/conduit_python.so [108/142] Linking CXX shared library lib/libcatalyst.so.3 [109/142] Creating library symlink lib/libcatalyst.so [110/142] Linking Fortran shared library lib/libcatalyst_fortran.so [111/142] Linking CXX shared module lib/python3.14/site-packages/catalyst_conduit/blueprint/conduit_blueprint_python.so [112/142] Linking CXX shared module lib/catalyst/libcatalyst-stub.so [113/142] Linking CXX executable bin/catalyst_replay [114/142] Linking CXX executable bin/test_catalyst_abi [115/142] Linking CXX executable bin/test_conduit_abi [116/142] Linking C executable bin/test_double_impl [117/142] Linking C executable bin/test_about_impl_path [118/142] Linking C executable bin/test_about_impl_path_double [119/142] Linking C executable bin/test_about_impl_path_params [120/142] Linking C executable bin/test_about_impl_path_env [121/142] Linking C executable bin/test_catalyst_results [122/142] Linking CXX executable 
bin/test_catalyst_conduit [123/142] Linking C executable bin/test_external_conduit_impl [124/142] Linking C executable bin/test_internal_conduit_impl [125/142] Linking CXX executable bin/test_catalyst_async [126/142] Linking Fortran executable bin/test_double_impl_f [127/142] Linking CXX shared module lib/catalyst/libcatalyst-double.so [128/142] Linking CXX shared module lib/catalyst/libcatalyst-external_conduit.so [129/142] Linking CXX shared module lib/catalyst/libcatalyst-internal_conduit.so [130/142] Linking CXX shared module lib/python3.14/site-packages/catalyst_conduit/blueprint/mcarray/conduit_blueprint_mcarray_python.so [131/142] Linking CXX executable bin/t_conduit_node_compare [132/142] Linking CXX executable bin/t_conduit_node_info [133/142] Linking CXX executable bin/t_conduit_node_parent [134/142] Linking CXX executable bin/t_conduit_node_paths [135/142] Linking CXX executable bin/t_conduit_to_string [136/142] Linking CXX executable bin/t_conduit_node_to_value [137/142] Linking CXX executable bin/t_conduit_node [138/142] Linking CXX shared module lib/python3.14/site-packages/catalyst/catalyst_python.so [139/142] Linking CXX shared module lib/python3.14/site-packages/catalyst_conduit/blueprint/mesh/conduit_blueprint_mesh_python.so [140/142] Linking CXX shared module lib/python3.14/site-packages/catalyst_conduit/blueprint/table/conduit_blueprint_table_python.so [141/142] Building CXX object tests/conduit_tests/CMakeFiles/t_conduit_node_set.dir/t_conduit_node_set.cpp.o [142/142] Linking CXX executable bin/t_conduit_node_set ==> Starting check()... Test project /build/paraview-catalyst/src/build Start 1: install-prepare Start 3: example-build-about Start 5: example-build-adaptor0 Start 7: example-build-replay Start 9: catalyst-abi Start 10: conduit-abi Start 11: catalyst-abi-nm Start 12: catalyst-impl-double 1/69 Test #1: install-prepare ....................................... 
Passed 0.04 sec 2/69 Test #10: conduit-abi ........................................... Passed 0.02 sec Start 2: install Start 13: catalyst-about-impl-path 3/69 Test #9: catalyst-abi .......................................... Passed 0.04 sec Start 14: catalyst-about-impl-path-double 4/69 Test #13: catalyst-about-impl-path .............................. Passed 0.12 sec Start 15: catalyst-about-impl-path-params 5/69 Test #12: catalyst-impl-double .................................. Passed 0.14 sec 6/69 Test #14: catalyst-about-impl-path-double ....................... Passed 0.12 sec Start 16: catalyst-about-impl-path-env Start 17: catalyst-results 7/69 Test #15: catalyst-about-impl-path-params ....................... Passed 0.11 sec Start 18: catalyst-conduit 8/69 Test #16: catalyst-about-impl-path-env .......................... Passed 0.13 sec 9/69 Test #17: catalyst-results ...................................... Passed 0.12 sec Start 19: catalyst-impl-external-conduit Start 20: catalyst-impl-internal-conduit 10/69 Test #18: catalyst-conduit ...................................... Passed 0.09 sec Start 21: catalyst-async 11/69 Test #19: catalyst-impl-external-conduit ........................ Passed 0.11 sec Start 22: test_conduit_import_python 12/69 Test #20: catalyst-impl-internal-conduit ........................ Passed 0.10 sec Start 23: test_catalyst_import_python 13/69 Test #21: catalyst-async ........................................ Passed 0.59 sec Start 24: test_double_impl_python 14/69 Test #11: catalyst-abi-nm ....................................... Passed 1.29 sec Start 25: catalyst-impl-double-fortran 15/69 Test #25: catalyst-impl-double-fortran .......................... Passed 0.14 sec Start 26: t_conduit_node 16/69 Test #26: t_conduit_node ........................................ Passed 0.13 sec Start 27: t_conduit_node_compare 17/69 Test #23: test_catalyst_import_python ........................... 
Passed 1.26 sec Start 28: t_conduit_node_info 18/69 Test #27: t_conduit_node_compare ................................ Passed 0.10 sec Start 29: t_conduit_node_parent 19/69 Test #29: t_conduit_node_parent ................................. Passed 0.11 sec Start 30: t_conduit_node_paths 20/69 Test #28: t_conduit_node_info ................................... Passed 0.13 sec Start 31: t_conduit_node_set 21/69 Test #22: test_conduit_import_python ............................ Passed 1.44 sec Start 32: t_conduit_to_string 22/69 Test #30: t_conduit_node_paths .................................. Passed 0.13 sec Start 33: t_conduit_node_to_value 23/69 Test #31: t_conduit_node_set ....................................***Failed 0.13 sec Running main() from /usr/src/debug/gtest/googletest-1.17.0/googletest/src/gtest_main.cc [==========] Running 56 tests from 3 test suites. [----------] Global test environment set-up. [----------] 51 tests from conduit_node_set [ RUN ] conduit_node_set.set_bitwidth_uint_scalar [ OK ] conduit_node_set.set_bitwidth_uint_scalar (0 ms) [ RUN ] conduit_node_set.set_path_bitwidth_uint_scalar [ OK ] conduit_node_set.set_path_bitwidth_uint_scalar (0 ms) [ RUN ] conduit_node_set.set_external_bitwidth_uint_scalar [ OK ] conduit_node_set.set_external_bitwidth_uint_scalar (0 ms) [ RUN ] conduit_node_set.set_bitwidth_int_scalar [ OK ] conduit_node_set.set_bitwidth_int_scalar (0 ms) [ RUN ] conduit_node_set.set_path_bitwidth_int_scalar [ OK ] conduit_node_set.set_path_bitwidth_int_scalar (0 ms) [ RUN ] conduit_node_set.set_external_bitwidth_int_scalar [ OK ] conduit_node_set.set_external_bitwidth_int_scalar (0 ms) [ RUN ] conduit_node_set.set_bitwidth_float_scalar [ OK ] conduit_node_set.set_bitwidth_float_scalar (0 ms) [ RUN ] conduit_node_set.set_path_bitwidth_float_scalar [ OK ] conduit_node_set.set_path_bitwidth_float_scalar (0 ms) [ RUN ] conduit_node_set.set_external_bitwidth_float_scalar [ OK ] conduit_node_set.set_external_bitwidth_float_scalar (0 ms) [ 
RUN ] conduit_node_set.set_bitwidth_uint_ptr [ OK ] conduit_node_set.set_bitwidth_uint_ptr (0 ms) [ RUN ] conduit_node_set.set_path_bitwidth_uint_ptr [ OK ] conduit_node_set.set_path_bitwidth_uint_ptr (0 ms) [ RUN ] conduit_node_set.set_external_bitwidth_uint_ptr [2, 100, 8, ..., 32, 64] [2, 100, 8, ..., 32, 64] [2, 100, 8, ..., 32, 64] [2, 100, 8, ..., 32, 64] [ OK ] conduit_node_set.set_external_bitwidth_uint_ptr (0 ms) [ RUN ] conduit_node_set.set_path_external_bitwidth_uint_ptr two: lvl: [2, 100, 8, ..., 32, 64] two: lvl: [2, 100, 8, ..., 32, 64] two: lvl: [2, 100, 8, ..., 32, 64] two: lvl: [2, 100, 8, ..., 32, 64] [ OK ] conduit_node_set.set_path_external_bitwidth_uint_ptr (0 ms) [ RUN ] conduit_node_set.set_bitwidth_int_ptr [ OK ] conduit_node_set.set_bitwidth_int_ptr (0 ms) [ RUN ] conduit_node_set.set_path_external_bitwidth_int_ptr two: lvl: [-2, -100, -8, ..., -32, -64] two: lvl: [-2, -100, -8, ..., -32, -64] two: lvl: [-2, -100, -8, ..., -32, -64] two: lvl: [-2, -100, -8, ..., -32, -64] [ OK ] conduit_node_set.set_path_external_bitwidth_int_ptr (0 ms) [ RUN ] conduit_node_set.set_bitwidth_float_ptr [ OK ] conduit_node_set.set_bitwidth_float_ptr (0 ms) [ RUN ] conduit_node_set.set_path_bitwidth_float_ptr [ OK ] conduit_node_set.set_path_bitwidth_float_ptr (0 ms) [ RUN ] conduit_node_set.set_external_bitwidth_float_ptr [-0.800000011920929, -110.099998474121, -3.20000004768372, -6.40000009536743] [-0.8, -110.1, -3.2, -6.4] [ OK ] conduit_node_set.set_external_bitwidth_float_ptr (0 ms) [ RUN ] conduit_node_set.set_path_external_bitwidth_float_ptr two: lvl: [-0.800000011920929, -110.099998474121, -3.20000004768372, -6.40000009536743] two: lvl: [-0.8, -110.1, -3.2, -6.4] [ OK ] conduit_node_set.set_path_external_bitwidth_float_ptr (0 ms) [ RUN ] conduit_node_set.set_cstyle_native_int /usr/src/debug/paraview-catalyst/catalyst-v2.1.0/tests/conduit_tests/t_conduit_node_set.cpp:1197: Failure Expected equality of these values: 
  catalyst_conduit_datatype_is_signed_integer(n.c_dtype())
    Which is: 0
  true
/usr/src/debug/paraview-catalyst/catalyst-v2.1.0/tests/conduit_tests/t_conduit_node_set.cpp:1198: Failure
Expected equality of these values:
  catalyst_conduit_datatype_is_unsigned_integer(n.c_dtype())
    Which is: 1
  false
[ FAILED ] conduit_node_set.set_cstyle_native_int (0 ms)
[ RUN ] conduit_node_set.set_cstyle_native_int_ptr
[ OK ] conduit_node_set.set_cstyle_native_int_ptr (0 ms)
[ RUN ] conduit_node_set.set_cstyle_native_int_vec
[ OK ] conduit_node_set.set_cstyle_native_int_vec (0 ms)
[ RUN ] conduit_node_set.set_cstyle_unsigned_int
[ OK ] conduit_node_set.set_cstyle_unsigned_int (0 ms)
[ RUN ] conduit_node_set.set_cstyle_unsigned_int_ptr
[ OK ] conduit_node_set.set_cstyle_unsigned_int_ptr (0 ms)
[ RUN ] conduit_node_set.set_cstyle_unsigned_int_vec
[ OK ] conduit_node_set.set_cstyle_unsigned_int_vec (0 ms)
[ RUN ] conduit_node_set.set_cstyle_signed_int
[ OK ] conduit_node_set.set_cstyle_signed_int (0 ms)
[ RUN ] conduit_node_set.set_cstyle_signed_int_ptr
[ OK ] conduit_node_set.set_cstyle_signed_int_ptr (0 ms)
[ RUN ] conduit_node_set.set_cstyle_signed_int_vec
[ OK ] conduit_node_set.set_cstyle_signed_int_vec (0 ms)
[ RUN ] conduit_node_set.set_cstyle_float
[ OK ] conduit_node_set.set_cstyle_float (0 ms)
[ RUN ] conduit_node_set.set_cstyle_float_ptr
[ OK ] conduit_node_set.set_cstyle_float_ptr (0 ms)
[ RUN ] conduit_node_set.set_cstyle_float_vec
[ OK ] conduit_node_set.set_cstyle_float_vec (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_native_int
/usr/src/debug/paraview-catalyst/catalyst-v2.1.0/tests/conduit_tests/t_conduit_node_set.cpp:2281: Failure
Expected equality of these values:
  catalyst_conduit_datatype_is_signed_integer(nc.c_dtype())
    Which is: 0
  true
/usr/src/debug/paraview-catalyst/catalyst-v2.1.0/tests/conduit_tests/t_conduit_node_set.cpp:2282: Failure
Expected equality of these values:
  catalyst_conduit_datatype_is_unsigned_integer(nc.c_dtype())
    Which is: 1
  false
[ FAILED ] conduit_node_set.set_path_cstyle_native_int (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_native_int_ptr
[ OK ] conduit_node_set.set_path_cstyle_native_int_ptr (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_native_int_vec
[ OK ] conduit_node_set.set_path_cstyle_native_int_vec (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_unsigned_int
[ OK ] conduit_node_set.set_path_cstyle_unsigned_int (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_unsigned_int_ptr
[ OK ] conduit_node_set.set_path_cstyle_unsigned_int_ptr (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_unsigned_int_vec
[ OK ] conduit_node_set.set_path_cstyle_unsigned_int_vec (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_signed_int
[ OK ] conduit_node_set.set_path_cstyle_signed_int (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_signed_int_ptr
[ OK ] conduit_node_set.set_path_cstyle_signed_int_ptr (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_signed_int_vec
[ OK ] conduit_node_set.set_path_cstyle_signed_int_vec (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_float
[ OK ] conduit_node_set.set_path_cstyle_float (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_float_ptr
[ OK ] conduit_node_set.set_path_cstyle_float_ptr (0 ms)
[ RUN ] conduit_node_set.set_path_cstyle_float_vec
[ OK ] conduit_node_set.set_path_cstyle_float_vec (0 ms)
[ RUN ] conduit_node_set.set_node
two: lvl: [-0.800000011920929, -1.60000002384186, -3.20000004768372, -6.40000009536743]
two: lvl: [-0.800000011920929, -110.099998474121, -3.20000004768372, -6.40000009536743]
two: lvl: [-0.8, -1.6, -3.2, -6.4]
two: lvl: [-0.800000011920929, -110.099998474121, -3.20000004768372, -6.40000009536743]
one: two: lvl: [-0.8, -110.1, -3.2, -6.4]
[ OK ] conduit_node_set.set_node (0 ms)
[ RUN ] conduit_node_set.set_external_node
two: lvl: [-0.800000011920929, -1.60000002384186, -3.20000004768372, -6.40000009536743]
two: lvl: [-0.800000011920929, -110.099998474121, -3.20000004768372, -6.40000009536743]
two: lvl: [-0.8, -1.6, -3.2, -6.4]
two: lvl: [-0.800000011920929, -110.099998474121, -3.20000004768372, -6.40000009536743]
one: two: lvl: [-0.8, -110.1, -3.2, -6.4]
[ OK ] conduit_node_set.set_external_node (0 ms)
[ RUN ] conduit_node_set.set_string
[ OK ] conduit_node_set.set_string (0 ms)
[ RUN ] conduit_node_set.set_path_string
[ OK ] conduit_node_set.set_path_string (0 ms)
[ RUN ] conduit_node_set.set_string_multiple
[ OK ] conduit_node_set.set_string_multiple (0 ms)
[ RUN ] conduit_node_set.set_vector
i8: -8
i16: -16
i32: -32
i64: -64
...
( skipped 3 children )
ui64: 64
f32: 2.71828007698059
f64: 3.1415
[ OK ] conduit_node_set.set_vector (0 ms)
[ RUN ] conduit_node_set.set_path_vector
i8: -8
i16: -16
i32: -32
i64: -64
...
( skipped 3 children )
ui64: 64
f32: 2.71828007698059
f64: 3.1415
[ OK ] conduit_node_set.set_path_vector (0 ms)
[ RUN ] conduit_node_set.set_vector_external
i8: 0
i16: 0
i32: 0
i64: 0
...
( skipped 3 children )
ui64: 0
f32: 0.0
f64: 0.0
i8: -8
i16: -16
i32: -32
i64: -64
...
( skipped 3 children )
ui64: 64
f32: 2.71828007698059
f64: 3.1415
i8: -8
i16: -16
i32: -32
i64: -64
...
( skipped 3 children )
ui64: 64
f32: 2.71828007698059
f64: 3.1415
[ OK ] conduit_node_set.set_vector_external (0 ms)
[----------] 51 tests from conduit_node_set (3 ms total)

[----------] 2 tests from conduit_node_set_
[ RUN ] conduit_node_set_.set_path_bitwidth_int_ptr
[ OK ] conduit_node_set_.set_path_bitwidth_int_ptr (0 ms)
[ RUN ] conduit_node_set_.set_external_bitwidth_int_ptr
[-2, -100, -8, ..., -32, -64]
[-2, -100, -8, ..., -32, -64]
[-2, -100, -8, ..., -32, -64]
[-2, -100, -8, ..., -32, -64]
[ OK ] conduit_node_set_.set_external_bitwidth_int_ptr (0 ms)
[----------] 2 tests from conduit_node_set_ (0 ms total)

[----------] 3 tests from conduit_node
[ RUN ] conduit_node.node_set_existing_char8
a: 10
b: "my value"
a: 10
b: "my value"
[ OK ] conduit_node.node_set_existing_char8 (0 ms)
[ RUN ] conduit_node.node_set_existing_obj
[ OK ] conduit_node.node_set_existing_obj (0 ms)
[ RUN ] conduit_node.node_set_overload_ambig
[ OK ] conduit_node.node_set_overload_ambig (0 ms)
[----------] 3 tests from conduit_node (0 ms total)

[----------] Global test environment tear-down
[==========] 56 tests from 3 test suites ran. (3 ms total)
[ PASSED ] 54 tests.
[ FAILED ] 2 tests, listed below:
[ FAILED ] conduit_node_set.set_cstyle_native_int
[ FAILED ] conduit_node_set.set_path_cstyle_native_int

 2 FAILED TESTS
    Start 34: test-build-replay_high_num_ranks
24/69 Test #32: t_conduit_to_string ................................... Passed 0.12 sec
    Start 36: test-build-replay_ranks_mismatch
25/69 Test #33: t_conduit_node_to_value ............................... Passed 0.14 sec
    Start 38: test-build-replay_high_num_execute_invc
26/69 Test #24: test_double_impl_python ............................... Passed 1.46 sec
    Start 40: test-build-replay_no_data_dump_dir
27/69 Test #2: install ............................................... Passed 5.01 sec
    Start 4: example-install-about
28/69 Test #3: example-build-about ................................... Passed 15.84 sec
    Start 6: example-install-adaptor0
29/69 Test #4: example-install-about ................................. Passed 16.64 sec
    Start 8: example-install-replay
30/69 Test #5: example-build-adaptor0 ................................ Passed 24.29 sec
    Start 35: test-install-replay_high_num_ranks
31/69 Test #34: test-build-replay_high_num_ranks ......................***Failed 29.91 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.

The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (7.2s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/high_num_ranks_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/high_num_ranks_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable high_num_ranks_driver
[4/5] Building CXX object CMakeFiles/high_num_ranks_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so
Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_ranks_write_prepare
1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_ranks_write_prepare ..... Passed 0.03 sec
test 2
    Start 2: high_num_ranks_write_out
2: Test command: /usr/bin/mpiexec "-n" "10" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/high_num_ranks_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks
2: Environment variables:
2: CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/data_dump/
2: Test timeout computed to be: 1500
2: --------------------------------------------------------------------------
2: There are not enough slots available in the system to satisfy the 10
2: slots that were requested by the application:
2:
2: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/high_num_ranks_driver
2:
2: Either request fewer procs for your application, or make more slots
2: available for use.
2:
2: A "slot" is the PRRTE term for an allocatable unit where we can
2: launch a process. The number of slots available are defined by the
2: environment in which PRRTE processes are run:
2:
2: 1. Hostfile, via "slots=N" clauses (N defaults to number of
2: processor cores if not provided)
2: 2. The --host command line parameter, via a ":N" suffix on the
2: hostname (N defaults to 1 if not provided)
2: 3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
2: 4. If none of a hostfile, the --host command line parameter, or an
2: RM is present, PRRTE defaults to the number of processor cores
2:
2: In all the above cases, if you want PRRTE to default to the number
2: of hardware threads instead of the number of processor cores, use the
2: --use-hwthread-cpus option.
2:
2: Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
2: number of available slots when deciding the number of processes to
2: launch.
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_ranks_write_out .........***Failed 3.37 sec
--------------------------------------------------------------------------
There are not enough slots available in the system to satisfy the 10
slots that were requested by the application:

  /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_ranks/high_num_ranks_driver

Either request fewer procs for your application, or make more slots
available for use.

A "slot" is the PRRTE term for an allocatable unit where we can
launch a process. The number of slots available are defined by the
environment in which PRRTE processes are run:

  1. Hostfile, via "slots=N" clauses (N defaults to number of
     processor cores if not provided)
  2. The --host command line parameter, via a ":N" suffix on the
     hostname (N defaults to 1 if not provided)
  3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
  4. If none of a hostfile, the --host command line parameter, or an
     RM is present, PRRTE defaults to the number of processor cores

In all the above cases, if you want PRRTE to default to the number
of hardware threads instead of the number of processor cores, use the
--use-hwthread-cpus option.

Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
number of available slots when deciding the number of processes to
launch.
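[editor's note] The builder evidently exposes fewer than 10 cores, so the 10 requested slots cannot be satisfied. The message above already names the two remedies; as hedged command sketches (run from the build-replay_high_num_ranks directory, paths shortened for illustration):

```shell
# Sketch only: oversubscribe lets 10 ranks share however many cores exist.
mpiexec --map-by :OVERSUBSCRIBE -n 10 ./high_num_ranks_driver ./catalyst

# Or count hardware threads rather than physical cores when sizing slots:
mpiexec --use-hwthread-cpus -n 10 ./high_num_ranks_driver ./catalyst
```

For a packaging build, passing such flags would have to happen via the test harness (ctest invokes mpiexec itself), e.g. through OMPI_MCA environment variables rather than editing the test.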
--------------------------------------------------------------------------
test 3
    Start 3: high_num_ranks_read_in
Failed test dependencies: high_num_ranks_write_out
3/3 Test #3: high_num_ranks_read_in ...........***Not Run 0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) = 3.41 sec

The following tests FAILED:
  2 - high_num_ranks_write_out (Failed)
  3 - high_num_ranks_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
    Start 37: test-install-replay_ranks_mismatch
32/69 Test #36: test-build-replay_ranks_mismatch ......................***Failed 30.19 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (7.4s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/replay_ranks_mismatch_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/replay_ranks_mismatch_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable replay_ranks_mismatch_driver
[4/5] Building CXX object CMakeFiles/replay_ranks_mismatch_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so
Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: replay_ranks_mismatch_prepare
1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch
1: Test timeout computed to be: 1500
1/3 Test #1: replay_ranks_mismatch_prepare ..... Passed 0.04 sec
test 2
    Start 2: replay_ranks_mismatch_write_out
2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch/replay_ranks_mismatch_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch
2: Environment variables:
2: CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_ranks_mismatch/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaaac000000
2: Acquired Address: 0x3f806fa000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:04574] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:04574] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/3 Test #2: replay_ranks_mismatch_write_out ...***Failed 3.99 sec
[arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:04457] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
Requested Address: 0x2aaaac000000
Acquired Address: 0x3f806fa000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:04574] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:04574] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: replay_ranks_mismatch_read_in
Failed test dependencies: replay_ranks_mismatch_write_out
3/3 Test #3: replay_ranks_mismatch_read_in .....***Not Run 0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) = 4.03 sec

The following tests FAILED:
  2 - replay_ranks_mismatch_write_out (Failed)
  3 - replay_ranks_mismatch_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
    Start 39: test-install-replay_high_num_execute_invc
33/69 Test #40: test-build-replay_no_data_dump_dir ....................***Failed 30.02 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (7.0s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/no_data_dump_dir_adapter.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/no_data_dump_dir_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable no_data_dump_dir_driver
[4/5] Building CXX object CMakeFiles/no_data_dump_dir_adapter.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so
Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: no_write_out
1: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir/no_data_dump_dir_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir/catalyst"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_no_data_dump_dir
1: Test timeout computed to be: 1500
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
1: --------------------------------------------------------------------------
1: The gds/shmem2 component attempted to attach to a shared-memory segment at a
1: particular base address, but was given a different one. Your job will now likely
1: abort.
1:
1: Requested Address: 0x2aaab8000000
1: Acquired Address: 0x3f815a3000
1:
1: If this problem persists, please consider disabling the gds/shmem2 component by
1: setting in your environment the following: PMIX_MCA_gds=hash
1: --------------------------------------------------------------------------
1:
1: [arch-nspawn-3003276:04584] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
1: [arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
1: [arch-nspawn-3003276:04576] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
1: --------------------------------------------------------------------------
1: PMIx_Init failed for the following reason:
1:
1: PMIX_ERROR
1:
1: Open MPI requires access to a local PMIx server to execute. Please ensure
1: that either you are operating in a PMIx-enabled environment, or use "mpirun"
1: to execute the job.
1: --------------------------------------------------------------------------
1:
1: *** An error occurred in MPI_Init
1: *** on a NULL communicator
1: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
1: *** and MPI will try to terminate your MPI job as well)
1: [arch-nspawn-3003276:04584] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
1: --------------------------------------------------------------------------
1: prte detected that one or more processes exited with non-zero status,
1: thus causing the job to be terminated. The first process to do so was:
1:
1: Process name: [prte-arch-nspawn-3003276-4418@1,1]
1: Exit code: 14
1: --------------------------------------------------------------------------
1/2 Test #1: no_write_out .....................***Failed 5.20 sec
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
Requested Address: 0x2aaab8000000
Acquired Address: 0x3f815a3000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:04584] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:04418] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:04576] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------

*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:04584] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
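[editor's note] The gds/shmem2 failure above names its own workaround: the component tried to map a shared-memory segment at a fixed address that the nspawn container could not honor, so PMIx suggests falling back to the hash component. A hedged sketch (driver path shortened for illustration; assumes the OpenPMIx/Open MPI stack shown in this log):

```shell
# Disable the shmem2 gds component for all PMIx processes in this session,
# then rerun the failing test command. For pkgctl/chroot builds this would
# need to be exported in the build environment before ctest runs.
export PMIX_MCA_gds=hash
mpiexec -n 2 ./no_data_dump_dir_driver ./catalyst
```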
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

Process name: [prte-arch-nspawn-3003276-4418@1,1]
Exit code: 14
--------------------------------------------------------------------------
test 2
    Start 2: no_data_dump_dir_read_in
Failed test dependencies: no_write_out
2/2 Test #2: no_data_dump_dir_read_in .........***Not Run 0.00 sec

0% tests passed, 2 tests failed out of 2

Total Test time (real) = 5.21 sec

The following tests FAILED:
  1 - no_write_out (Failed)
  2 - no_data_dump_dir_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
    Start 41: test-install-replay_no_data_dump_dir
34/69 Test #7: example-build-replay ..................................***Failed 32.46 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/examples/build-replay
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (7.3s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/examples/build-replay
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/examples/build-replay'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/catalyst_example_replay_adaptor.dir/catalyst_impl_example_replay_adaptor.c.o
[2/5] Building CXX object CMakeFiles/replay_test.dir/replay_test.cxx.o
[3/5] Linking CXX executable replay_test
[4/5] Building CXX object CMakeFiles/catalyst_example_replay_adaptor.dir/catalyst_example_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-example_replay_adaptor.so
Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/examples/build-replay/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/examples/build-replay/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/examples/build-replay
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: replay_test_prepare
1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/examples/build-replay/data_dump"
1: Working Directory: /build/paraview-catalyst/src/build/examples/build-replay
1: Test timeout computed to be: 1500
1/3 Test #1: replay_test_prepare .............. Passed 0.03 sec
test 2
    Start 2: replay_test_write_out
2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/examples/build-replay/replay_test" "/build/paraview-catalyst/src/build/examples/build-replay/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/examples/build-replay
2: Environment variables:
2: CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/examples/build-replay/data_dump
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2: 2: Requested Address: 0x2aaac8000000 2: Acquired Address: 0x3fb1e51000 2: 2: If this problem persists, please consider disabling the gds/shmem2 component by 2: setting in your environment the following: PMIX_MCA_gds=hash 2: -------------------------------------------------------------------------- 2: 2: [arch-nspawn-3003276:04575] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 2: [arch-nspawn-3003276:04572] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 2: *** An error occurred in MPI_Init 2: *** on a NULL communicator 2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, 2: *** and MPI will try to terminate your MPI job as well) 2: [arch-nspawn-3003276:04575] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 2: -------------------------------------------------------------------------- 2: PMIx_Init failed for the following reason: 2: 2: PMIX_ERROR 2: 2: Open MPI requires access to a local PMIx server to execute. Please ensure 2: that either you are operating in a PMIx-enabled environment, or use "mpirun" 2: to execute the job. 
2: -------------------------------------------------------------------------- 2: -------------------------------------------------------------------------- 2: prte detected that one or more processes exited with non-zero status, 2: thus causing the job to be terminated. The first process to do so was: 2: 2: Process name: [prte-arch-nspawn-3003276-4468@1,1] 2: Exit code: 14 2: -------------------------------------------------------------------------- 2: [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERR_UNREACH in file server/pmix_server.c at line 3110 2: 2/3 Test #2: replay_test_write_out ............***Failed 3.62 sec [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 -------------------------------------------------------------------------- The gds/shmem2 component attempted to attach to a shared-memory segment at a particular base address, but was given a different one. Your job will now likely abort. 
Requested Address: 0x2aaac8000000 Acquired Address: 0x3fb1e51000 If this problem persists, please consider disabling the gds/shmem2 component by setting in your environment the following: PMIX_MCA_gds=hash -------------------------------------------------------------------------- [arch-nspawn-3003276:04575] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 [arch-nspawn-3003276:04572] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 *** An error occurred in MPI_Init *** on a NULL communicator *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, *** and MPI will try to terminate your MPI job as well) [arch-nspawn-3003276:04575] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! -------------------------------------------------------------------------- PMIx_Init failed for the following reason: PMIX_ERROR Open MPI requires access to a local PMIx server to execute. Please ensure that either you are operating in a PMIx-enabled environment, or use "mpirun" to execute the job. 
-------------------------------------------------------------------------- -------------------------------------------------------------------------- prte detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [prte-arch-nspawn-3003276-4468@1,1] Exit code: 14 -------------------------------------------------------------------------- [arch-nspawn-3003276:04468] PMIX ERROR: PMIX_ERR_UNREACH in file server/pmix_server.c at line 3110 test 3 Start 3: replay_test_read_in Failed test dependencies: replay_test_write_out 3/3 Test #3: replay_test_read_in ..............***Not Run 0.00 sec 33% tests passed, 2 tests failed out of 3 Total Test time (real) = 3.66 sec The following tests FAILED: 2 - replay_test_write_out (Failed) 3 - replay_test_read_in (Not Run) Errors while running CTest Test command failed: /usr/bin/ctest CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message): test or example failed! Start 42: test-build-replay_missing_initialize_data 35/69 Test #38: test-build-replay_high_num_execute_invc ...............***Failed 30.40 sec Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc ======== CMake output ====== CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required): Compatibility with CMake < 3.10 will be removed from a future version of CMake. Update the VERSION argument value. Or, use the ... syntax to tell CMake that the project requires at least but has been updated to work with policies introduced by or earlier. 
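All of the replay failures above share one root cause: PMIx's gds/shmem2 component cannot map its shared-memory segment at the requested base address inside the systemd-nspawn container on riscv64, and the error text itself names the workaround. A minimal sketch of applying it before re-running the failing tests; the variable name comes verbatim from the log, but exporting it in the chroot's build environment (e.g. from the PKGBUILD's check()) is my assumption:

```shell
# Disable the PMIx gds/shmem2 component and fall back to the hash
# component, as the error message above suggests. This avoids the
# fixed-base-address shared-memory attach that fails in the container.
export PMIX_MCA_gds=hash

# The failing test could then be retried on its own, e.g.:
#   ctest -V -R replay_test_write_out
echo "PMIX_MCA_gds=$PMIX_MCA_gds"
```

If this cures the MPI_Init aborts, the remaining failures in this log should be unrelated (see the slot-count error further down).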
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (7.2s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/high_num_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/high_num_execute_invc_driver.dir/high_num_execute_invc_driver.cxx.o
[3/5] Linking CXX executable high_num_execute_invc_driver
[4/5] Building CXX object CMakeFiles/high_num_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration  from :/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc/data_dump"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_execute_invc_prepare .....   Passed    0.04 sec
test 2
    Start 2: high_num_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc/high_num_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_high_num_execute_invc/data_dump
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaab8000000
2: Acquired Address: 0x3f87b80000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:04579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: [arch-nspawn-3003276:04573] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:04579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:    Process name: [prte-arch-nspawn-3003276-4458@1,1]
2:    Exit code:    14
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_execute_invc_write_out ...***Failed    4.22 sec
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaab8000000
Acquired Address: 0x3f87b80000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:04579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:04458] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:04573] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:04579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-4458@1,1]
   Exit code:    14
--------------------------------------------------------------------------
test 3
    Start 3: high_num_execute_invc_read_in
Failed test dependencies: high_num_execute_invc_write_out
3/3 Test #3: high_num_execute_invc_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   4.27 sec

The following tests FAILED:
	  2 - high_num_execute_invc_write_out (Failed)
	  3 - high_num_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 43: test-install-replay_missing_initialize_data
36/69 Test  #6: example-install-adaptor0 .............................. Passed   30.12 sec
        Start 44: test-build-replay_missing_execute_invc
37/69 Test #35: test-install-replay_high_num_ranks ....................***Failed   29.09 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.

The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (11.2s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/high_num_ranks_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/high_num_ranks_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable high_num_ranks_driver
[4/5] Building CXX object CMakeFiles/high_num_ranks_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration  from :/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_ranks_write_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_ranks_write_prepare .....   Passed    0.12 sec
test 2
    Start 2: high_num_ranks_write_out

2: Test command: /usr/bin/mpiexec "-n" "10" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/high_num_ranks_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/data_dump/
2: Test timeout computed to be: 1500
2: --------------------------------------------------------------------------
2: There are not enough slots available in the system to satisfy the 10
2: slots that were requested by the application:
2:
2:   /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/high_num_ranks_driver
2:
2: Either request fewer procs for your application, or make more slots
2: available for use.
2:
2: A "slot" is the PRRTE term for an allocatable unit where we can
2: launch a process. The number of slots available are defined by the
2: environment in which PRRTE processes are run:
2:
2:   1. Hostfile, via "slots=N" clauses (N defaults to number of
2:      processor cores if not provided)
2:   2. The --host command line parameter, via a ":N" suffix on the
2:      hostname (N defaults to 1 if not provided)
2:   3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
2:   4. If none of a hostfile, the --host command line parameter, or an
2:      RM is present, PRRTE defaults to the number of processor cores
2:
2: In all the above cases, if you want PRRTE to default to the number
2: of hardware threads instead of the number of processor cores, use the
2: --use-hwthread-cpus option.
2:
2: Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
2: number of available slots when deciding the number of processes to
2: launch.
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_ranks_write_out .........***Failed    0.29 sec
--------------------------------------------------------------------------
There are not enough slots available in the system to satisfy the 10
slots that were requested by the application:

  /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_ranks/high_num_ranks_driver

Either request fewer procs for your application, or make more slots
available for use.

A "slot" is the PRRTE term for an allocatable unit where we can
launch a process. The number of slots available are defined by the
environment in which PRRTE processes are run:

  1. Hostfile, via "slots=N" clauses (N defaults to number of
     processor cores if not provided)
  2. The --host command line parameter, via a ":N" suffix on the
     hostname (N defaults to 1 if not provided)
  3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
  4. If none of a hostfile, the --host command line parameter, or an
     RM is present, PRRTE defaults to the number of processor cores

In all the above cases, if you want PRRTE to default to the number
of hardware threads instead of the number of processor cores, use the
--use-hwthread-cpus option.

Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
number of available slots when deciding the number of processes to
launch.
--------------------------------------------------------------------------
test 3
    Start 3: high_num_ranks_read_in
Failed test dependencies: high_num_ranks_write_out
3/3 Test #3: high_num_ranks_read_in ...........***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   0.41 sec

The following tests FAILED:
	  2 - high_num_ranks_write_out (Failed)
	  3 - high_num_ranks_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 45: test-install-replay_missing_execute_invc
38/69 Test  #8: example-install-replay ................................***Failed   33.65 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/examples/install-replay
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (11.9s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/examples/install-replay
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/examples/install-replay'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
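The high_num_ranks failure above is not a PMIx bug: `mpiexec -n 10` asks for more slots than the builder's cores provide, and PRRTE refuses to oversubscribe by default. A sketch of the two remedies the message itself lists; whether the Catalyst test harness lets you inject either one is an assumption, and the MCA variable spelling should be verified against the prrte release in the build root:

```shell
# Remedy 1: pass the flag directly on the launch line
# (flag taken verbatim from the PRRTE message above):
#   mpiexec --map-by :OVERSUBSCRIBE -n 10 ./high_num_ranks_driver ...
#
# Remedy 2: make oversubscription the default via PRRTE's MCA
# environment, so the harness's own mpiexec invocation inherits it
# (assumption: this parameter name matches the installed prrte):
export PRTE_MCA_rmaps_default_mapping_policy=":oversubscribe"
echo "$PRTE_MCA_rmaps_default_mapping_policy"
```

Remedy 2 is the more packaging-friendly option, since the `mpiexec` command lines here are generated by CTest and not easily edited.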
Run Build Command(s): /usr/bin/ninja [1/5] Building C object CMakeFiles/catalyst_example_replay_adaptor.dir/catalyst_impl_example_replay_adaptor.c.o [2/5] Building CXX object CMakeFiles/replay_test.dir/replay_test.cxx.o [3/5] Linking CXX executable replay_test [4/5] Building CXX object CMakeFiles/catalyst_example_replay_adaptor.dir/catalyst_example_replay_adaptor.cxx.o [5/5] Linking CXX shared module catalyst/libcatalyst-example_replay_adaptor.so Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure" UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/examples/install-replay/DartConfiguration.tcl Parse Config file:/build/paraview-catalyst/src/build/examples/install-replay/DartConfiguration.tcl Test project /build/paraview-catalyst/src/build/examples/install-replay Constructing a list of tests Done constructing a list of tests Updating test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... Checking test dependency graph end test 1 Start 1: replay_test_prepare 1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/examples/install-replay/data_dump" 1: Working Directory: /build/paraview-catalyst/src/build/examples/install-replay 1: Test timeout computed to be: 1500 1/3 Test #1: replay_test_prepare .............. 
Passed 0.12 sec test 2 Start 2: replay_test_write_out 2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/examples/install-replay/replay_test" "/build/paraview-catalyst/src/build/examples/install-replay/catalyst" 2: Working Directory: /build/paraview-catalyst/src/build/examples/install-replay 2: Environment variables: 2: CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/examples/install-replay/data_dump 2: Test timeout computed to be: 1500 2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 2: -------------------------------------------------------------------------- 2: The gds/shmem2 component attempted to attach to a shared-memory segment at a 2: particular base address, but was given a different one. Your job will now likely 2: abort. 
2:
2: Requested Address: 0x2aaac0000000
2: Acquired Address: 0x3fa2afc000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:06329] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: [arch-nspawn-3003276:06330] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:06329] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:06330] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:   Process name: [prte-arch-nspawn-3003276-6320@1,0]
2:   Exit code: 14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERR_UNREACH in file server/pmix_server.c at line 3110
2:
2/3 Test #2: replay_test_write_out ............***Failed    0.55 sec
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaac0000000
Acquired Address: 0x3fa2afc000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:06329] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:06330] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06329] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06330] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [prte-arch-nspawn-3003276-6320@1,0]
  Exit code: 14
--------------------------------------------------------------------------
[arch-nspawn-3003276:06320] PMIX ERROR: PMIX_ERR_UNREACH in file server/pmix_server.c at line 3110

test 3
    Start 3: replay_test_read_in
Failed test dependencies: replay_test_write_out
3/3 Test #3: replay_test_read_in ..............***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) = 0.68 sec

The following tests FAILED:
    2 - replay_test_write_out (Failed)
    3 - replay_test_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

    Start 46: test-build-replay_high_num_ranks_fortran
39/69 Test #37: test-install-replay_ranks_mismatch ....................***Failed   31.03 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
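The PMIx error text above proposes its own workaround: disable the gds/shmem2 component, which fails here because the shared-memory segment cannot be mapped at the requested fixed base address inside the nspawn container. A minimal sketch of applying that suggestion before re-running the test suite (the ctest invocation is illustrative; any of the failing replay tests would be re-run the same way):

```shell
# Workaround suggested by the PMIx error message itself: force the
# hash (non-shared-memory) gds component so PMIx no longer tries to
# attach a segment at a fixed base address inside the container.
export PMIX_MCA_gds=hash

# Confirm the variable is exported for child processes (mpiexec, ctest, ...):
echo "PMIX_MCA_gds=${PMIX_MCA_gds}"

# Then re-run the failing step, e.g. (illustrative):
# ctest --output-on-failure
```

Whether this is sufficient on riscv64 under systemd-nspawn is untested here; it only addresses the gds/shmem2 attach failure, not any separate PMIx server reachability issue.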
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (10.1s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/replay_ranks_mismatch_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/replay_ranks_mismatch_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable replay_ranks_mismatch_driver
[4/5] Building CXX object CMakeFiles/replay_ranks_mismatch_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: replay_ranks_mismatch_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch
1: Test timeout computed to be: 1500
1/3 Test #1: replay_ranks_mismatch_prepare .....   Passed    0.10 sec
test 2
    Start 2: replay_ranks_mismatch_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch/replay_ranks_mismatch_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_ranks_mismatch/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaac8000000
2: Acquired Address: 0x3f95972000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:06641] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:06641] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/3 Test #2: replay_ranks_mismatch_write_out ...***Failed    0.46 sec
[arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06636] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaac8000000
Acquired Address: 0x3f95972000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:06641] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06641] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: replay_ranks_mismatch_read_in
Failed test dependencies: replay_ranks_mismatch_write_out
3/3 Test #3: replay_ranks_mismatch_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) = 0.57 sec

The following tests FAILED:
    2 - replay_ranks_mismatch_write_out (Failed)
    3 - replay_ranks_mismatch_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

    Start 47: test-install-replay_high_num_ranks_fortran
40/69 Test #39: test-install-replay_high_num_execute_invc .............***Failed   31.71 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (10.9s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/high_num_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/high_num_execute_invc_driver.dir/high_num_execute_invc_driver.cxx.o
[3/5] Linking CXX executable high_num_execute_invc_driver
[4/5] Building CXX object CMakeFiles/high_num_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc/data_dump"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_execute_invc_prepare .....   Passed    0.11 sec
test 2
    Start 2: high_num_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc/high_num_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_high_num_execute_invc/data_dump
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaacc000000
2: Acquired Address: 0x3f91349000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:06726] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: [arch-nspawn-3003276:06723] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2:
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:06723] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:   Process name: [prte-arch-nspawn-3003276-6686@1,0]
2:   Exit code: 14
2: --------------------------------------------------------------------------
2/3 Test #2: high_num_execute_invc_write_out ...***Failed    0.69 sec
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaacc000000
Acquired Address: 0x3f91349000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:06726] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06686] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:06723] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06723] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [prte-arch-nspawn-3003276-6686@1,0]
  Exit code: 14
--------------------------------------------------------------------------
test 3
    Start 3: high_num_execute_invc_read_in
Failed test dependencies: high_num_execute_invc_write_out
3/3 Test #3: high_num_execute_invc_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) = 0.81 sec

The following tests FAILED:
    2 - high_num_execute_invc_write_out (Failed)
    3 - high_num_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

    Start 48: test-build-replay_ranks_mismatch_fortran
41/69 Test #43: test-install-replay_missing_initialize_data ...........***Failed   32.10 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (10.9s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/missing_initialize_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/missing_initialize_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable missing_initialize_driver
[4/5] Building CXX object CMakeFiles/missing_initialize_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_initialize_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data
1: Test timeout computed to be: 1500
1/4 Test #1: missing_initialize_prepare .......   Passed    0.05 sec
test 2
    Start 2: missing_initialize_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data/missing_initialize_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_initialize_data/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaab8000000
2: Acquired Address: 0x3fab92f000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:06840] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:06840] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:   Process name: [prte-arch-nspawn-3003276-6816@1,0]
2:   Exit code: 14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
2/4 Test #2: missing_initialize_write_out .....***Failed    0.64 sec
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaab8000000
Acquired Address: 0x3fab92f000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:06840] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06840] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [prte-arch-nspawn-3003276-6816@1,0]
  Exit code: 14
--------------------------------------------------------------------------
[arch-nspawn-3003276:06816] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
test 3
    Start 3: remove_initialize
Failed test dependencies: missing_initialize_write_out
3/4 Test #3: remove_initialize ................***Not Run   0.00 sec
test 4
    Start 4: missing_initialize_read_in
Failed test dependencies: remove_initialize
4/4 Test #4: missing_initialize_read_in .......***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) = 0.70 sec

The following tests FAILED:
    2 - missing_initialize_write_out (Failed)
    3 - remove_initialize (Not Run)
    4 - missing_initialize_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

    Start 49: test-install-replay_ranks_mismatch_fortran
42/69 Test #42: test-build-replay_missing_initialize_data .............***Failed   32.14 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1 The CXX compiler identification is GNU 15.2.1 Detecting C compiler ABI info Detecting C compiler ABI info - done Check for working C compiler: /usr/bin/cc - skipped Detecting C compile features Detecting C compile features - done Detecting CXX compiler ABI info Detecting CXX compiler ABI info - done Check for working CXX compiler: /usr/bin/c++ - skipped Detecting CXX compile features Detecting CXX compile features - done Found MPI_C: /lib/libmpi.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: C Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed Configuring done (10.8s) Generating done (0.1s) Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data ======== End CMake output ====== Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data' Run Clean Command: /usr/bin/ninja clean [1/1] Cleaning all built files... Cleaning... 0 files. 
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/missing_initialize_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/missing_initialize_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable missing_initialize_driver
[4/5] Building CXX object CMakeFiles/missing_initialize_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_initialize_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data
1: Test timeout computed to be: 1500
1/4 Test #1: missing_initialize_prepare .......
   Passed    0.11 sec
test 2
    Start 2: missing_initialize_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data/missing_initialize_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_initialize_data/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2: 
2: Requested Address: 0x2aaab4000000
2: Acquired Address: 0x3fb86ea000
2: 
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2: 
2: [arch-nspawn-3003276:06841] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2: 
2:   PMIX_ERROR
2: 
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2: 
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:06841] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated.
The first process to do so was:
2: 
2:    Process name: [prte-arch-nspawn-3003276-6817@1,0]
2:    Exit code:    14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
2/4 Test #2: missing_initialize_write_out .....***Failed    0.61 sec
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address: 0x2aaab4000000
  Acquired Address: 0x3fb86ea000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:06841] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06841] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-6817@1,0]
   Exit code:    14
--------------------------------------------------------------------------
[arch-nspawn-3003276:06817] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103

test 3
    Start 3: remove_initialize
Failed test dependencies: missing_initialize_write_out
3/4 Test #3: remove_initialize ................***Not Run   0.00 sec

test 4
    Start 4: missing_initialize_read_in
Failed test dependencies: remove_initialize
4/4 Test #4: missing_initialize_read_in .......***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   0.73 sec

The following tests FAILED:
	  2 - missing_initialize_write_out (Failed)
	  3 - remove_initialize (Not Run)
	  4 - missing_initialize_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
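Every MPI-launched replay test in this run fails the same way, and the PMIx error text above names its own workaround: `PMIX_MCA_gds=hash`. A sketch of how it could be applied before re-running the suite; the `check()` wrapper below is hypothetical, since the actual PKGBUILD contents are not shown in this log:

```shell
# Workaround taken from the PMIx error message above: the gds/shmem2
# component needs to attach a shared-memory segment at a fixed base
# address, which the riscv64 nspawn container's address-space layout
# does not honor. Forcing the hash component avoids shared memory.
export PMIX_MCA_gds=hash

# Hypothetical PKGBUILD check() hook (the real PKGBUILD is not shown
# here): scope the variable to the test run only.
check() {
  cd "${srcdir}/build"
  PMIX_MCA_gds=hash ctest --output-on-failure
}
```

Whether this makes the replay tests pass on riscv64 would still need to be verified; it only addresses the gds/shmem2 attach failure, not any later PMIx connectivity issues.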
        Start 50: test-build-replay_high_num_execute_invc_fortran
43/69 Test #41: test-install-replay_no_data_dump_dir ..................***Failed   32.62 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.

The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (10.9s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/no_data_dump_dir_adapter.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/no_data_dump_dir_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable no_data_dump_dir_driver
[4/5] Building CXX object CMakeFiles/no_data_dump_dir_adapter.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: no_write_out

1: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir/no_data_dump_dir_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir/catalyst"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_no_data_dump_dir
1: Test timeout computed to be: 1500
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
1: --------------------------------------------------------------------------
1: The gds/shmem2 component attempted to attach to a shared-memory segment at a
1: particular base address, but was given a different one. Your job will now likely
1: abort.
1: 
1: Requested Address: 0x2aaac8000000
1: Acquired Address: 0x3fac589000
1: 
1: If this problem persists, please consider disabling the gds/shmem2 component by
1: setting in your environment the following: PMIX_MCA_gds=hash
1: --------------------------------------------------------------------------
1: 
1: [arch-nspawn-3003276:06871] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
1: [arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
1: [arch-nspawn-3003276:06868] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
1: *** An error occurred in MPI_Init
1: *** on a NULL communicator
1: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
1: ***    and MPI will try to terminate your MPI job as well)
1: [arch-nspawn-3003276:06868] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
1: *** An error occurred in MPI_Init
1: *** on a NULL communicator
1: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
1: ***    and MPI will try to terminate your MPI job as well)
1: [arch-nspawn-3003276:06871] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
1: --------------------------------------------------------------------------
1: prte detected that one or more processes exited with non-zero status,
1: thus causing the job to be terminated. The first process to do so was:
1: 
1:    Process name: [prte-arch-nspawn-3003276-6853@1,0]
1:    Exit code:    14
1: --------------------------------------------------------------------------
1/2 Test #1: no_write_out .....................***Failed    0.91 sec
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
  Requested Address: 0x2aaac8000000
  Acquired Address: 0x3fac589000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:06871] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:06853] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:06868] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06868] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:06871] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-6853@1,0]
   Exit code:    14
--------------------------------------------------------------------------

test 2
    Start 2: no_data_dump_dir_read_in
Failed test dependencies: no_write_out
2/2 Test #2: no_data_dump_dir_read_in .........***Not Run   0.00 sec

0% tests passed, 2 tests failed out of 2

Total Test time (real) =   0.92 sec

The following tests FAILED:
	  1 - no_write_out (Failed)
	  2 - no_data_dump_dir_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 51: test-install-replay_high_num_execute_invc_fortran
44/69 Test #44: test-build-replay_missing_execute_invc ................***Failed   29.77 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (11.9s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/missing_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/missing_execute_invc_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable missing_execute_invc_driver
[4/5] Building CXX object CMakeFiles/missing_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc
1: Test timeout computed to be: 1500
1/4 Test #1: missing_execute_invc_prepare .....
   Passed    0.09 sec
test 2
    Start 2: missing_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc/missing_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/build-replay_missing_execute_invc/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2: 
2: Requested Address: 0x2aaabc000000
2: Acquired Address: 0x3fb7e35000
2: 
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2: 
2: [arch-nspawn-3003276:07986] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:07986] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/4 Test #2: missing_execute_invc_write_out ...***Failed    0.37 sec
[arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:07932] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
  Requested Address: 0x2aaabc000000
  Acquired Address: 0x3fb7e35000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:07986] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:07986] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!

test 3
    Start 3: remove_execute_invc0
Failed test dependencies: missing_execute_invc_write_out
3/4 Test #3: remove_execute_invc0 .............***Not Run   0.00 sec

test 4
    Start 4: missing_execute_invc_read_in
Failed test dependencies: remove_execute_invc0
4/4 Test #4: missing_execute_invc_read_in .....***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   0.46 sec

The following tests FAILED:
	  2 - missing_execute_invc_write_out (Failed)
	  3 - remove_execute_invc0 (Not Run)
	  4 - missing_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 52: test-build-replay_no_data_dump_dir_fortran
45/69 Test #45: test-install-replay_missing_execute_invc ..............***Failed   26.36 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.
  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.

The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Configuring done (10.5s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/5] Building C object CMakeFiles/missing_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/5] Building CXX object CMakeFiles/missing_execute_invc_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_driver.cxx.o
[3/5] Linking CXX executable missing_execute_invc_driver
[4/5] Building CXX object CMakeFiles/missing_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[5/5] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc
1: Test timeout computed to be: 1500
1/4 Test #1: missing_execute_invc_prepare .....
   Passed    0.10 sec
test 2
    Start 2: missing_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc/missing_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/install-replay_missing_execute_invc/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2: 
2: Requested Address: 0x2aaab8000000
2: Acquired Address: 0x3f9e63f000
2: 
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2: 
2: [arch-nspawn-3003276:08719] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:08719] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/4 Test #2: missing_execute_invc_write_out ...***Failed    0.34 sec
[arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:08701] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
Requested Address: 0x2aaab8000000 Acquired Address: 0x3f9e63f000 If this problem persists, please consider disabling the gds/shmem2 component by setting in your environment the following: PMIX_MCA_gds=hash -------------------------------------------------------------------------- [arch-nspawn-3003276:08719] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 *** An error occurred in MPI_Init *** on a NULL communicator *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, *** and MPI will try to terminate your MPI job as well) [arch-nspawn-3003276:08719] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! test 3 Start 3: remove_execute_invc0 Failed test dependencies: missing_execute_invc_write_out 3/4 Test #3: remove_execute_invc0 .............***Not Run 0.00 sec test 4 Start 4: missing_execute_invc_read_in Failed test dependencies: remove_execute_invc0 4/4 Test #4: missing_execute_invc_read_in .....***Not Run 0.00 sec 25% tests passed, 3 tests failed out of 4 Total Test time (real) = 0.44 sec The following tests FAILED: 2 - missing_execute_invc_write_out (Failed) 3 - remove_execute_invc0 (Not Run) 4 - missing_execute_invc_read_in (Not Run) Errors while running CTest Test command failed: /usr/bin/ctest CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message): test or example failed! Start 53: test-install-replay_no_data_dump_dir_fortran 46/69 Test #46: test-build-replay_high_num_ranks_fortran ..............***Failed 41.67 sec Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran ======== CMake output ====== CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required): Compatibility with CMake < 3.10 will be removed from a future version of CMake. 
Update the VERSION argument value. Or, use the ... syntax to tell CMake that the project requires at least but has been updated to work with policies introduced by or earlier. The C compiler identification is GNU 15.2.1 The CXX compiler identification is GNU 15.2.1 Detecting C compiler ABI info Detecting C compiler ABI info - done Check for working C compiler: /usr/bin/cc - skipped Detecting C compile features Detecting C compile features - done Detecting CXX compiler ABI info Detecting CXX compiler ABI info - done Check for working CXX compiler: /usr/bin/c++ - skipped Detecting CXX compile features Detecting CXX compile features - done The Fortran compiler identification is GNU 15.2.1 Detecting Fortran compiler ABI info Detecting Fortran compiler ABI info - done Check for working Fortran compiler: /usr/bin/gfortran - skipped Found MPI_C: /lib/libmpi.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: C Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: Fortran Configuring done (25.4s) Generating done (0.1s) Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran ======== End CMake output ====== Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran' Run Clean Command: /usr/bin/ninja clean [1/1] Cleaning all built files... Cleaning... 0 files. 
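Editor's note: the PMIX gds/shmem2 advisory earlier in the log names its own workaround. A minimal sketch of applying it before re-running the failing test; the mpiexec invocation is commented out because the driver paths only exist inside the build chroot:

```shell
# Disable the gds/shmem2 component, exactly as the PMIX error text suggests.
# PMIX_MCA_gds=hash forces the plain hash GDS component instead of shmem2,
# avoiding the fixed-base-address shared-memory attach that fails here.
export PMIX_MCA_gds=hash

# Illustrative re-run of the failing test (chroot-only paths, hence commented):
# mpiexec -n 1 ./missing_execute_invc_driver ./catalyst

echo "PMIX_MCA_gds=$PMIX_MCA_gds"
```

Whether this fully cures the riscv64 address-space mismatch is untested here; it is the remedy the message itself proposes.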
Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/high_num_ranks_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/high_num_ranks_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/high_num_ranks_adaptor.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/high_num_ranks_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable high_num_ranks_driver
[6/7] Building CXX object CMakeFiles/high_num_ranks_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$<CONFIG>" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_ranks_write_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_ranks_write_prepare .....   Passed    0.08 sec
test 2
    Start 2: high_num_ranks_write_out

2: Test command: /usr/bin/mpiexec "-n" "10" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran/high_num_ranks_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran/data_dump/
2: Test timeout computed to be: 1500
2: --------------------------------------------------------------------------
2: There are not enough slots available in the system to satisfy the 10
2: slots that were requested by the application:
2:
2:   /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_ranks_fortran/high_num_ranks_driver
2:
2: Either request fewer procs for your application, or make more slots
2: available for use.
2:
2: A "slot" is the PRRTE term for an allocatable unit where we can
2: launch a process. The number of slots available are defined by the
2: environment in which PRRTE processes are run:
2:
2:   1. Hostfile, via "slots=N" clauses (N defaults to number of
2:      processor cores if not provided)
2:   2. The --host command line parameter, via a ":N" suffix on the
2:      hostname (N defaults to 1 if not provided)
2:   3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
2:   4. If none of a hostfile, the --host command line parameter, or an
2:      RM is present, PRRTE defaults to the number of processor cores
2:
2: In all the above cases, if you want PRRTE to default to the number
2: of hardware threads instead of the number of processor cores, use the
2: --use-hwthread-cpus option.
2:
2: Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
2: number of available slots when deciding the number of processes to
2: launch.
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_ranks_write_out .........***Failed    0.13 sec
test 3
    Start 3: high_num_ranks_read_in
Failed test dependencies: high_num_ranks_write_out
3/3 Test #3: high_num_ranks_read_in ...........***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   0.21 sec

The following tests FAILED:
          2 - high_num_ranks_write_out (Failed)
          3 - high_num_ranks_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 54: test-build-replay_missing_initialize_data_fortran
47/69 Test #47: test-install-replay_high_num_ranks_fortran ............***Failed   38.87 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
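Editor's note: the slots failure above is item 4 of the PRRTE message in action: the nspawn container exposes fewer than 10 processor cores, so `mpiexec -n 10` is refused. One remedy the message itself names is oversubscription; a sketch, with the chroot-only driver invocation left commented:

```shell
# Let PRRTE ignore the slot count when mapping ranks, per the
# "--map-by :OVERSUBSCRIBE" option named in the error text above.
MPIEXEC_ARGS="--map-by :OVERSUBSCRIBE -n 10"

# Illustrative re-run of the failing test (paths exist only in the chroot):
# mpiexec $MPIEXEC_ARGS ./high_num_ranks_driver ./catalyst

echo "mpiexec $MPIEXEC_ARGS <driver> <catalyst-dir>"
```

For a test suite this would normally be wired into the CMake `MPIEXEC_PREFLAGS` rather than typed by hand, but that wiring is an assumption about the Catalyst build system, not something this log shows.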
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (23.0s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/high_num_ranks_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Building C object CMakeFiles/high_num_ranks_adaptor.dir/catalyst_impl_replay.c.o
[3/7] Generating Fortran dyndep file CMakeFiles/high_num_ranks_driver.dir/Fortran.dd
[4/7] Building Fortran object CMakeFiles/high_num_ranks_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable high_num_ranks_driver
[6/7] Building CXX object CMakeFiles/high_num_ranks_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$<CONFIG>" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_ranks_write_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_ranks_write_prepare .....   Passed    0.06 sec
test 2
    Start 2: high_num_ranks_write_out

2: Test command: /usr/bin/mpiexec "-n" "10" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran/high_num_ranks_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran/data_dump/
2: Test timeout computed to be: 1500
2: --------------------------------------------------------------------------
2: There are not enough slots available in the system to satisfy the 10
2: slots that were requested by the application:
2:
2:   /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_ranks_fortran/high_num_ranks_driver
2:
2: Either request fewer procs for your application, or make more slots
2: available for use.
2:
2: A "slot" is the PRRTE term for an allocatable unit where we can
2: launch a process. The number of slots available are defined by the
2: environment in which PRRTE processes are run:
2:
2:   1. Hostfile, via "slots=N" clauses (N defaults to number of
2:      processor cores if not provided)
2:   2. The --host command line parameter, via a ":N" suffix on the
2:      hostname (N defaults to 1 if not provided)
2:   3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
2:   4. If none of a hostfile, the --host command line parameter, or an
2:      RM is present, PRRTE defaults to the number of processor cores
2:
2: In all the above cases, if you want PRRTE to default to the number
2: of hardware threads instead of the number of processor cores, use the
2: --use-hwthread-cpus option.
2:
2: Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
2: number of available slots when deciding the number of processes to
2: launch.
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_ranks_write_out .........***Failed    0.15 sec
test 3
    Start 3: high_num_ranks_read_in
Failed test dependencies: high_num_ranks_write_out
3/3 Test #3: high_num_ranks_read_in ...........***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   0.22 sec

The following tests FAILED:
          2 - high_num_ranks_write_out (Failed)
          3 - high_num_ranks_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 55: test-install-replay_missing_initialize_data_fortran
48/69 Test #48: test-build-replay_ranks_mismatch_fortran ..............***Failed   39.88 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
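Editor's note: an alternative to oversubscription is option 1 from the same PRRTE message: grant the slots explicitly through a hostfile. A sketch assuming a single-node build container (the mpiexec line is commented out since the driver only exists in the chroot):

```shell
# Grant 10 slots on the local node via a hostfile, per option 1 in the
# PRRTE message ("slots=N" clauses).  Single-node assumption.
cat > hostfile <<'EOF'
localhost slots=10
EOF

# Illustrative re-run of the failing 10-rank test:
# mpiexec --hostfile hostfile -n 10 ./high_num_ranks_driver ./catalyst

cat hostfile
```

Unlike `--map-by :OVERSUBSCRIBE`, this keeps the mapping policy untouched and only raises the slot count, which may be preferable when the test is meant to exercise rank placement.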
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (23.5s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/replay_ranks_mismatch_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/replay_ranks_mismatch_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/replay_ranks_mismatch_adaptor.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/replay_ranks_mismatch_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable replay_ranks_mismatch_driver
[6/7] Building CXX object CMakeFiles/replay_ranks_mismatch_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$<CONFIG>" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: replay_ranks_mismatch_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran
1: Test timeout computed to be: 1500
1/3 Test #1: replay_ranks_mismatch_prepare .....   Passed    0.08 sec
test 2
    Start 2: replay_ranks_mismatch_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran/replay_ranks_mismatch_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran/data_dump/
2: Test timeout computed to be: 1500
2/3 Test #2: replay_ranks_mismatch_write_out ...   Passed    0.29 sec
test 3
    Start 3: replay_ranks_mismatch_read_in

3: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/bin/catalyst_replay" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran/data_dump/"
3: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_ranks_mismatch_fortran
3: Test timeout computed to be: 1500
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
3: --------------------------------------------------------------------------
3: The gds/shmem2 component attempted to attach to a shared-memory segment at a
3: particular base address, but was given a different one. Your job will now likely
3: abort.
3:
3: Requested Address: 0x2aaac8000000
3: Acquired Address:  0x3f7fde4000
3:
3: If this problem persists, please consider disabling the gds/shmem2 component by
3: setting in your environment the following: PMIX_MCA_gds=hash
3: --------------------------------------------------------------------------
3:
3: [arch-nspawn-3003276:09913] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
3: [arch-nspawn-3003276:09906] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
3: [arch-nspawn-3003276:09914] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
3: *** An error occurred in MPI_Init
3: *** on a NULL communicator
3: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
3: ***    and MPI will try to terminate your MPI job as well)
3: [arch-nspawn-3003276:09913] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
3: *** An error occurred in MPI_Init
3: *** on a NULL communicator
3: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
3: ***    and MPI will try to terminate your MPI job as well)
3: [arch-nspawn-3003276:09914] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
3: --------------------------------------------------------------------------
3: prte detected that one or more processes exited with non-zero status,
3: thus causing the job to be terminated. The first process to do so was:
3:
3:   Process name: [prte-arch-nspawn-3003276-9906@1,0]
3:   Exit code:    14
3: --------------------------------------------------------------------------
3/3 Test #3: replay_ranks_mismatch_read_in .....***Failed  Required regular expression not found. Regex=[ERROR: Mismatch in the number of ranks
] 0.65 sec

67% tests passed, 1 tests failed out of 3

Total Test time (real) =   1.03 sec

The following tests FAILED:
          3 - replay_ranks_mismatch_read_in (Failed)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 56: test-build-replay_missing_execute_invc_fortran
49/69 Test #51: test-install-replay_high_num_execute_invc_fortran .....***Failed   38.79 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1 The CXX compiler identification is GNU 15.2.1 Detecting C compiler ABI info Detecting C compiler ABI info - done Check for working C compiler: /usr/bin/cc - skipped Detecting C compile features Detecting C compile features - done Detecting CXX compiler ABI info Detecting CXX compiler ABI info - done Check for working CXX compiler: /usr/bin/c++ - skipped Detecting CXX compile features Detecting CXX compile features - done The Fortran compiler identification is GNU 15.2.1 Detecting Fortran compiler ABI info Detecting Fortran compiler ABI info - done Check for working Fortran compiler: /usr/bin/gfortran - skipped Found MPI_C: /lib/libmpi.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: C Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: Fortran Configuring done (24.8s) Generating done (0.2s) Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran ======== End CMake output ====== Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran' Run Clean Command: /usr/bin/ninja clean [1/1] Cleaning all built files... Cleaning... 0 files. 
Run Build Command(s): /usr/bin/ninja [1/7] Building Fortran preprocessed CMakeFiles/high_num_execute_invc_driver.dir/high_num_execute_invc_driver.F90-pp.f90 [2/7] Generating Fortran dyndep file CMakeFiles/high_num_execute_invc_driver.dir/Fortran.dd [3/7] Building C object CMakeFiles/high_num_execute_invc_adaptor.dir/catalyst_impl_replay.c.o [4/7] Building Fortran object CMakeFiles/high_num_execute_invc_driver.dir/high_num_execute_invc_driver.F90.o [5/7] Linking Fortran executable high_num_execute_invc_driver [6/7] Building CXX object CMakeFiles/high_num_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o [7/7] Linking CXX shared module catalyst/libcatalyst-replay.so Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure" UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran/DartConfiguration.tcl Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran/DartConfiguration.tcl Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran Constructing a list of tests Done constructing a list of tests Updating test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... Checking test dependency graph end test 1 Start 1: high_num_execute_invc_prepare 1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran/data_dump" 1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran 1: Test timeout computed to be: 1500 1/3 Test #1: high_num_execute_invc_prepare ..... 
Passed    0.09 sec
test 2
    Start 2: high_num_execute_invc_write_out
2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran/high_num_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_high_num_execute_invc_fortran/data_dump
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaab8000000
2: Acquired Address: 0x3f9a0ba000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:09929] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: [arch-nspawn-3003276:09927] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:09929] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:    Process name: [prte-arch-nspawn-3003276-9908@1,1]
2:    Exit code:    14
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_execute_invc_write_out ...***Failed    0.71 sec
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
Requested Address: 0x2aaab8000000
Acquired Address: 0x3f9a0ba000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:09929] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:09908] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:09927] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:09929] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-9908@1,1]
   Exit code:    14
--------------------------------------------------------------------------
test 3
    Start 3: high_num_execute_invc_read_in
Failed test dependencies: high_num_execute_invc_write_out
3/3 Test #3: high_num_execute_invc_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   0.81 sec

The following tests FAILED:
	  2 - high_num_execute_invc_write_out (Failed)
	  3 - high_num_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

    Start 57: test-install-replay_missing_execute_invc_fortran
50/69 Test #50: test-build-replay_high_num_execute_invc_fortran .......***Failed   39.41 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (23.8s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/high_num_execute_invc_driver.dir/high_num_execute_invc_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/high_num_execute_invc_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/high_num_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/high_num_execute_invc_driver.dir/high_num_execute_invc_driver.F90.o
[5/7] Linking Fortran executable high_num_execute_invc_driver
[6/7] Building CXX object CMakeFiles/high_num_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so
Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_execute_invc_prepare
1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran/data_dump"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_execute_invc_prepare .....
Passed    0.06 sec
test 2
    Start 2: high_num_execute_invc_write_out
2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran/high_num_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_high_num_execute_invc_fortran/data_dump
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaaac000000
2: Acquired Address: 0x3f9b2dc000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2:
2: [arch-nspawn-3003276:09939] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: [arch-nspawn-3003276:09938] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:09939] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:    Process name: [prte-arch-nspawn-3003276-9926@1,1]
2:    Exit code:    14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERR_UNREACH in file server/pmix_server.c at line 3110
2/3 Test #2: high_num_execute_invc_write_out ...***Failed    0.68 sec
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
Requested Address: 0x2aaaac000000
Acquired Address: 0x3f9b2dc000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:09939] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:09938] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:09939] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-9926@1,1]
   Exit code:    14
--------------------------------------------------------------------------
[arch-nspawn-3003276:09926] PMIX ERROR: PMIX_ERR_UNREACH in file server/pmix_server.c at line 3110
test 3
    Start 3: high_num_execute_invc_read_in
Failed test dependencies: high_num_execute_invc_write_out
3/3 Test #3: high_num_execute_invc_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   0.74 sec

The following tests FAILED:
	  2 - high_num_execute_invc_write_out (Failed)
	  3 - high_num_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

    Start 58: test-build-replay_high_num_ranks_python
51/69 Test #49: test-install-replay_ranks_mismatch_fortran ............***Failed   39.65 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (23.8s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/replay_ranks_mismatch_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Building C object CMakeFiles/replay_ranks_mismatch_adaptor.dir/catalyst_impl_replay.c.o
[3/7] Generating Fortran dyndep file CMakeFiles/replay_ranks_mismatch_driver.dir/Fortran.dd
[4/7] Building Fortran object CMakeFiles/replay_ranks_mismatch_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable replay_ranks_mismatch_driver
[6/7] Building CXX object CMakeFiles/replay_ranks_mismatch_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so
Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: replay_ranks_mismatch_prepare
1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran
1: Test timeout computed to be: 1500
1/3 Test #1: replay_ranks_mismatch_prepare .....   Passed    0.05 sec
test 2
    Start 2: replay_ranks_mismatch_write_out
2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran/replay_ranks_mismatch_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran/data_dump/
2: Test timeout computed to be: 1500
2/3 Test #2: replay_ranks_mismatch_write_out ...
Passed    0.31 sec
test 3
    Start 3: replay_ranks_mismatch_read_in
3: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/bin/catalyst_replay" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran/data_dump/"
3: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_ranks_mismatch_fortran
3: Test timeout computed to be: 1500
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
3: --------------------------------------------------------------------------
3: The gds/shmem2 component attempted to attach to a shared-memory segment at a
3: particular base address, but was given a different one. Your job will now likely
3: abort.
3:
3: Requested Address: 0x2aaac0000000
3: Acquired Address: 0x3f8702e000
3:
3: If this problem persists, please consider disabling the gds/shmem2 component by
3: setting in your environment the following: PMIX_MCA_gds=hash
3: --------------------------------------------------------------------------
3:
3: [arch-nspawn-3003276:09954] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
3: *** An error occurred in MPI_Init
3: --------------------------------------------------------------------------
3: PMIx_Init failed for the following reason:
3:
3:   PMIX_ERROR
3:
3: Open MPI requires access to a local PMIx server to execute.
Please ensure
3: that either you are operating in a PMIx-enabled environment, or use "mpirun"
3: to execute the job.
3: --------------------------------------------------------------------------
3:
3: *** on a NULL communicator
3: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
3: ***    and MPI will try to terminate your MPI job as well)
3: [arch-nspawn-3003276:09954] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
3: [arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
3/3 Test #3: replay_ranks_mismatch_read_in .....***Failed  Required regular expression not found.
Regex=[ERROR: Mismatch in the number of ranks
]  0.55 sec
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaac0000000
Acquired Address: 0x3f8702e000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:09954] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:09954] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:09947] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660

67% tests passed, 1 tests failed out of 3

Total Test time (real) =   0.93 sec

The following tests FAILED:
	  3 - replay_ranks_mismatch_read_in (Failed)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

    Start 59: test-install-replay_high_num_ranks_python
52/69 Test #52: test-build-replay_no_data_dump_dir_fortran ............***Failed   36.67 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.

The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (22.1s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/no_data_dump_dir_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/no_data_dump_dir_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/no_data_dump_dir_adapter.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/no_data_dump_dir_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable no_data_dump_dir_driver
[6/7] Building CXX object CMakeFiles/no_data_dump_dir_adapter.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$<CONFIG>" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: no_write_out

1: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran/no_data_dump_dir_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran/catalyst"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_no_data_dump_dir_fortran
1: Test timeout computed to be: 1500
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
1: --------------------------------------------------------------------------
1: The gds/shmem2 component attempted to attach to a shared-memory segment at a
1: particular base address, but was given a different one. Your job will now likely
1: abort.
1:
1: Requested Address: 0x2aaab4000000
1: Acquired Address: 0x3f8c2d0000
1:
1: If this problem persists, please consider disabling the gds/shmem2 component by
1: setting in your environment the following: PMIX_MCA_gds=hash
1: --------------------------------------------------------------------------
1:
1: [arch-nspawn-3003276:11071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
1: [arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
1: [arch-nspawn-3003276:11070] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
1: *** An error occurred in MPI_Init
1: *** on a NULL communicator
1: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
1: *** and MPI will try to terminate your MPI job as well)
1: [arch-nspawn-3003276:11071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
1: --------------------------------------------------------------------------
1: prte detected that one or more processes exited with non-zero status,
1: thus causing the job to be terminated.
1: The first process to do so was:
1:
1: Process name: [prte-arch-nspawn-3003276-11024@1,1]
1: Exit code: 14
1: --------------------------------------------------------------------------
1/2 Test #1: no_write_out .....................***Failed    0.60 sec
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.
Requested Address: 0x2aaab4000000
Acquired Address: 0x3f8c2d0000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:11071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:11024] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
[arch-nspawn-3003276:11070] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:11071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated.
The first process to do so was:

Process name: [prte-arch-nspawn-3003276-11024@1,1]
Exit code: 14
--------------------------------------------------------------------------

test 2
    Start 2: no_data_dump_dir_read_in
Failed test dependencies: no_write_out
2/2 Test #2: no_data_dump_dir_read_in .........***Not Run   0.00 sec

0% tests passed, 2 tests failed out of 2

Total Test time (real) = 0.61 sec

The following tests FAILED:
    1 - no_write_out (Failed)
    2 - no_data_dump_dir_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 60: test-build-replay_ranks_mismatch_python
53/69 Test #53: test-install-replay_no_data_dump_dir_fortran ..........***Failed   38.00 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (23.7s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/no_data_dump_dir_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/no_data_dump_dir_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/no_data_dump_dir_adapter.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/no_data_dump_dir_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable no_data_dump_dir_driver
[6/7] Building CXX object CMakeFiles/no_data_dump_dir_adapter.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$<CONFIG>" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: no_write_out

1: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran/no_data_dump_dir_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran/catalyst"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_no_data_dump_dir_fortran
1: Test timeout computed to be: 1500
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
1: --------------------------------------------------------------------------
1: The gds/shmem2 component attempted to attach to a shared-memory segment at a
1: particular base address, but was given a different one. Your job will now likely
1: abort.
1:
1: Requested Address: 0x2aaac0000000
1: Acquired Address: 0x3f91c09000
1:
1: If this problem persists, please consider disabling the gds/shmem2 component by
1: setting in your environment the following: PMIX_MCA_gds=hash
1: --------------------------------------------------------------------------
1:
1: [arch-nspawn-3003276:11785] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
1: *** An error occurred in MPI_Init
1: --------------------------------------------------------------------------
1: PMIx_Init failed for the following reason:
1:
1: PMIX_ERROR
1:
1: Open MPI requires access to a local PMIx server to execute. Please ensure
1: that either you are operating in a PMIx-enabled environment, or use "mpirun"
1: to execute the job.
1: --------------------------------------------------------------------------
1:
1: *** on a NULL communicator
1: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
1: *** and MPI will try to terminate your MPI job as well)
1: [arch-nspawn-3003276:11785] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
1: --------------------------------------------------------------------------
1: prte detected that one or more processes exited with non-zero status,
1: thus causing the job to be terminated.
1: The first process to do so was:
1:
1: Process name: [prte-arch-nspawn-3003276-11774@1,0]
1: Exit code: 14
1: --------------------------------------------------------------------------
1: [arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
1/2 Test #1: no_write_out .....................***Failed    0.51 sec
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaac0000000
Acquired Address: 0x3f91c09000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:11785] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:11785] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated.

The first process to do so was:

Process name: [prte-arch-nspawn-3003276-11774@1,0]
Exit code: 14
--------------------------------------------------------------------------
[arch-nspawn-3003276:11774] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103

test 2
    Start 2: no_data_dump_dir_read_in
Failed test dependencies: no_write_out
2/2 Test #2: no_data_dump_dir_read_in .........***Not Run   0.00 sec

0% tests passed, 2 tests failed out of 2

Total Test time (real) = 0.52 sec

The following tests FAILED:
    1 - no_write_out (Failed)
    2 - no_data_dump_dir_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
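The gds/shmem2 failures above are a container/address-space quirk, and the PMIx message itself names the workaround: select the non-shared-memory "hash" gds component. A minimal sketch of applying it before re-running the suite (the commented `ctest` line is illustrative, not taken from this log):

```shell
# Workaround sketch for the repeated gds/shmem2 base-address mismatches above.
# PMIX_MCA_gds=hash is the exact setting suggested by the PMIx error text;
# it disables the shmem2 component in favor of the plain hash store.
export PMIX_MCA_gds=hash
echo "PMIX_MCA_gds=${PMIX_MCA_gds}"
# Then re-run the failing tests in the same environment, e.g.:
# ctest --output-on-failure
```

Whether this clears the PMIX_ERR_OUT_OF_RESOURCE and PMIX_ERR_UNREACH follow-on errors as well would need to be verified on the riscv64 builder.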
        Start 61: test-install-replay_ranks_mismatch_python
54/69 Test #59: test-install-replay_high_num_ranks_python .............***Failed   23.44 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (9.2s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/high_num_ranks_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/high_num_ranks_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$<CONFIG>" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_ranks_write_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_ranks_write_prepare .....
Passed    0.11 sec
test 2
    Start 2: high_num_ranks_write_out

2: Test command: /usr/bin/mpiexec "-n" "10" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_ranks_python/data_dump/
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/root/usr/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: --------------------------------------------------------------------------
2: There are not enough slots available in the system to satisfy the 10
2: slots that were requested by the application:
2:
2:   /usr/bin/python3.14
2:
2: Either request fewer procs for your application, or make more slots
2: available for use.
2:
2: A "slot" is the PRRTE term for an allocatable unit where we can
2: launch a process. The number of slots available are defined by the
2: environment in which PRRTE processes are run:
2:
2:   1. Hostfile, via "slots=N" clauses (N defaults to number of
2:      processor cores if not provided)
2:   2. The --host command line parameter, via a ":N" suffix on the
2:      hostname (N defaults to 1 if not provided)
2:   3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
2:   4. If none of a hostfile, the --host command line parameter, or an
2:      RM is present, PRRTE defaults to the number of processor cores
2:
2: In all the above cases, if you want PRRTE to default to the number
2: of hardware threads instead of the number of processor cores, use the
2: --use-hwthread-cpus option.
2:
2: Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
2: number of available slots when deciding the number of processes to
2: launch.
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_ranks_write_out .........***Failed    0.15 sec
--------------------------------------------------------------------------
There are not enough slots available in the system to satisfy the 10
slots that were requested by the application:

  /usr/bin/python3.14

Either request fewer procs for your application, or make more slots
available for use.

A "slot" is the PRRTE term for an allocatable unit where we can
launch a process. The number of slots available are defined by the
environment in which PRRTE processes are run:

  1. Hostfile, via "slots=N" clauses (N defaults to number of
     processor cores if not provided)
  2. The --host command line parameter, via a ":N" suffix on the
     hostname (N defaults to 1 if not provided)
  3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
  4. If none of a hostfile, the --host command line parameter, or an
     RM is present, PRRTE defaults to the number of processor cores

In all the above cases, if you want PRRTE to default to the number
of hardware threads instead of the number of processor cores, use the
--use-hwthread-cpus option.

Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
number of available slots when deciding the number of processes to
launch.
--------------------------------------------------------------------------
test 3
    Start 3: high_num_ranks_read_in
Failed test dependencies: high_num_ranks_write_out
3/3 Test #3: high_num_ranks_read_in ...........***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) = 0.26 sec

The following tests FAILED:
    2 - high_num_ranks_write_out (Failed)
    3 - high_num_ranks_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 62: test-build-replay_high_num_execute_invc_python
55/69 Test #58: test-build-replay_high_num_ranks_python ...............***Failed   24.04 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (9.2s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/high_num_ranks_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/high_num_ranks_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$<CONFIG>" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_ranks_write_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_ranks_write_prepare .....
Passed    0.09 sec
test 2
    Start 2: high_num_ranks_write_out

2: Test command: /usr/bin/mpiexec "-n" "10" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_ranks_python/data_dump/
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: --------------------------------------------------------------------------
2: There are not enough slots available in the system to satisfy the 10
2: slots that were requested by the application:
2:
2:   /usr/bin/python3.14
2:
2: Either request fewer procs for your application, or make more slots
2: available for use.
2:
2: A "slot" is the PRRTE term for an allocatable unit where we can
2: launch a process. The number of slots available are defined by the
2: environment in which PRRTE processes are run:
2:
2:   1. Hostfile, via "slots=N" clauses (N defaults to number of
2:      processor cores if not provided)
2:   2. The --host command line parameter, via a ":N" suffix on the
2:      hostname (N defaults to 1 if not provided)
2:   3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
2:   4. If none of a hostfile, the --host command line parameter, or an
2:      RM is present, PRRTE defaults to the number of processor cores
2:
2: In all the above cases, if you want PRRTE to default to the number
2: of hardware threads instead of the number of processor cores, use the
2: --use-hwthread-cpus option.
2:
2: Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
2: number of available slots when deciding the number of processes to
2: launch.
2: --------------------------------------------------------------------------
2:
2/3 Test #2: high_num_ranks_write_out .........***Failed    0.18 sec
--------------------------------------------------------------------------
There are not enough slots available in the system to satisfy the 10
slots that were requested by the application:

  /usr/bin/python3.14

Either request fewer procs for your application, or make more slots
available for use.

A "slot" is the PRRTE term for an allocatable unit where we can
launch a process. The number of slots available are defined by the
environment in which PRRTE processes are run:

  1. Hostfile, via "slots=N" clauses (N defaults to number of
     processor cores if not provided)
  2. The --host command line parameter, via a ":N" suffix on the
     hostname (N defaults to 1 if not provided)
  3. Resource manager (e.g., SLURM, PBS/Torque, LSF, etc.)
  4. If none of a hostfile, the --host command line parameter, or an
     RM is present, PRRTE defaults to the number of processor cores

In all the above cases, if you want PRRTE to default to the number
of hardware threads instead of the number of processor cores, use the
--use-hwthread-cpus option.

Alternatively, you can use the --map-by :OVERSUBSCRIBE option to ignore the
number of available slots when deciding the number of processes to
launch.
--------------------------------------------------------------------------
test 3
    Start 3: high_num_ranks_read_in
Failed test dependencies: high_num_ranks_write_out
3/3 Test #3: high_num_ranks_read_in ...........***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) = 0.28 sec

The following tests FAILED:
    2 - high_num_ranks_write_out (Failed)
    3 - high_num_ranks_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 63: test-install-replay_high_num_execute_invc_python
56/69 Test #54: test-build-replay_missing_initialize_data_fortran .....***Failed   37.73 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (21.9s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/missing_initialize_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/missing_initialize_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/missing_initialize_adaptor.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/missing_initialize_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable missing_initialize_driver
[6/7] Building CXX object CMakeFiles/missing_initialize_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_initialize_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran
1: Test timeout computed to be: 1500
1/4 Test #1: missing_initialize_prepare .......   Passed    0.07 sec
test 2
    Start 2: missing_initialize_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran/missing_initialize_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_initialize_data_fortran/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2:   Requested Address: 0x2aaacc000000
2:   Acquired Address: 0x3fb8d6a000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:12715] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2:
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:12715] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:    Process name: [prte-arch-nspawn-3003276-12690@1,0]
2:    Exit code: 14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
2/4 Test #2: missing_initialize_write_out .....***Failed    0.58 sec
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address: 0x2aaacc000000
  Acquired Address: 0x3fb8d6a000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:12715] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:12715] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-12690@1,0]
   Exit code: 14
--------------------------------------------------------------------------
[arch-nspawn-3003276:12690] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
test 3
    Start 3: remove_initialize
Failed test dependencies: missing_initialize_write_out
3/4 Test #3: remove_initialize ................***Not Run   0.00 sec
test 4
    Start 4: missing_initialize_read_in
Failed test dependencies: remove_initialize
4/4 Test #4: missing_initialize_read_in .......***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   0.66 sec

The following tests FAILED:
          2 - missing_initialize_write_out (Failed)
          3 - remove_initialize (Not Run)
          4 - missing_initialize_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
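The PMIx error text above names its own workaround: disable the gds/shmem2 component, whose fixed-base-address shared-memory attach fails inside the nspawn container. A minimal sketch, with the mpiexec re-run left commented out since it needs the test binaries in place:

```shell
# Sketch of the workaround quoted in the error text: fall back to the PMIx
# hash datastore instead of gds/shmem2 before launching under mpiexec.
export PMIX_MCA_gds=hash
# mpiexec -n 2 ./missing_initialize_driver ./catalyst   # illustrative re-run
echo "PMIX_MCA_gds=$PMIX_MCA_gds"
```

Because mpiexec forwards PMIX_MCA_* variables to the daemons it spawns, exporting the variable once in the launching shell should cover every rank.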
        Start 64: test-build-replay_no_data_dump_dir_python
57/69 Test #60: test-build-replay_ranks_mismatch_python ...............***Failed   25.68 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (9.5s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/replay_ranks_mismatch_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/replay_ranks_mismatch_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: replay_ranks_mismatch_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python
1: Test timeout computed to be: 1500
1/3 Test #1: replay_ranks_mismatch_prepare .....   Passed    0.08 sec
test 2
    Start 2: replay_ranks_mismatch_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_ranks_mismatch_python/data_dump/
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2:   Requested Address: 0x2aaac4000000
2:   Acquired Address: 0x3f972ca000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:13073] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2:
2: *** An error occurred in MPI_Init_thread
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:13073] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/3 Test #2: replay_ranks_mismatch_write_out ...***Failed    1.23 sec
[arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:13053] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address: 0x2aaac4000000
  Acquired Address: 0x3f972ca000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:13073] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:13073] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: replay_ranks_mismatch_read_in
Failed test dependencies: replay_ranks_mismatch_write_out
3/3 Test #3: replay_ranks_mismatch_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   1.31 sec

The following tests FAILED:
          2 - replay_ranks_mismatch_write_out (Failed)
          3 - replay_ranks_mismatch_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start 65: test-install-replay_no_data_dump_dir_python
58/69 Test #55: test-install-replay_missing_initialize_data_fortran ...***Failed   36.76 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (21.8s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/missing_initialize_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/missing_initialize_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/missing_initialize_adaptor.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/missing_initialize_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable missing_initialize_driver
[6/7] Building CXX object CMakeFiles/missing_initialize_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_initialize_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran
1: Test timeout computed to be: 1500
1/4 Test #1: missing_initialize_prepare .......   Passed    0.09 sec
test 2
    Start 2: missing_initialize_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran/missing_initialize_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_initialize_data_fortran/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2:   Requested Address: 0x2aaaac000000
2:   Acquired Address: 0x3fbb5ac000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:13130] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2:
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:13130] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:    Process name: [prte-arch-nspawn-3003276-13124@1,0]
2:    Exit code: 14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
2/4 Test #2: missing_initialize_write_out .....***Failed    0.61 sec
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address: 0x2aaaac000000
  Acquired Address: 0x3fbb5ac000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:13130] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:13130] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-13124@1,0]
   Exit code: 14
--------------------------------------------------------------------------
[arch-nspawn-3003276:13124] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
test 3
    Start 3: remove_initialize
Failed test dependencies: missing_initialize_write_out
3/4 Test #3: remove_initialize ................***Not Run   0.00 sec
test 4
    Start 4: missing_initialize_read_in
Failed test dependencies: remove_initialize
4/4 Test #4: missing_initialize_read_in .......***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   0.70 sec

The following tests FAILED:
          2 - missing_initialize_write_out (Failed)
          3 - remove_initialize (Not Run)
          4 - missing_initialize_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
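Every write_out test so far fails the same way, so the workaround could be applied once for the whole suite at the packaging level. A hypothetical PKGBUILD-style check() sketch (not taken from this log; the ctest invocation is echoed rather than executed here):

```shell
# Hypothetical check() fragment: export the PMIx workaround before ctest runs
# so every replay test launched via mpiexec inherits it. The real function
# would run ctest in the build directory instead of echoing.
check() {
  export PMIX_MCA_gds=hash
  echo "ctest --output-on-failure (PMIX_MCA_gds=$PMIX_MCA_gds)"
}
check
```

This leaves the high_num_ranks slot failure untouched; that one would still need the oversubscription flag or more cores on the builder.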
        Start 66: test-build-replay_missing_initialize_data_python
59/69 Test #56: test-build-replay_missing_execute_invc_fortran ........***Failed   36.37 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.

The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (21.5s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/missing_execute_invc_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/missing_execute_invc_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/missing_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/missing_execute_invc_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable missing_execute_invc_driver
[6/7] Building CXX object CMakeFiles/missing_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran
1: Test timeout computed to be: 1500
1/4 Test #1: missing_execute_invc_prepare .....   Passed    0.12 sec
test 2
    Start 2: missing_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran/missing_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/build-replay_missing_execute_invc_fortran/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaab8000000
2: Acquired Address:  0x3f9a02f000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:13220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:13220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/4 Test #2: missing_execute_invc_write_out ...***Failed    0.28 sec
[arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:13213] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaab8000000
Acquired Address:  0x3f9a02f000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:13220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:13220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: remove_execute_invc0
Failed test dependencies: missing_execute_invc_write_out
3/4 Test #3: remove_execute_invc0 .............***Not Run   0.00 sec
test 4
    Start 4: missing_execute_invc_read_in
Failed test dependencies: remove_execute_invc0
4/4 Test #4: missing_execute_invc_read_in .....***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   0.40 sec

The following tests FAILED:
    2 - missing_execute_invc_write_out (Failed)
    3 - remove_execute_invc0 (Not Run)
    4 - missing_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest

CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
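The PMIx error text above names its own workaround: select the hash (heap-backed) GDS component instead of gds/shmem2, whose fixed-address shared-memory attach fails inside this riscv64 nspawn container. A minimal sketch of applying it before re-running the tests (the `ctest` invocation mentioned below is illustrative, not taken from this log):

```shell
# Apply the workaround suggested by the PMIx message: disable gds/shmem2
# by forcing the hash GDS component via the MCA environment variable.
# Child mpiexec/prte processes inherit the setting.
export PMIX_MCA_gds=hash
printenv PMIX_MCA_gds   # → hash
```

With the variable exported, the failing suite could be retried from the test build directory with something like `ctest -R missing_execute_invc --output-on-failure`; whether gds/shmem2 is the only obstacle on this riscv64 container is untested.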
        Start  67: test-install-replay_missing_initialize_data_python
60/69 Test #61: test-install-replay_ranks_mismatch_python .............***Failed   23.46 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (9.5s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/replay_ranks_mismatch_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/replay_ranks_mismatch_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: replay_ranks_mismatch_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python
1: Test timeout computed to be: 1500
1/3 Test #1: replay_ranks_mismatch_prepare .....   Passed    0.11 sec
test 2
    Start 2: replay_ranks_mismatch_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_ranks_mismatch_python/data_dump/
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/root/usr/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaacc000000
2: Acquired Address:  0x3fbc124000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:13230] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init_thread
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:13230] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/3 Test #2: replay_ranks_mismatch_write_out ...***Failed    1.11 sec
[arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:13225] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaacc000000
Acquired Address:  0x3fbc124000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:13230] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:13230] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: replay_ranks_mismatch_read_in
Failed test dependencies: replay_ranks_mismatch_write_out
3/3 Test #3: replay_ranks_mismatch_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   1.23 sec

The following tests FAILED:
    2 - replay_ranks_mismatch_write_out (Failed)
    3 - replay_ranks_mismatch_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest

CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

        Start  68: test-build-replay_missing_execute_invc_python
61/69 Test #57: test-install-replay_missing_execute_invc_fortran ......***Failed   37.51 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran
======== CMake output ======
CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
  Compatibility with CMake < 3.10 will be removed from a future version of CMake.

  Update the VERSION argument <min> value.  Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier.
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
The Fortran compiler identification is GNU 15.2.1
Detecting Fortran compiler ABI info
Detecting Fortran compiler ABI info - done
Check for working Fortran compiler: /usr/bin/gfortran - skipped
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found MPI_Fortran: /lib/libmpi_usempif08.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: Fortran
Configuring done (23.3s)
Generating done (0.2s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/7] Building Fortran preprocessed CMakeFiles/missing_execute_invc_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90-pp.f90
[2/7] Generating Fortran dyndep file CMakeFiles/missing_execute_invc_driver.dir/Fortran.dd
[3/7] Building C object CMakeFiles/missing_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[4/7] Building Fortran object CMakeFiles/missing_execute_invc_driver.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/fortran/common_src_dir/common_replay_driver.F90.o
[5/7] Linking Fortran executable missing_execute_invc_driver
[6/7] Building CXX object CMakeFiles/missing_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[7/7] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran
1: Test timeout computed to be: 1500
1/4 Test #1: missing_execute_invc_prepare .....   Passed    0.10 sec
test 2
    Start 2: missing_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran/missing_execute_invc_driver" "/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran/catalyst"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/fortran/install-replay_missing_execute_invc_fortran/data_dump/
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaab8000000
2: Acquired Address:  0x3fa780e000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:13290] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2:
2: *** An error occurred in MPI_Init
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:13290] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/4 Test #2: missing_execute_invc_write_out ...***Failed    0.33 sec
[arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:13279] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaab8000000
Acquired Address:  0x3fa780e000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:13290] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:13290] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: remove_execute_invc0
Failed test dependencies: missing_execute_invc_write_out
3/4 Test #3: remove_execute_invc0 .............***Not Run   0.00 sec
test 4
    Start 4: missing_execute_invc_read_in
Failed test dependencies: remove_execute_invc0
4/4 Test #4: missing_execute_invc_read_in .....***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   0.43 sec

The following tests FAILED:
    2 - missing_execute_invc_write_out (Failed)
    3 - remove_execute_invc0 (Not Run)
    4 - missing_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest

CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
        Start  69: test-install-replay_missing_execute_invc_python
62/69 Test #62: test-build-replay_high_num_execute_invc_python ........***Failed   23.57 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (8.9s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/high_num_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/high_num_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python/data_dump"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_execute_invc_prepare .....   Passed    0.09 sec
test 2
    Start 2: high_num_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/replay_high_num_execute_invc_python/high_num_execute_invc_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_high_num_execute_invc_python/data_dump
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2: Requested Address: 0x2aaab8000000
2: Acquired Address:  0x3fb3b70000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:14739] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init_thread
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: *** and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:14739] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated. The first process to do so was:
2:
2:    Process name: [prte-arch-nspawn-3003276-14730@1,0]
2:    Exit code:    14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
2/3 Test #2: high_num_execute_invc_write_out ...***Failed    1.62 sec
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

Requested Address: 0x2aaab8000000
Acquired Address:  0x3fb3b70000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:14739] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:14739] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prte-arch-nspawn-3003276-14730@1,0]
   Exit code:    14
--------------------------------------------------------------------------
[arch-nspawn-3003276:14730] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
test 3
    Start 3: high_num_execute_invc_read_in
Failed test dependencies: high_num_execute_invc_write_out
3/3 Test #3: high_num_execute_invc_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   1.72 sec

The following tests FAILED:
    2 - high_num_execute_invc_write_out (Failed)
    3 - high_num_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest

CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
63/69 Test #63: test-install-replay_high_num_execute_invc_python ......***Failed   24.32 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (9.8s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python'
Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.
Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/high_num_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/high_num_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so
Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: high_num_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python/data_dump"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python
1: Test timeout computed to be: 1500
1/3 Test #1: high_num_execute_invc_prepare .....   Passed    0.08 sec
test 2
    Start 2: high_num_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/replay_high_num_execute_invc_python/high_num_execute_invc_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_high_num_execute_invc_python/data_dump
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/root/usr/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2:   Requested Address: 0x2aaac0000000
2:   Acquired Address:  0x3fa0176000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:14810] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init_thread
2: *** on a NULL communicator
2: --------------------------------------------------------------------------
2: PMIx_Init failed for the following reason:
2:
2:   PMIX_ERROR
2:
2: Open MPI requires access to a local PMIx server to execute. Please ensure
2: that either you are operating in a PMIx-enabled environment, or use "mpirun"
2: to execute the job.
2: --------------------------------------------------------------------------
2:
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:14810] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated.
2:
2: The first process to do so was:
2:
2:   Process name: [prte-arch-nspawn-3003276-14805@1,0]
2:   Exit code:    14
2: --------------------------------------------------------------------------
2/3 Test #2: high_num_execute_invc_write_out ...***Failed    1.30 sec
[arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:14805] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address: 0x2aaac0000000
  Acquired Address:  0x3fa0176000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:14810] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
--------------------------------------------------------------------------
PMIx_Init failed for the following reason:

  PMIX_ERROR

Open MPI requires access to a local PMIx server to execute. Please ensure
that either you are operating in a PMIx-enabled environment, or use "mpirun"
to execute the job.
--------------------------------------------------------------------------
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:14810] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated.

The first process to do so was:

  Process name: [prte-arch-nspawn-3003276-14805@1,0]
  Exit code:    14
--------------------------------------------------------------------------
test 3
    Start 3: high_num_execute_invc_read_in
Failed test dependencies: high_num_execute_invc_write_out
3/3 Test #3: high_num_execute_invc_read_in .....***Not Run   0.00 sec

33% tests passed, 2 tests failed out of 3

Total Test time (real) =   1.38 sec

The following tests FAILED:
    2 - high_num_execute_invc_write_out (Failed)
    3 - high_num_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
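Every MPI-driven replay test above dies the same way before `MPI_Init_thread` completes: the PMIx `gds/shmem2` component asks for a fixed shared-memory base address (here `0x2aaac0000000`), which likely lies above the sv39 user address space typical on riscv64, so the kernel maps the segment elsewhere (`0x3fa0176000`) and PMIx aborts. The diagnostic itself names the workaround. A minimal sketch of applying it before re-running a failing driver — only `PMIX_MCA_gds=hash` comes from the log, the re-run command in the comment is illustrative:

```shell
# Force PMIx to use the hash GDS component instead of gds/shmem2, exactly as
# the error message suggests, so no fixed base address is ever requested.
export PMIX_MCA_gds=hash

# Illustrative re-run of the failing driver from the log, e.g.:
#   mpiexec -n 2 /usr/bin/python3.14 high_num_execute_invc_driver.py \
#       .../install-replay_high_num_execute_invc_python/catalyst --use-mpi
echo "PMIX_MCA_gds is set to: $PMIX_MCA_gds"
```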
64/69 Test #64: test-build-replay_no_data_dump_dir_python .............***Failed 22.83 sec Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python ======== CMake output ====== The C compiler identification is GNU 15.2.1 The CXX compiler identification is GNU 15.2.1 Detecting C compiler ABI info Detecting C compiler ABI info - done Check for working C compiler: /usr/bin/cc - skipped Detecting C compile features Detecting C compile features - done Detecting CXX compiler ABI info Detecting CXX compiler ABI info - done Check for working CXX compiler: /usr/bin/c++ - skipped Detecting CXX compile features Detecting CXX compile features - done Found MPI_C: /lib/libmpi.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: C Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter Configuring done (9.9s) Generating done (0.1s) Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python ======== End CMake output ====== Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python' Run Clean Command: /usr/bin/ninja clean [1/1] Cleaning all built files... Cleaning... 0 files. 
Run Build Command(s): /usr/bin/ninja [1/3] Building C object CMakeFiles/no_data_dump_dir_adapter.dir/catalyst_impl_replay.c.o [2/3] Building CXX object CMakeFiles/no_data_dump_dir_adapter.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o [3/3] Linking CXX shared module catalyst/libcatalyst-replay.so Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure" UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python/DartConfiguration.tcl Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python/DartConfiguration.tcl Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python Constructing a list of tests Done constructing a list of tests Updating test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... 
Checking test dependency graph end test 1 Start 1: no_write_out 1: Test command: /usr/bin/mpiexec "-n" "2" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python/catalyst" "--use-mpi" 1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_no_data_dump_dir_python 1: Environment variable modifications: 1: PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages 1: Test timeout computed to be: 1500 1: [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 1: [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 1: [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 1: [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 1: [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 1: [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 1: [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 1: -------------------------------------------------------------------------- 1: The gds/shmem2 component attempted to attach to a shared-memory segment at a 1: particular base address, but was given a different one. Your job will now likely 1: abort. 
1: 1: Requested Address: 0x2aaab0000000 1: Acquired Address: 0x3f87d8b000 1: 1: If this problem persists, please consider disabling the gds/shmem2 component by 1: setting in your environment the following: PMIX_MCA_gds=hash 1: -------------------------------------------------------------------------- 1: 1: [arch-nspawn-3003276:14830] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 1: *** An error occurred in MPI_Init_thread 1: *** on a NULL communicator 1: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, 1: *** and MPI will try to terminate your MPI job as well) 1: [arch-nspawn-3003276:14830] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 1: -------------------------------------------------------------------------- 1: prte detected that one or more processes exited with non-zero status, 1: thus causing the job to be terminated. 
The first process to do so was: 1: 1: Process name: [prte-arch-nspawn-3003276-14827@1,0] 1: Exit code: 14 1: -------------------------------------------------------------------------- 1/2 Test #1: no_write_out .....................***Failed 1.18 sec [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 [arch-nspawn-3003276:14827] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 -------------------------------------------------------------------------- The gds/shmem2 component attempted to attach to a shared-memory segment at a particular base address, but was given a different one. Your job will now likely abort. Requested Address: 0x2aaab0000000 Acquired Address: 0x3f87d8b000 If this problem persists, please consider disabling the gds/shmem2 component by setting in your environment the following: PMIX_MCA_gds=hash -------------------------------------------------------------------------- [arch-nspawn-3003276:14830] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 *** An error occurred in MPI_Init_thread *** on a NULL communicator *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, *** and MPI will try to terminate your MPI job as well) [arch-nspawn-3003276:14830] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
-------------------------------------------------------------------------- prte detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [prte-arch-nspawn-3003276-14827@1,0] Exit code: 14 -------------------------------------------------------------------------- test 2 Start 2: no_data_dump_dir_read_in Failed test dependencies: no_write_out 2/2 Test #2: no_data_dump_dir_read_in .........***Not Run 0.00 sec 0% tests passed, 2 tests failed out of 2 Total Test time (real) = 1.19 sec The following tests FAILED: 1 - no_write_out (Failed) 2 - no_data_dump_dir_read_in (Not Run) Errors while running CTest Test command failed: /usr/bin/ctest CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message): test or example failed! 65/69 Test #65: test-install-replay_no_data_dump_dir_python ...........***Failed 21.37 sec Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python ======== CMake output ====== The C compiler identification is GNU 15.2.1 The CXX compiler identification is GNU 15.2.1 Detecting C compiler ABI info Detecting C compiler ABI info - done Check for working C compiler: /usr/bin/cc - skipped Detecting C compile features Detecting C compile features - done Detecting CXX compiler ABI info Detecting CXX compiler ABI info - done Check for working CXX compiler: /usr/bin/c++ - skipped Detecting CXX compile features Detecting CXX compile features - done Found MPI_C: /lib/libmpi.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: C Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter Configuring done (9.1s) Generating done (0.1s) Build files have been 
written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python ======== End CMake output ====== Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python' Run Clean Command: /usr/bin/ninja clean [1/1] Cleaning all built files... Cleaning... 0 files. Run Build Command(s): /usr/bin/ninja [1/3] Building C object CMakeFiles/no_data_dump_dir_adapter.dir/catalyst_impl_replay.c.o [2/3] Building CXX object CMakeFiles/no_data_dump_dir_adapter.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o [3/3] Linking CXX shared module catalyst/libcatalyst-replay.so Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure" UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python/DartConfiguration.tcl Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python/DartConfiguration.tcl Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python Constructing a list of tests Done constructing a list of tests Updating test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... 
Checking test dependency graph end test 1 Start 1: no_write_out 1: Test command: /usr/bin/mpiexec "-n" "2" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python/catalyst" "--use-mpi" 1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_no_data_dump_dir_python 1: Environment variable modifications: 1: PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/root/usr/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages 1: Test timeout computed to be: 1500 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 1: -------------------------------------------------------------------------- 1: The gds/shmem2 component attempted to attach to a shared-memory segment at a 1: particular base address, but was given a different one. Your job will now likely 1: abort. 
1: 1: Requested Address: 0x2aaac0000000 1: Acquired Address: 0x3fbc9fb000 1: 1: If this problem persists, please consider disabling the gds/shmem2 component by 1: setting in your environment the following: PMIX_MCA_gds=hash 1: -------------------------------------------------------------------------- 1: 1: [arch-nspawn-3003276:14845] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 1: *** An error occurred in MPI_Init_thread 1: *** on a NULL communicator 1: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, 1: *** and MPI will try to terminate your MPI job as well) 1: [arch-nspawn-3003276:14845] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 1: -------------------------------------------------------------------------- 1: prte detected that one or more processes exited with non-zero status, 1: thus causing the job to be terminated. 
The first process to do so was: 1: 1: Process name: [prte-arch-nspawn-3003276-14842@1,0] 1: Exit code: 14 1: -------------------------------------------------------------------------- 1: [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103 1/2 Test #1: no_write_out .....................***Failed 1.39 sec [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 -------------------------------------------------------------------------- The gds/shmem2 component attempted to attach to a shared-memory segment at a particular base address, but was given a different one. Your job will now likely abort. Requested Address: 0x2aaac0000000 Acquired Address: 0x3fbc9fb000 If this problem persists, please consider disabling the gds/shmem2 component by setting in your environment the following: PMIX_MCA_gds=hash -------------------------------------------------------------------------- [arch-nspawn-3003276:14845] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 *** An error occurred in MPI_Init_thread *** on a NULL communicator *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, *** and MPI will try to terminate your MPI job as well) [arch-nspawn-3003276:14845] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
-------------------------------------------------------------------------- prte detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [prte-arch-nspawn-3003276-14842@1,0] Exit code: 14 -------------------------------------------------------------------------- [arch-nspawn-3003276:14842] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103 test 2 Start 2: no_data_dump_dir_read_in Failed test dependencies: no_write_out 2/2 Test #2: no_data_dump_dir_read_in .........***Not Run 0.00 sec 0% tests passed, 2 tests failed out of 2 Total Test time (real) = 1.39 sec The following tests FAILED: 1 - no_write_out (Failed) 2 - no_data_dump_dir_read_in (Not Run) Errors while running CTest Test command failed: /usr/bin/ctest CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message): test or example failed! 66/69 Test #66: test-build-replay_missing_initialize_data_python ......***Failed 21.22 sec Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python ======== CMake output ====== The C compiler identification is GNU 15.2.1 The CXX compiler identification is GNU 15.2.1 Detecting C compiler ABI info Detecting C compiler ABI info - done Check for working C compiler: /usr/bin/cc - skipped Detecting C compile features Detecting C compile features - done Detecting CXX compiler ABI info Detecting CXX compiler ABI info - done Check for working CXX compiler: /usr/bin/c++ - skipped Detecting CXX compile features Detecting CXX compile features - done Found MPI_C: /lib/libmpi.so (found version "3.1") Found MPI: TRUE (found version "3.1") found components: C Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed Found Python3: /usr/bin/python3.14 (found 
version "3.14.3") found components: Interpreter Configuring done (9.5s) Generating done (0.1s) Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python ======== End CMake output ====== Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python' Run Clean Command: /usr/bin/ninja clean [1/1] Cleaning all built files... Cleaning... 0 files. Run Build Command(s): /usr/bin/ninja [1/3] Building C object CMakeFiles/missing_initialize_adaptor.dir/catalyst_impl_replay.c.o [2/3] Building CXX object CMakeFiles/missing_initialize_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o [3/3] Linking CXX shared module catalyst/libcatalyst-replay.so Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure" UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python/DartConfiguration.tcl Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python/DartConfiguration.tcl Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python Constructing a list of tests Done constructing a list of tests Updating test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... Checking test dependency graph end test 1 Start 1: missing_initialize_prepare 1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python/data_dump/" 1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python 1: Test timeout computed to be: 1500 1/4 Test #1: missing_initialize_prepare ....... 
Passed 0.08 sec test 2 Start 2: missing_initialize_write_out 2: Test command: /usr/bin/mpiexec "-n" "2" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python/catalyst" "--use-mpi" 2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python 2: Environment variables: 2: CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_initialize_data_python/data_dump/ 2: Environment variable modifications: 2: PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages 2: Test timeout computed to be: 1500 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 2: -------------------------------------------------------------------------- 2: The gds/shmem2 component attempted to attach to a shared-memory segment at a 2: particular base address, but was given a different one. Your job will now likely 2: abort. 
2: 2: Requested Address: 0x2aaabc000000 2: Acquired Address: 0x3f93aed000 2: 2: If this problem persists, please consider disabling the gds/shmem2 component by 2: setting in your environment the following: PMIX_MCA_gds=hash 2: -------------------------------------------------------------------------- 2: 2: [arch-nspawn-3003276:14852] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 2: [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 2: [arch-nspawn-3003276:14853] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 2: *** An error occurred in MPI_Init_thread 2: *** on a NULL communicator 2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, 2: *** and MPI will try to terminate your MPI job as well) 2: [arch-nspawn-3003276:14852] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 2: -------------------------------------------------------------------------- 2: prte detected that one or more processes exited with non-zero status, 2: thus causing the job to be terminated. 
The first process to do so was: 2: 2: Process name: [prte-arch-nspawn-3003276-14849@1,0] 2: Exit code: 14 2: -------------------------------------------------------------------------- 2/4 Test #2: missing_initialize_write_out .....***Failed 1.27 sec [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 -------------------------------------------------------------------------- The gds/shmem2 component attempted to attach to a shared-memory segment at a particular base address, but was given a different one. Your job will now likely abort. 
Requested Address: 0x2aaabc000000 Acquired Address: 0x3f93aed000 If this problem persists, please consider disabling the gds/shmem2 component by setting in your environment the following: PMIX_MCA_gds=hash -------------------------------------------------------------------------- [arch-nspawn-3003276:14852] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476 [arch-nspawn-3003276:14849] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660 [arch-nspawn-3003276:14853] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278 *** An error occurred in MPI_Init_thread *** on a NULL communicator *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, *** and MPI will try to terminate your MPI job as well) [arch-nspawn-3003276:14852] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! -------------------------------------------------------------------------- prte detected that one or more processes exited with non-zero status, thus causing the job to be terminated. 
The first process to do so was:

  Process name: [prte-arch-nspawn-3003276-14849@1,0]
  Exit code: 14
--------------------------------------------------------------------------
test 3
    Start 3: remove_initialize
Failed test dependencies: missing_initialize_write_out
3/4 Test #3: remove_initialize ................***Not Run   0.00 sec
test 4
    Start 4: missing_initialize_read_in
Failed test dependencies: remove_initialize
4/4 Test #4: missing_initialize_read_in .......***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   1.36 sec

The following tests FAILED:
	  2 - missing_initialize_write_out (Failed)
	  3 - remove_initialize (Not Run)
	  4 - missing_initialize_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!

67/69 Test #69: test-install-replay_missing_execute_invc_python .......***Failed   19.24 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (9.4s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/missing_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/missing_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python
1: Test timeout computed to be: 1500
1/4 Test #1: missing_execute_invc_prepare .....   Passed    0.04 sec
test 2
    Start 2: missing_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_execute_invc_python/data_dump/
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/root/usr/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2:   Requested Address:  0x2aaab0000000
2:   Acquired Address:   0x3fb8cb9000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:14867] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init_thread
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:14867] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/4 Test #2: missing_execute_invc_write_out ...***Failed    0.96 sec
[arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:14864] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address:  0x2aaab0000000
  Acquired Address:   0x3fb8cb9000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:14867] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:14867] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: remove_execute_invc0
Failed test dependencies: missing_execute_invc_write_out
3/4 Test #3: remove_execute_invc0 .............***Not Run   0.00 sec
test 4
    Start 4: missing_execute_invc_read_in
Failed test dependencies: remove_execute_invc0
4/4 Test #4: missing_execute_invc_read_in .....***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   1.01 sec

The following tests FAILED:
	  2 - missing_execute_invc_write_out (Failed)
	  3 - remove_execute_invc0 (Not Run)
	  4 - missing_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
68/69 Test #67: test-install-replay_missing_initialize_data_python ....***Failed   20.92 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (8.9s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/missing_initialize_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/missing_initialize_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_initialize_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python
1: Test timeout computed to be: 1500
1/4 Test #1: missing_initialize_prepare .......   Passed    0.05 sec
test 2
    Start 2: missing_initialize_write_out

2: Test command: /usr/bin/mpiexec "-n" "2" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/install-replay_missing_initialize_data_python/data_dump/
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/root/usr/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2:   Requested Address:  0x2aaac0000000
2:   Acquired Address:   0x3fbb1da000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:14878] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init_thread
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:14878] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2: --------------------------------------------------------------------------
2: prte detected that one or more processes exited with non-zero status,
2: thus causing the job to be terminated.
2: The first process to do so was:
2:
2:   Process name: [prte-arch-nspawn-3003276-14874@1,1]
2:   Exit code: 14
2: --------------------------------------------------------------------------
2: [arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
2/4 Test #2: missing_initialize_write_out .....***Failed    0.96 sec
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address:  0x2aaac0000000
  Acquired Address:   0x3fbb1da000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:14878] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:14878] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
prte detected that one or more processes exited with non-zero status,
thus causing the job to be terminated.
The first process to do so was:

  Process name: [prte-arch-nspawn-3003276-14874@1,1]
  Exit code: 14
--------------------------------------------------------------------------
[arch-nspawn-3003276:14874] PMIX ERROR: PMIX_ERR_UNREACH in file base/ptl_base_connection_hdlr.c at line 103
test 3
    Start 3: remove_initialize
Failed test dependencies: missing_initialize_write_out
3/4 Test #3: remove_initialize ................***Not Run   0.00 sec
test 4
    Start 4: missing_initialize_read_in
Failed test dependencies: remove_initialize
4/4 Test #4: missing_initialize_read_in .......***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   1.01 sec

The following tests FAILED:
	  2 - missing_initialize_write_out (Failed)
	  3 - remove_initialize (Not Run)
	  4 - missing_initialize_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
69/69 Test #68: test-build-replay_missing_execute_invc_python .........***Failed   20.21 sec
Internal cmake changing into directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python
======== CMake output ======
The C compiler identification is GNU 15.2.1
The CXX compiler identification is GNU 15.2.1
Detecting C compiler ABI info
Detecting C compiler ABI info - done
Check for working C compiler: /usr/bin/cc - skipped
Detecting C compile features
Detecting C compile features - done
Detecting CXX compiler ABI info
Detecting CXX compiler ABI info - done
Check for working CXX compiler: /usr/bin/c++ - skipped
Detecting CXX compile features
Detecting CXX compile features - done
Found MPI_C: /lib/libmpi.so (found version "3.1")
Found MPI: TRUE (found version "3.1") found components: C
Found Python3: /usr/include/python3.14 (found version "3.14.3") found components: Development Development.Module Development.Embed
Found Python3: /usr/bin/python3.14 (found version "3.14.3") found components: Interpreter
Configuring done (9.2s)
Generating done (0.1s)
Build files have been written to: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python
======== End CMake output ======
Change Dir: '/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python'

Run Clean Command: /usr/bin/ninja clean
[1/1] Cleaning all built files...
Cleaning... 0 files.

Run Build Command(s): /usr/bin/ninja
[1/3] Building C object CMakeFiles/missing_execute_invc_adaptor.dir/catalyst_impl_replay.c.o
[2/3] Building CXX object CMakeFiles/missing_execute_invc_adaptor.dir/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/common_src_dir/common_replay_adaptor.cxx.o
[3/3] Linking CXX shared module catalyst/libcatalyst-replay.so

Running test command: "/usr/bin/ctest" "-C" "$" "-V" "--output-on-failure"
UpdateCTestConfiguration from :/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python/DartConfiguration.tcl
Parse Config file:/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python/DartConfiguration.tcl
Test project /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 1
    Start 1: missing_execute_invc_prepare

1: Test command: /usr/bin/cmake "-E" "rm" "-rf" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python/data_dump/"
1: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python
1: Test timeout computed to be: 1500
1/4 Test #1: missing_execute_invc_prepare .....   Passed    0.03 sec
test 2
    Start 2: missing_execute_invc_write_out

2: Test command: /usr/bin/mpiexec "-n" "1" "/usr/bin/python3.14" "/build/paraview-catalyst/src/catalyst-v2.1.0/tests/replay_tests/python/common_src_dir/common_replay_driver.py" "/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python/catalyst" "--use-mpi"
2: Working Directory: /build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python
2: Environment variables:
2:  CATALYST_DATA_DUMP_DIRECTORY=/build/paraview-catalyst/src/build/tests/replay_tests/python/build-replay_missing_execute_invc_python/data_dump/
2: Environment variable modifications:
2:  PYTHONPATH=path_list_prepend:/build/paraview-catalyst/src/build/lib/cmake/catalyst-2.1/../../../lib/python3.14/site-packages
2: Test timeout computed to be: 1500
2: [arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
2: [arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
2: [arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
2: [arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
2: [arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
2: [arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
2: [arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
2: --------------------------------------------------------------------------
2: The gds/shmem2 component attempted to attach to a shared-memory segment at a
2: particular base address, but was given a different one. Your job will now likely
2: abort.
2:
2:   Requested Address:  0x2aaab8000000
2:   Acquired Address:   0x3fb5119000
2:
2: If this problem persists, please consider disabling the gds/shmem2 component by
2: setting in your environment the following: PMIX_MCA_gds=hash
2: --------------------------------------------------------------------------
2:
2: [arch-nspawn-3003276:14890] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
2: *** An error occurred in MPI_Init_thread
2: *** on a NULL communicator
2: *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
2: ***    and MPI will try to terminate your MPI job as well)
2: [arch-nspawn-3003276:14890] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
2/4 Test #2: missing_execute_invc_write_out ...***Failed    0.66 sec
[arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1056
[arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1231
[arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 1353
[arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2405
[arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2460
[arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file gds_shmem2.c at line 2476
[arch-nspawn-3003276:14887] PMIX ERROR: PMIX_ERROR in file server/pmix_server.c at line 4660
--------------------------------------------------------------------------
The gds/shmem2 component attempted to attach to a shared-memory segment at a
particular base address, but was given a different one. Your job will now likely
abort.

  Requested Address:  0x2aaab8000000
  Acquired Address:   0x3fb5119000

If this problem persists, please consider disabling the gds/shmem2 component by
setting in your environment the following: PMIX_MCA_gds=hash
--------------------------------------------------------------------------
[arch-nspawn-3003276:14890] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file client/pmix_client.c at line 278
*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and MPI will try to terminate your MPI job as well)
[arch-nspawn-3003276:14890] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
test 3
    Start 3: remove_execute_invc0
Failed test dependencies: missing_execute_invc_write_out
3/4 Test #3: remove_execute_invc0 .............***Not Run   0.00 sec
test 4
    Start 4: missing_execute_invc_read_in
Failed test dependencies: remove_execute_invc0
4/4 Test #4: missing_execute_invc_read_in .....***Not Run   0.00 sec

25% tests passed, 3 tests failed out of 4

Total Test time (real) =   0.70 sec

The following tests FAILED:
	  2 - missing_execute_invc_write_out (Failed)
	  3 - remove_execute_invc0 (Not Run)
	  4 - missing_execute_invc_read_in (Not Run)
Errors while running CTest
Test command failed: /usr/bin/ctest
CMake Error at /build/paraview-catalyst/src/catalyst-v2.1.0/cmake/catalyst-test-path-setting.cmake:41 (message):
  test or example failed!
43% tests passed, 39 tests failed out of 69

Label Time Summary:
Python    =   4.15 sec*proc (3 tests)

Total Test time (real) = 161.43 sec

The following tests FAILED:
	  7 - example-build-replay (Failed)
	  8 - example-install-replay (Failed)
	 31 - t_conduit_node_set (Failed)
	 34 - test-build-replay_high_num_ranks (Failed)
	 35 - test-install-replay_high_num_ranks (Failed)
	 36 - test-build-replay_ranks_mismatch (Failed)
	 37 - test-install-replay_ranks_mismatch (Failed)
	 38 - test-build-replay_high_num_execute_invc (Failed)
	 39 - test-install-replay_high_num_execute_invc (Failed)
	 40 - test-build-replay_no_data_dump_dir (Failed)
	 41 - test-install-replay_no_data_dump_dir (Failed)
	 42 - test-build-replay_missing_initialize_data (Failed)
	 43 - test-install-replay_missing_initialize_data (Failed)
	 44 - test-build-replay_missing_execute_invc (Failed)
	 45 - test-install-replay_missing_execute_invc (Failed)
	 46 - test-build-replay_high_num_ranks_fortran (Failed)
	 47 - test-install-replay_high_num_ranks_fortran (Failed)
	 48 - test-build-replay_ranks_mismatch_fortran (Failed)
	 49 - test-install-replay_ranks_mismatch_fortran (Failed)
	 50 - test-build-replay_high_num_execute_invc_fortran (Failed)
	 51 - test-install-replay_high_num_execute_invc_fortran (Failed)
	 52 - test-build-replay_no_data_dump_dir_fortran (Failed)
	 53 - test-install-replay_no_data_dump_dir_fortran (Failed)
	 54 - test-build-replay_missing_initialize_data_fortran (Failed)
	 55 - test-install-replay_missing_initialize_data_fortran (Failed)
	 56 - test-build-replay_missing_execute_invc_fortran (Failed)
	 57 - test-install-replay_missing_execute_invc_fortran (Failed)
	 58 - test-build-replay_high_num_ranks_python (Failed)
	 59 - test-install-replay_high_num_ranks_python (Failed)
	 60 - test-build-replay_ranks_mismatch_python (Failed)
	 61 - test-install-replay_ranks_mismatch_python (Failed)
	 62 - test-build-replay_high_num_execute_invc_python (Failed)
	 63 - test-install-replay_high_num_execute_invc_python (Failed)
	 64 - test-build-replay_no_data_dump_dir_python (Failed)
	 65 - test-install-replay_no_data_dump_dir_python (Failed)
	 66 - test-build-replay_missing_initialize_data_python (Failed)
	 67 - test-install-replay_missing_initialize_data_python (Failed)
	 68 - test-build-replay_missing_execute_invc_python (Failed)
	 69 - test-install-replay_missing_execute_invc_python (Failed)
Errors while running CTest
==> ERROR: A failure occurred in check().
    Aborting...
==> ERROR: Build failed, check /var/lib/archbuild/extra-riscv64/felix-0/build
receiving incremental file list
paraview-catalyst-2.1.0-1-riscv64-build.log
paraview-catalyst-2.1.0-1-riscv64-check.log
paraview-catalyst-2.1.0-1-riscv64-prepare.log

sent 81 bytes  received 23,248 bytes  15,552.67 bytes/sec
total size is 391,698  speedup is 16.79
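Note on the failures above: every failing replay test aborts in the same place, the PMIx gds/shmem2 component's fixed-base shared-memory attach (requested addresses like 0x2aaab8000000 need more than 39 bits of virtual address space, while the kernel hands back sub-39-bit addresses, which suggests an sv39 riscv64 kernel). The error text itself names the workaround. A minimal sketch of applying it, assuming the override is exported before ctest runs (placing it in the PKGBUILD's check() function is a hypothetical suggestion, not something this log confirms was done):

```shell
# Force PMIx to use the hash GDS component instead of gds/shmem2, as the
# error message above suggests. The hash component keeps job data in a
# process-local table rather than attaching a shared-memory segment at a
# fixed base address, so the failing attach path is never exercised.
export PMIX_MCA_gds=hash

# Hypothetical re-run of the failing test suite with the override active:
#   /usr/bin/ctest --output-on-failure
echo "PMIX_MCA_gds=${PMIX_MCA_gds}"
```

Whether this is an acceptable workaround for the package (versus fixing shmem2's address selection for 39-bit VA hosts in openpmix) is a separate question; the override only affects how PMIx stores job metadata, not test semantics.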