From 71d0cc1cbc110bda50ab0fdbe66b88ab7216c92f Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 11 Mar 2026 14:38:11 +1100 Subject: [PATCH 01/26] Added pixi env and documentation for Kaiju --- docs/developer/guides/kaiju-cluster-setup.md | 219 +++++++++++++++++++ docs/developer/index.md | 1 + petsc-custom/build-petsc-kaiju.sh | 178 +++++++++++++++ pixi.toml | 16 ++ 4 files changed, 414 insertions(+) create mode 100644 docs/developer/guides/kaiju-cluster-setup.md create mode 100644 petsc-custom/build-petsc-kaiju.sh diff --git a/docs/developer/guides/kaiju-cluster-setup.md b/docs/developer/guides/kaiju-cluster-setup.md new file mode 100644 index 00000000..6d9c34ab --- /dev/null +++ b/docs/developer/guides/kaiju-cluster-setup.md @@ -0,0 +1,219 @@ +# Kaiju Cluster Setup + +This guide covers installing and running Underworld3 on the **Kaiju** cluster — a Rocky Linux 8.10 HPC system using Spack for module management and Slurm for job scheduling. + +Python packages are managed by **pixi** (the same tool used for local development). MPI-dependent packages — `mpi4py`, PETSc+AMR tools, `petsc4py`, and `h5py` — are built from source against Spack's OpenMPI to ensure compatibility with Slurm's parallel interconnect. + +--- + +## Hardware Overview + +| Resource | Specification | +|----------|--------------| +| Head node | 1× Intel Xeon Silver 4210R, 40 CPUs @ 2.4 GHz | +| Compute nodes | 8× Intel Xeon Gold 6230R, 104 CPUs @ 2.1 GHz each | +| Shared storage | `/opt/cluster` via NFS (cluster-wide) | +| Scheduler | Slurm with Munge authentication | + +--- + +## Why pixi + spack? + +Pixi manages the Python environment consistently with the developer's local machine (same `pixi.toml`, same package versions). Spack provides the cluster's OpenMPI, which is what Slurm uses for inter-node communication. + +The key constraint is that **anything linked against MPI must use the same MPI as Slurm**. 
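A quick way to confirm which MPI the Python stack is actually linked against (a hedged check, assuming an activated environment with a working `mpi4py`; the exact version string is cluster-specific):

```bash
# Print the MPI library version mpi4py was linked against.
# On Kaiju this should report Open MPI (4.1.6), not MPICH.
python3 -c "from mpi4py import MPI; print(MPI.Get_library_version())"
```
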
This means `mpi4py`, `h5py`, PETSc, and `petsc4py` are built from source against Spack's OpenMPI — not from conda-forge (which bundles MPICH). + +``` +pixi kaiju env → Python 3.12, sympy, scipy, pint, pydantic, ... (conda-forge, no MPI) +spack → openmpi@4.1.6 (cluster MPI) +source build → mpi4py, PETSc+AMR+petsc4py, h5py (linked to spack MPI) +``` + +--- + +## Prerequisites + +Spack must have OpenMPI available: + +```bash +spack find openmpi +# openmpi@4.1.6 +``` + +Pixi must be installed in your user space (no root needed): + +```bash +# Check if already installed +pixi --version + +# Install if missing +curl -fsSL https://pixi.sh/install.sh | bash +``` + +--- + +## Installation + +Use the install script at `kaiju-admin-notes/uw3_install_kaiju_amr.sh`. + +### Step 1: Edit configuration + +Open the script and set the variables at the top: + +```bash +SPACK_MPI_VERSION="openmpi@4.1.6" # Spack MPI module to load +INSTALL_PATH="${HOME}/uw3-installation" # Root directory for everything +UW3_BRANCH="development" # UW3 git branch +``` + +### Step 2: Run the full install + +```bash +source uw3_install_kaiju_amr.sh install +``` + +This runs the following steps in order: + +| Step | Function | Time | +|------|----------|------| +| Install pixi | `setup_pixi` | ~1 min | +| Clone Underworld3 | `clone_uw3` | ~1 min | +| Install pixi kaiju env | `install_pixi_env` | ~3 min | +| Build mpi4py from source | `install_mpi4py` | ~2 min | +| Build PETSc + AMR tools | `install_petsc` | ~1 hour | +| Build MPI-enabled h5py | `install_h5py` | ~2 min | +| Install Underworld3 | `install_uw3` | ~2 min | +| Verify | `verify_install` | ~1 min | + +You can also run individual steps after sourcing: + +```bash +source uw3_install_kaiju_amr.sh +install_petsc # run just one step +``` + +### What PETSc builds + +PETSc is compiled from source (`petsc-custom/build-petsc-kaiju.sh`) with: + +- **AMR tools**: mmg, parmmg, pragmatic, eigen, bison +- **Solvers**: mumps, scalapack, slepc +- 
**Partitioners**: metis, parmetis, ptscotch +- **MPI**: Spack's OpenMPI (`--with-mpi-dir`) +- **HDF5**: downloaded and built with MPI support +- **BLAS/LAPACK**: fblaslapack (Rocky Linux 8 has no guaranteed system BLAS) +- **cmake**: downloaded (not in Spack) +- **petsc4py**: built during configure (`--with-petsc4py=1`) + +--- + +## Activating the Environment + +In every new session (interactive or job), source the install script: + +```bash +source ~/install_scripts/uw3_install_kaiju_amr.sh +``` + +This: +1. Loads `spack openmpi@4.1.6` +2. Activates the pixi `kaiju` environment via `pixi shell-hook` +3. Sets `PETSC_DIR`, `PETSC_ARCH`, and `PYTHONPATH` for petsc4py +4. Sets `PMIX_MCA_psec=native` and `OMPI_MCA_btl_tcp_if_include=eno1` + +{note} +`pixi shell-hook` is used instead of `pixi shell` because it activates the environment in the current shell without spawning a new one. This is required for Slurm batch jobs. +{/note} + +--- + +## Running with Slurm + +Use `kaiju-admin-notes/uw3_slurm_job.sh` as your job script template. + +### Submitting a job + +```bash +sbatch uw3_slurm_job.sh +``` + +Monitor progress: + +```bash +squeue -u $USER +tail -f uw3_.out +``` + +### The `srun` invocation + +`--mpi=pmix` is **required** on Kaiju (Spack has `pmix@5.0.3`): + +```bash +srun --mpi=pmix python3 my_model.py +``` + +### Scaling examples + +```bash +# 1 node, 30 ranks +sbatch --nodes=1 --ntasks-per-node=30 uw3_slurm_job.sh + +# 4 nodes, 120 ranks +sbatch --nodes=4 --ntasks-per-node=30 uw3_slurm_job.sh +``` + +--- + +## Troubleshooting + +### `import underworld3` fails on compute nodes + +Sourcing the install script in the job script (not the login shell) ensures all paths propagate to compute nodes. The `uw3_slurm_job.sh` template does this correctly. + +### h5py HDF5 version mismatch + +h5py must be built against the same HDF5 that PETSc built. 
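One way to inspect which HDF5 h5py was compiled against (assuming the environment is active; `h5py.version.info` is h5py's own build summary, which also reports the runtime HDF5 and numpy versions):

```bash
# Print h5py's build summary, including the HDF5 version it was compiled
# against — this should match the HDF5 under $PETSC_DIR/$PETSC_ARCH.
python3 -c "import h5py; print(h5py.version.info)"
```
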
If you see HDF5 errors, rebuild: + +```bash +source uw3_install_kaiju_amr.sh +install_h5py +``` + +### PETSc needs rebuilding after Spack module update + +PETSc links against Spack's OpenMPI at build time. If `openmpi@4.1.6` is reinstalled or updated, rebuild PETSc: + +```bash +source uw3_install_kaiju_amr.sh +rm -rf ~/uw3-installation/underworld3/petsc-custom/petsc +install_petsc +install_h5py +``` + +### Checking what's installed + +```bash +source uw3_install_kaiju_amr.sh +verify_install +``` + +--- + +## Rebuilding Underworld3 after source changes + +After pulling new UW3 code: + +```bash +source uw3_install_kaiju_amr.sh +cd ~/uw3-installation/underworld3 +git pull +pip install -e . +``` + +--- + +## Related + +- [Development Setup](development-setup.md) — local development with pixi +- [Branching Strategy](branching-strategy.md) — git workflow +- [Parallel Computing](../../advanced/parallel-computing.md) — writing parallel-safe UW3 code diff --git a/docs/developer/index.md b/docs/developer/index.md index 823ff7a2..faff147f 100644 --- a/docs/developer/index.md +++ b/docs/developer/index.md @@ -114,6 +114,7 @@ guides/SPELLING_CONVENTION guides/version-management guides/branching-strategy guides/BINDER_CONTAINER_SETUP +guides/kaiju-cluster-setup ``` ```{toctree} diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh new file mode 100644 index 00000000..c1901f28 --- /dev/null +++ b/petsc-custom/build-petsc-kaiju.sh @@ -0,0 +1,178 @@ +#!/bin/bash +# +# Build PETSc with AMR tools for the Kaiju cluster (Rocky Linux 8, Spack OpenMPI) +# +# Differences from build-petsc.sh (local macOS/pixi): +# --with-mpi-dir → spack OpenMPI (not pixi's MPICH) +# --download-hdf5 → PETSc downloads HDF5 (not provided by pixi) +# --download-fblaslapack → no guaranteed system BLAS on Rocky Linux 8 +# --download-cmake → spack does not have cmake +# --with-petsc4py → built during configure (not a separate step) +# +# This script builds the same AMR tool set as 
build-petsc.sh: +# pragmatic, mmg, parmmg, slepc, mumps, metis, parmetis, ptscotch, scalapack +# +# Usage (must be inside a pixi kaiju shell with spack OpenMPI loaded): +# spack load openmpi@4.1.6 +# pixi shell -e kaiju +# ./build-petsc-kaiju.sh # Full build +# ./build-petsc-kaiju.sh configure # Just reconfigure +# ./build-petsc-kaiju.sh build # Just make +# ./build-petsc-kaiju.sh clean # Remove PETSc directory +# +# Build time: ~1 hour +# +set -e + +SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" +PETSC_DIR="${SCRIPT_DIR}/petsc" +PETSC_ARCH="petsc-4-uw" + +# Require spack OpenMPI to be loaded +if ! command -v mpicc &>/dev/null; then + echo "Error: mpicc not found. Load spack OpenMPI first:" + echo " spack load openmpi@4.1.6" + exit 1 +fi + +MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")" + +# Require pixi kaiju environment +if [ -z "$PIXI_ENVIRONMENT" ] || [ "$PIXI_ENVIRONMENT" != "kaiju" ]; then + echo "Error: must be run inside the pixi kaiju environment" + echo " pixi shell -e kaiju" + exit 1 +fi + +echo "==========================================" +echo "PETSc AMR Build Script (Kaiju)" +echo "==========================================" +echo "PETSC_DIR: $PETSC_DIR" +echo "PETSC_ARCH: $PETSC_ARCH" +echo "MPI_DIR: $MPI_DIR" +echo "==========================================" + +clone_petsc() { + if [ -d "$PETSC_DIR" ]; then + echo "PETSc directory already exists. Skipping clone." + echo "To force fresh clone, run: ./build-petsc-kaiju.sh clean" + return 0 + fi + + echo "Cloning PETSc release branch..." + git clone -b release https://gitlab.com/petsc/petsc.git "$PETSC_DIR" + echo "Clone complete." +} + +configure_petsc() { + echo "Configuring PETSc with AMR tools..." 
+ cd "$PETSC_DIR" + + # Downloads and builds: + # AMR: mmg, parmmg, pragmatic, eigen, bison + # Solvers: mumps, scalapack, slepc + # Partitions: metis, parmetis, ptscotch + # BLAS/LAPACK: fblaslapack (Rocky Linux 8 has no guaranteed system BLAS) + # HDF5: downloaded (not provided by pixi in kaiju env) + # cmake: downloaded (spack does not have cmake) + # MPI: spack OpenMPI (not downloaded) + # petsc4py: built during configure + python3 ./configure \ + --with-petsc-arch="$PETSC_ARCH" \ + --with-debugging=0 \ + --with-mpi-dir="$MPI_DIR" \ + --download-mpich=0 \ + --download-mpi4py=0 \ + --download-hdf5=1 \ + --download-fblaslapack=1 \ + --download-cmake=1 \ + --download-bison=1 \ + --download-eigen=1 \ + --download-metis=1 \ + --download-parmetis=1 \ + --download-mumps=1 \ + --download-scalapack=1 \ + --download-slepc=1 \ + --download-ptscotch=1 \ + --download-mmg=1 \ + --download-parmmg=1 \ + --download-pragmatic=1 \ + --with-pragmatic=1 \ + --with-petsc4py=1 \ + --with-x=0 \ + --with-make-np=40 + + echo "Configure complete." +} + +build_petsc() { + echo "Building PETSc..." + cd "$PETSC_DIR" + + export PETSC_DIR + export PETSC_ARCH + + make all + echo "PETSc build complete." +} + +test_petsc() { + echo "Testing PETSc..." + cd "$PETSC_DIR" + + export PETSC_DIR + export PETSC_ARCH + + make check + echo "PETSc tests complete." +} + +clean_petsc() { + echo "Removing PETSc directory..." + if [ -d "$PETSC_DIR" ]; then + rm -rf "$PETSC_DIR" + echo "Cleaned." + else + echo "Nothing to clean." 
+ fi +} + +show_help() { + echo "Usage: $0 [command]" + echo "" + echo "Commands:" + echo " (none) Full build: clone, configure, build" + echo " clone Clone PETSc repository" + echo " configure Configure PETSc with AMR tools" + echo " build Build PETSc" + echo " test Run PETSc tests" + echo " clean Remove PETSc directory" + echo " help Show this help" +} + +case "${1:-all}" in + all) + clone_petsc + configure_petsc + build_petsc + echo "" + echo "==========================================" + echo "PETSc AMR build complete!" + echo "Set these environment variables:" + echo " export PETSC_DIR=$PETSC_DIR" + echo " export PETSC_ARCH=$PETSC_ARCH" + echo " export PYTHONPATH=\$PETSC_DIR/\$PETSC_ARCH/lib:\$PYTHONPATH" + echo "==========================================" + ;; + clone) clone_petsc ;; + configure) configure_petsc ;; + build) build_petsc ;; + test) test_petsc ;; + clean) clean_petsc ;; + help|--help|-h) show_help ;; + *) + echo "Unknown command: $1" + show_help + exit 1 + ;; +esac diff --git a/pixi.toml b/pixi.toml index 26c7b8d7..ddd20fbf 100644 --- a/pixi.toml +++ b/pixi.toml @@ -229,6 +229,18 @@ PETSC_ARCH = "petsc-4-uw-openmpi" petsc-local-build = { cmd = "./build-petsc.sh", cwd = "petsc-custom" } petsc-local-clean = { cmd = "./build-petsc.sh clean", cwd = "petsc-custom" } +# ============================================ +# KAIJU CLUSTER FEATURE +# ============================================ +# For the Kaiju HPC cluster (Rocky Linux 8, Spack OpenMPI, Slurm) +# Pure Python only — base dependencies cover all pure-Python needs. 
+# mpi4py, h5py, petsc, petsc4py are built from source against +# spack's OpenMPI using petsc-custom/build-petsc-kaiju.sh +# See: docs/developer/guides/kaiju-cluster-setup.md + +[feature.kaiju] +platforms = ["linux-64"] + # ============================================ # RUNTIME FEATURE (for tutorials/examples) # ============================================ @@ -312,3 +324,7 @@ openmpi-dev = { features = ["conda-petsc-openmpi", "runtime", "dev"], solve-gr amr-openmpi = { features = ["amr-openmpi"], solve-group = "amr-openmpi" } amr-openmpi-dev = { features = ["amr-openmpi", "runtime", "dev"], solve-group = "amr-openmpi" } + +# --- Kaiju Cluster Track (linux-64 only) --- +# Pure Python from pixi; MPI/PETSc/h5py built from source against spack OpenMPI +kaiju = { features = ["kaiju"], solve-group = "kaiju" } From d22d37be6228e5300a419f7d7aa4509d00faf115 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 11 Mar 2026 20:00:16 +1100 Subject: [PATCH 02/26] Fix pixi env check and use PATH instead of PIXI_ENVIRONMENT --- petsc-custom/build-petsc-kaiju.sh | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh index c1901f28..7e4cde86 100644 --- a/petsc-custom/build-petsc-kaiju.sh +++ b/petsc-custom/build-petsc-kaiju.sh @@ -38,9 +38,10 @@ fi MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")" # Require pixi kaiju environment -if [ -z "$PIXI_ENVIRONMENT" ] || [ "$PIXI_ENVIRONMENT" != "kaiju" ]; then +# Check PATH since PIXI_ENVIRONMENT is not set by pixi shell-hook (only by pixi shell) +if ! 
echo "$PATH" | tr ':' '\n' | grep -q "\.pixi/envs/kaiju/bin"; then echo "Error: must be run inside the pixi kaiju environment" - echo " pixi shell -e kaiju" + echo " source uw3_install_kaiju_amr.sh (sets up env via pixi shell-hook)" exit 1 fi From 7c3fdcbe5266dfbf845a96a28b23a2e7c8bbde13 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 11 Mar 2026 20:08:29 +1100 Subject: [PATCH 03/26] Added CC=mpicc --- petsc-custom/build-petsc-kaiju.sh | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh index 7e4cde86..2fcce68f 100644 --- a/petsc-custom/build-petsc-kaiju.sh +++ b/petsc-custom/build-petsc-kaiju.sh @@ -78,10 +78,15 @@ configure_petsc() { # cmake: downloaded (spack does not have cmake) # MPI: spack OpenMPI (not downloaded) # petsc4py: built during configure + # Pass MPI compilers via CC/CXX/FC env vars rather than --with-mpi-dir. + # Spack's mpicc wrapper works in the current shell (spack load sets the + # required env), but PETSc configure tests wrappers in a subprocess that + # may not inherit the full spack environment, causing --with-mpi-dir to fail. 
+ CC=mpicc CXX=mpicxx FC=mpif90 \ python3 ./configure \ --with-petsc-arch="$PETSC_ARCH" \ --with-debugging=0 \ - --with-mpi-dir="$MPI_DIR" \ + --with-mpi=1 \ --download-mpich=0 \ --download-mpi4py=0 \ --download-hdf5=1 \ From f36b5b71ea93a60c6a19839aa3b1edfd1c5a648a Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 11 Mar 2026 20:16:52 +1100 Subject: [PATCH 04/26] Modified PETSc MPI detection --- petsc-custom/build-petsc-kaiju.sh | 16 +++++----------- 1 file changed, 5 insertions(+), 11 deletions(-) diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh index 2fcce68f..afce9574 100644 --- a/petsc-custom/build-petsc-kaiju.sh +++ b/petsc-custom/build-petsc-kaiju.sh @@ -3,7 +3,7 @@ # Build PETSc with AMR tools for the Kaiju cluster (Rocky Linux 8, Spack OpenMPI) # # Differences from build-petsc.sh (local macOS/pixi): -# --with-mpi-dir → spack OpenMPI (not pixi's MPICH) +# MPI auto-detected from PATH (spack load puts mpicc in PATH; no --with-mpi-dir needed) # --download-hdf5 → PETSc downloads HDF5 (not provided by pixi) # --download-fblaslapack → no guaranteed system BLAS on Rocky Linux 8 # --download-cmake → spack does not have cmake @@ -35,8 +35,6 @@ if ! command -v mpicc &>/dev/null; then exit 1 fi -MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")" - # Require pixi kaiju environment # Check PATH since PIXI_ENVIRONMENT is not set by pixi shell-hook (only by pixi shell) if ! 
echo "$PATH" | tr ':' '\n' | grep -q "\.pixi/envs/kaiju/bin"; then @@ -50,7 +48,7 @@ echo "PETSc AMR Build Script (Kaiju)" echo "==========================================" echo "PETSC_DIR: $PETSC_DIR" echo "PETSC_ARCH: $PETSC_ARCH" -echo "MPI_DIR: $MPI_DIR" +echo "mpicc: $(which mpicc)" echo "==========================================" clone_petsc() { @@ -78,17 +76,13 @@ configure_petsc() { # cmake: downloaded (spack does not have cmake) # MPI: spack OpenMPI (not downloaded) # petsc4py: built during configure - # Pass MPI compilers via CC/CXX/FC env vars rather than --with-mpi-dir. - # Spack's mpicc wrapper works in the current shell (spack load sets the - # required env), but PETSc configure tests wrappers in a subprocess that - # may not inherit the full spack environment, causing --with-mpi-dir to fail. - CC=mpicc CXX=mpicxx FC=mpif90 \ + # No --with-mpi-dir or --with-mpi flags: PETSc auto-detects mpicc from PATH. + # spack load openmpi@4.1.6 (called in load_env) puts mpicc in PATH. + # --download-mpich=0 prevents fallback to downloading MPICH. 
python3 ./configure \ --with-petsc-arch="$PETSC_ARCH" \ --with-debugging=0 \ - --with-mpi=1 \ --download-mpich=0 \ - --download-mpi4py=0 \ --download-hdf5=1 \ --download-fblaslapack=1 \ --download-cmake=1 \ From 2b2abaa3b137850e177345b0ca236ecc1c458cee Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 11 Mar 2026 20:23:26 +1100 Subject: [PATCH 05/26] Fix LD_LIBRARY_PATH for spack OpenMPI --- petsc-custom/build-petsc-kaiju.sh | 10 ++++++---- 1 file changed, 6 insertions(+), 4 deletions(-) diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh index afce9574..d9750aa4 100644 --- a/petsc-custom/build-petsc-kaiju.sh +++ b/petsc-custom/build-petsc-kaiju.sh @@ -76,13 +76,15 @@ configure_petsc() { # cmake: downloaded (spack does not have cmake) # MPI: spack OpenMPI (not downloaded) # petsc4py: built during configure - # No --with-mpi-dir or --with-mpi flags: PETSc auto-detects mpicc from PATH. - # spack load openmpi@4.1.6 (called in load_env) puts mpicc in PATH. - # --download-mpich=0 prevents fallback to downloading MPICH. + # MPI_DIR is computed from `which mpicc` (spack OpenMPI in PATH). + # LD_LIBRARY_PATH must include $MPI_DIR/lib so PETSc configure test binaries + # can find libmpi.so at runtime (spack uses RPATH for its own binaries but + # does not set LD_LIBRARY_PATH — load_env in uw3_install_kaiju_amr.sh sets it). 
+ MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")" python3 ./configure \ --with-petsc-arch="$PETSC_ARCH" \ --with-debugging=0 \ - --download-mpich=0 \ + --with-mpi-dir="$MPI_DIR" \ --download-hdf5=1 \ --download-fblaslapack=1 \ --download-cmake=1 \ From 6498cea0d6ce6eb2bd5d8717c4f4386e60126009 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 11 Mar 2026 20:54:46 +1100 Subject: [PATCH 06/26] Add MMG_INSTALL_PRIVATE_HEADERS=ON for PARMMG (kaiju only) --- petsc-custom/build-petsc-kaiju.sh | 1 + 1 file changed, 1 insertion(+) diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh index d9750aa4..0966ee5b 100644 --- a/petsc-custom/build-petsc-kaiju.sh +++ b/petsc-custom/build-petsc-kaiju.sh @@ -97,6 +97,7 @@ configure_petsc() { --download-slepc=1 \ --download-ptscotch=1 \ --download-mmg=1 \ + --download-mmg-cmake-arguments="-DMMG_INSTALL_PRIVATE_HEADERS=ON" \ --download-parmmg=1 \ --download-pragmatic=1 \ --with-pragmatic=1 \ From 188d1ed912fe852a1e4f26d27a5e1d88f5fa13f6 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 11 Mar 2026 21:44:03 +1100 Subject: [PATCH 07/26] Disable SCOTCH in MMG build to fix PARMMG configure on Kaiju pixi's conda ld (14.3.0) requires explicit transitive shared lib deps. libmmg.so built with SCOTCH caused MMG_WORKS link test to fail in PARMMG's FindMMG.cmake because libscotch.so wasn't explicitly linked. MMG's SCOTCH is only used for mesh renumbering (optional perf feature); PARMMG uses ptscotch separately for parallel partitioning, unaffected. 
Co-Authored-By: Claude Sonnet 4.6 --- petsc-custom/build-petsc-kaiju.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh index 0966ee5b..d46852ba 100644 --- a/petsc-custom/build-petsc-kaiju.sh +++ b/petsc-custom/build-petsc-kaiju.sh @@ -97,7 +97,7 @@ configure_petsc() { --download-slepc=1 \ --download-ptscotch=1 \ --download-mmg=1 \ - --download-mmg-cmake-arguments="-DMMG_INSTALL_PRIVATE_HEADERS=ON" \ + --download-mmg-cmake-arguments="-DMMG_INSTALL_PRIVATE_HEADERS=ON -DUSE_SCOTCH=OFF" \ --download-parmmg=1 \ --download-pragmatic=1 \ --with-pragmatic=1 \ From f26da685b6ab074f13ca0cec425830693b9724b5 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Thu, 12 Mar 2026 10:46:22 +1100 Subject: [PATCH 08/26] Update kaiju cluster setup docs with shared install and troubleshooting - Add shared installation section (admin, Lmod module) - Add troubleshooting entries from install experience: h5py replacing mpi4py, numpy ABI mismatch, PARMMG/pixi ld issue Underworld development team with AI support from Claude Code --- docs/developer/guides/kaiju-cluster-setup.md | 52 ++++++++++++++++++++ 1 file changed, 52 insertions(+) diff --git a/docs/developer/guides/kaiju-cluster-setup.md b/docs/developer/guides/kaiju-cluster-setup.md index 6d9c34ab..d5ca8129 100644 --- a/docs/developer/guides/kaiju-cluster-setup.md +++ b/docs/developer/guides/kaiju-cluster-setup.md @@ -164,6 +164,23 @@ sbatch --nodes=4 --ntasks-per-node=30 uw3_slurm_job.sh --- +## Shared Installation (Admin) + +A system-wide installation can be deployed to `/opt/cluster/software/underworld3/` so all users access it via Lmod: + +```bash +module load underworld3/development-12Mar26 +``` + +Use `uw3_install_kaiju_shared.sh` from the `kaiju-admin-notes` repo. 
It is identical to the per-user script except: +- `INSTALL_PATH=/opt/cluster/software` +- Adds `fix_permissions()` — sets world-readable permissions after install +- Adds `install_modulefile()` — copies the Lmod modulefile with a date-stamped name + +The Lmod modulefile (`modulefiles/underworld3/development.lua`) hardcodes the spack OpenMPI and pixi env paths. If spack is rebuilt (hash changes), update `mpi_root` in the modulefile. + +--- + ## Troubleshooting ### `import underworld3` fails on compute nodes @@ -190,6 +207,41 @@ install_petsc install_h5py ``` +### h5py replaces source-built mpi4py + +`pip install h5py` without `--no-deps` silently replaces the source-built mpi4py (spack OpenMPI) with a pre-built wheel linked to a different MPI. Always use `--no-deps` when installing h5py. The install script handles this correctly. + +If mpi4py was accidentally replaced, rebuild it from source: +```bash +source uw3_install_kaiju_amr.sh +pip install --no-binary :all: --no-cache-dir --force-reinstall "mpi4py>=4,<5" +``` + +Verify it links to spack OpenMPI: +```bash +ldd $(python3 -c "import mpi4py; print(mpi4py.__file__.replace('__init__.py',''))") \ + MPI*.so | grep mpi +# Should show: libmpi.so.40 => /opt/cluster/spack/.../openmpi-4.1.6-.../lib/libmpi.so.40 +``` + +### numpy ABI mismatch after h5py install + +If numpy is upgraded after petsc4py is compiled, `import petsc4py` fails with: +``` +ValueError: numpy.dtype size changed, may indicate binary incompatibility. +``` + +Fix: restore the numpy version used during the PETSc build, then rebuild h5py: +```bash +pip install --force-reinstall "numpy==1.26.4" +CC=mpicc HDF5_MPI="ON" HDF5_DIR="${PETSC_DIR}/${PETSC_ARCH}" \ + pip install --no-binary=h5py --no-cache-dir --force-reinstall --no-deps h5py +``` + +### PARMMG configure failure (pixi ld + spack transitive deps) + +pixi's conda linker (`ld` 14.x) requires transitive shared library dependencies to be explicitly linked. 
`libmmg.so` built with SCOTCH support causes PARMMG's `MMG_WORKS` link test to fail because `libscotch.so` is not explicitly passed. This is fixed in `petsc-custom/build-petsc-kaiju.sh` by building MMG without SCOTCH (`-DUSE_SCOTCH=OFF`). PARMMG uses ptscotch separately for parallel partitioning, which is unaffected. + ### Checking what's installed ```bash From 322e9161783124e5e86ef6dedd9a3c02327feafb Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Thu, 12 Mar 2026 13:45:42 +1100 Subject: [PATCH 09/26] Added info re. shared installation in Kaiju cluster --- docs/developer/guides/kaiju-cluster-setup.md | 43 +++++++++++++++++--- 1 file changed, 37 insertions(+), 6 deletions(-) diff --git a/docs/developer/guides/kaiju-cluster-setup.md b/docs/developer/guides/kaiju-cluster-setup.md index d5ca8129..2b54c3d1 100644 --- a/docs/developer/guides/kaiju-cluster-setup.md +++ b/docs/developer/guides/kaiju-cluster-setup.md @@ -129,12 +129,18 @@ This: ## Running with Slurm -Use `kaiju-admin-notes/uw3_slurm_job.sh` as your job script template. +Two job script templates are available in `kaiju-admin-notes`: + +| Script | Use when | +|--------|----------| +| `uw3_slurm_job.sh` | Per-user install (sources `uw3_install_kaiju_amr.sh`) | +| `uw3_slurm_job_shared.sh` | Shared install (`module load underworld3/...`) | ### Submitting a job ```bash -sbatch uw3_slurm_job.sh +sbatch uw3_slurm_job.sh # per-user install +sbatch uw3_slurm_job_shared.sh # shared install ``` Monitor progress: @@ -166,18 +172,43 @@ sbatch --nodes=4 --ntasks-per-node=30 uw3_slurm_job.sh ## Shared Installation (Admin) -A system-wide installation can be deployed to `/opt/cluster/software/underworld3/` so all users access it via Lmod: +A system-wide installation can be deployed to `/opt/cluster/software/underworld3/` so all users access it via Environment Modules: ```bash module load underworld3/development-12Mar26 ``` -Use `uw3_install_kaiju_shared.sh` from the `kaiju-admin-notes` repo. 
It is identical to the per-user script except: +Run as an admin with write access to `/opt/cluster/software`: + +```bash +source uw3_install_kaiju_shared.sh install +``` + +This script is identical to the per-user script except: - `INSTALL_PATH=/opt/cluster/software` - Adds `fix_permissions()` — sets world-readable permissions after install -- Adds `install_modulefile()` — copies the Lmod modulefile with a date-stamped name +- Adds `install_modulefile()` — copies the TCL modulefile with a date-stamped name to `/opt/cluster/modulefiles/underworld3/` -The Lmod modulefile (`modulefiles/underworld3/development.lua`) hardcodes the spack OpenMPI and pixi env paths. If spack is rebuilt (hash changes), update `mpi_root` in the modulefile. +The modulefile (`modulefiles/underworld3/development.tcl`) hardcodes the spack OpenMPI and pixi env paths. If spack is rebuilt (hash changes), update `mpi_root` in the modulefile. + +### Slurm job script (shared install) + +Users with the shared install should use `uw3_slurm_job_shared.sh`: + +```bash +# Edit UW3_MODULE and SCRIPT at the top, then: +sbatch uw3_slurm_job_shared.sh +``` + +The key difference from the per-user job script is environment setup: + +```bash +# Shared install: load module +module load underworld3/development-12Mar26 + +# Per-user install: source install script +source ~/install_scripts/uw3_install_kaiju_amr.sh +``` --- From 04fca69c14d638a50847e07151d559194f90a3d1 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Thu, 12 Mar 2026 14:01:12 +1100 Subject: [PATCH 10/26] Added link to admin-repo --- docs/developer/guides/kaiju-cluster-setup.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/developer/guides/kaiju-cluster-setup.md b/docs/developer/guides/kaiju-cluster-setup.md index 2b54c3d1..760f7435 100644 --- a/docs/developer/guides/kaiju-cluster-setup.md +++ b/docs/developer/guides/kaiju-cluster-setup.md @@ -54,7 +54,7 @@ curl -fsSL https://pixi.sh/install.sh | bash ## 
Installation -Use the install script at `kaiju-admin-notes/uw3_install_kaiju_amr.sh`. +Use the install script at `uw3_install_kaiju_amr.sh` from the [kaiju-admin-notes](https://github.com/jcgraciosa/kaiju-admin-notes) repo. ### Step 1: Edit configuration @@ -129,7 +129,7 @@ This: ## Running with Slurm -Two job script templates are available in `kaiju-admin-notes`: +Two job script templates are available in the [kaiju-admin-notes](https://github.com/jcgraciosa/kaiju-admin-notes) repo: | Script | Use when | |--------|----------| From 2c60aaa75552e7ecdb22d7d38d02eecc45f7189e Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Tue, 17 Mar 2026 20:55:00 +1100 Subject: [PATCH 11/26] Added pixi env for Gadi baremetal install --- pixi.lock | 532 ++++++++++++++++++++++++++++++++++++++++++++++++++++++ pixi.toml | 16 ++ 2 files changed, 548 insertions(+) diff --git a/pixi.lock b/pixi.lock index 00800d32..1f79a946 100644 --- a/pixi.lock +++ b/pixi.lock @@ -7606,6 +7606,342 @@ environments: - pypi: https://files.pythonhosted.org/packages/6e/67/9d4ac4b0d683aaa4170da59a1980740b281fd38fc253e1830fde4dac3d4f/pygmsh-7.1.17-py3-none-any.whl - pypi: https://files.pythonhosted.org/packages/b1/09/0ab0853d6d634455fe70d90a306162160ead7592eceaca194168a16d3beb/sphinx_math_dollar-1.3-py3-none-any.whl - pypi: https://files.pythonhosted.org/packages/03/46/25d64bcd7821c8d6f1080e1c43d5fcdfc442a18f759a230b5ccdc891093e/sphinxcontrib_mermaid-2.0.1-py3-none-any.whl + kaiju: + channels: + - url: https://conda.anaconda.org/conda-forge/ + indexes: + - https://pypi.org/simple + options: + pypi-prerelease-mode: if-necessary-or-explicit + packages: + linux-64: + - conda: https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-20_gnu.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/_python_abi3_support-1.0-hd8ed1ab_2.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/alabaster-1.0.0-pyhd8ed1ab_1.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/alsa-lib-1.2.15.3-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/annotated-types-0.7.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/anyio-4.12.1-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/argon2-cffi-25.1.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/argon2-cffi-bindings-25.1.0-py312h4c3975b_2.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/arrow-1.4.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/asttokens-3.0.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/attr-2.5.2-h39aace5_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/attrs-25.4.0-pyhcf101f3_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/babel-2.18.0-pyhcf101f3_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/backports.zstd-1.3.0-py312h90b7ffd_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/beautifulsoup4-4.14.3-pyha770c72_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils-2.45.1-default_h4852527_101.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_impl_linux-64-2.45.1-default_hfdba357_101.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_linux-64-2.45.1-default_h4852527_101.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/bleach-6.3.0-pyhcf101f3_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/bleach-with-css-6.3.0-hbca2aae_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/blosc-1.21.6-he440d0b_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/brotli-1.2.0-hed03a55_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/brotli-bin-1.2.0-hb03c661_1.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/brotli-python-1.2.0-py312hdb49522_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-hda65f42_9.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/c-ares-1.34.6-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/c-compiler-1.11.0-h4d9bdce_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/ca-certificates-2026.2.25-hbd8a1cb_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/cached-property-1.5.2-hd8ed1ab_1.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/cached_property-1.5.2-pyha770c72_1.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/linux-64/cairo-1.18.4-he90730b_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/certifi-2026.2.25-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/cffi-2.0.0-py312h460c074_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/cftime-1.6.5-py312h4f23490_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.4.5-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/comm-0.2.3-pyhe01879c_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/compilers-1.11.0-ha770c72_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/conda-gcc-specs-14.3.0-he8ccf15_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/contourpy-1.3.3-py312h0a2e395_4.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/cpython-3.12.13-py312hd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/cxx-compiler-1.11.0-hfcd1e18_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/cycler-0.12.1-pyhcf101f3_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/cyrus-sasl-2.1.28-hac629b4_1.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/cython-3.2.4-py312h68e6be4_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/dbus-1.16.2-h24cb091_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/debugpy-1.8.20-py312h8285ef7_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/decorator-5.2.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/defusedxml-0.7.1-pyhd8ed1ab_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/docutils-0.21.2-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/double-conversion-3.4.0-hecca717_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/exceptiongroup-1.3.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/execnet-2.1.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/executing-2.2.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/flexcache-0.3-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/flexparser-0.4-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-dejavu-sans-mono-2.37-hab24e00_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-inconsolata-3.000-h77eed37_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-source-code-pro-2.038-h77eed37_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/font-ttf-ubuntu-0.83-h77eed37_3.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/fontconfig-2.17.1-h27c8c51_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-ecosystem-1-0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/fonts-conda-forge-1-hc364b38_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.62.0-py312h8a5da7c_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/fortran-compiler-1.11.0-h9bea470_0.conda + - 
conda: https://conda.anaconda.org/conda-forge/noarch/fqdn-1.5.1-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/freetype-2.14.2-ha770c72_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gcc-14.3.0-h0dff253_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gcc_impl_linux-64-14.3.0-hbdf3cc3_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gcc_linux-64-14.3.0-h298d278_21.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gfortran-14.3.0-h76987e4_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gfortran_impl_linux-64-14.3.0-h1a219da_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gfortran_linux-64-14.3.0-hfa02b96_21.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gmp-6.3.0-hac33072_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gmpy2-2.3.0-py312hcaba1f9_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/graphite2-1.3.14-hecca717_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gxx-14.3.0-h76987e4_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gxx_impl_linux-64-14.3.0-h2185e75_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/gxx_linux-64-14.3.0-he467f4b_21.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/h2-4.3.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/h5py-3.15.1-nompi_py312ha4f8f14_101.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-13.1.0-h6083320_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/hdf4-4.2.15-h2a13503_7.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/hdf5-1.14.6-nompi_h19486de_106.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/hpack-4.1.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.1.0-pyhd8ed1ab_0.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/icu-78.2-h33c6efd_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.11-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.7.0-pyhe01879c_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.7.0-h40b2b14_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.3.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/ipykernel-6.31.0-pyha191276_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/ipython-9.11.0-pyhecfbec7_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/ipython_pygments_lexers-1.1.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/isoduration-20.11.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jedi-0.19.2-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jinja2-3.1.6-pyhcf101f3_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jsonpointer-3.0.0-pyhcf101f3_3.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-4.26.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-specifications-2025.9.1-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jsonschema-with-format-nongpl-4.26.0-hcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_client-8.8.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_core-5.9.1-pyhc90fa1f_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_events-0.12.0-pyhe01879c_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jupyter_server-2.17.0-pyhcf101f3_0.conda + - conda: 
https://conda.anaconda.org/conda-forge/noarch/jupyter_server_terminals-0.5.4-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/jupyterlab_pygments-0.3.0-pyhd8ed1ab_2.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/kernel-headers_linux-64-4.18.0-he073ed8_9.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/keyutils-1.6.3-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/kiwisolver-1.5.0-py312h0a2e395_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/krb5-1.22.2-ha1258a1_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/lark-1.3.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/lcms2-2.18-h0c24ade_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.45.1-default_hbd61a6d_101.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/lerc-4.1.0-hdb68285_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libaec-1.1.5-h088129d_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libblas-3.11.0-5_h4a7cf45_openblas.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libbrotlicommon-1.2.0-hb03c661_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libbrotlidec-1.2.0-hb03c661_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libbrotlienc-1.2.0-hb03c661_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.11.0-5_h0358290_openblas.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libclang-cpp22.1-22.1.0-default_h99862b1_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libclang13-22.1.0-default_h746c552_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libcups-2.3.3-h7a8fb5f_6.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libcurl-8.19.0-hcf29cc6_0.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/libdeflate-1.25-h17f619e_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libdrm-2.4.125-hb03c661_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libedit-3.1.20250104-pl5321h7949ede_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libegl-1.7.0-ha4b6fd6_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libev-4.33-hd590300_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libexpat-2.7.4-hecca717_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libffi-3.5.2-h3435931_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libfreetype-2.14.2-ha770c72_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libfreetype6-2.14.2-h73754d4_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgcc-15.2.0-he0feb66_18.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/libgcc-devel_linux-64-14.3.0-hf649bbc_118.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-15.2.0-h69a702a_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgfortran-15.2.0-h69a702a_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-15.2.0-h68bc16d_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgl-1.7.0-ha4b6fd6_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libglib-2.86.4-h6548e54_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libglvnd-1.7.0-ha4b6fd6_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libglx-1.7.0-ha4b6fd6_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libgomp-15.2.0-he0feb66_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.18-h3b78370_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.1.2-hb03c661_0.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.11.0-5_h47877c9_openblas.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libllvm22-22.1.0-hf7376ad_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/liblzma-5.8.2-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.10.0-nompi_hbf2fc22_100.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libnghttp2-1.67.0-had1ee68_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.1-hb9d3cd8_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libntlm-1.8-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libopenblas-0.3.30-pthreads_h94d23a6_4.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libopengl-1.7.0-ha4b6fd6_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libpciaccess-0.18-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.55-h421ea60_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libpq-18.3-h9abb657_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libsanitizer-14.3.0-h8f1669f_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libsodium-1.0.21-h280c20c_3.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.52.0-hf4e2dac_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libssh2-1.11.1-hcf80075_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-15.2.0-h934c35e_18.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/libstdcxx-devel_linux-64-14.3.0-h9f08a49_118.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-15.2.0-hdf11a46_18.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.7.1-h9d88235_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.41.3-h5347b49_0.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/libvulkan-loader-1.4.341.0-h5279c79_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.6.0-hd42ef1d_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.17.0-h8a09558_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libxcrypt-4.4.36-hd590300_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.13.1-hca5e8e5_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libxml2-16-2.15.2-hca6bf5a_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.15.2-he237659_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libxslt-1.1.43-h711ed8c_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libzip-1.11.2-h6991a6a_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.3.1-hb9d3cd8_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.10.0-h5888daf_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/markdown-it-py-4.0.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/markupsafe-3.0.3-py312h8a5da7c_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/matplotlib-3.10.8-py312h7900ff3_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/matplotlib-base-3.10.8-py312he3d6523_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/matplotlib-inline-0.2.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/mdurl-0.1.2-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/meshio-5.3.5-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/mistune-3.2.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/mpc-1.3.1-h24ddda3_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/mpfr-4.2.1-h90cbb55_3.conda + - conda: 
https://conda.anaconda.org/conda-forge/noarch/mpmath-1.4.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/msgspec-0.20.0-py312h4c3975b_2.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/nbclient-0.10.4-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/nbconvert-core-7.17.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/nbformat-5.10.4-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.5-h2d0b736_3.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/nest-asyncio-1.6.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/netcdf4-1.7.4-nompi_py311ha0596eb_105.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/numpy-1.26.4-py312heda63a1_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/openjpeg-2.5.4-h55fea9a_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/openldap-2.6.10-hbde042b_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/openssl-3.6.1-h35e630c_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/overrides-7.7.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/packaging-26.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pandocfilters-1.5.0-pyhd8ed1ab_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/parso-0.8.6-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pcre2-10.47-haa7fec5_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pexpect-4.9.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pillow-12.1.1-py312h50c33e8_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pint-0.24.4-pyhe01879c_2.conda + - conda: 
https://conda.anaconda.org/conda-forge/noarch/pip-25.3-pyh8b19718_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pixi-kernel-0.7.1-pyhbbac1ac_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pixman-0.46.4-h54a6638_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.9.4-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pluggy-1.6.0-pyhf9edf01_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/prometheus_client-0.24.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/prompt-toolkit-3.0.52-pyha770c72_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/psutil-7.2.2-py312h5253ce2_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-hb9d3cd8_1002.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/ptyprocess-0.7.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pure_eval-0.2.3-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/py-1.11.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pycparser-2.22-pyh29332c3_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pydantic-2.12.5-pyhcf101f3_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pydantic-core-2.41.5-py312h868fb18_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pygments-2.19.2-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pykdtree-1.4.3-py312h4f23490_2.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.3.2-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pyside6-6.10.2-py312h9da60e5_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha55dd90_7.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-8.4.2-pyhcf101f3_1.conda + - conda: 
https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-mpi-0.6-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-timeout-2.4.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-xdist-3.8.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/python-3.12.13-hd63d673_0_cpython.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.9.0.post0-pyhe01879c_2.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-fastjsonschema-2.21.2-pyhe01879c_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-gil-3.12.13-hd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-json-logger-2.0.7-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python-tzdata-2025.3-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/python-xxhash-3.6.0-py312h0d868a3_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/python_abi-3.12-8_cp312.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pyyaml-6.0.3-py312h8a5da7c_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pyzmq-27.1.0-py312hda471dd_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/qhull-2020.2-h434a139_5.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/qt6-main-6.10.2-h17e89b9_5.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/readline-8.3-h853b02a_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/referencing-0.37.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/requests-2.32.5-pyhcf101f3_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/returns-0.26.0-pyhe01879c_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/rfc3339-validator-0.1.4-pyhd8ed1ab_1.conda + - 
conda: https://conda.anaconda.org/conda-forge/noarch/rfc3986-validator-0.1.1-pyh9f0ad1d_0.tar.bz2 + - conda: https://conda.anaconda.org/conda-forge/noarch/rfc3987-syntax-1.1.0-pyhe01879c_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/rich-14.3.3-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/roman-numerals-4.1.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/roman-numerals-py-4.1.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/rpds-py-0.30.0-py312h868fb18_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/scipy-1.17.1-py312h54fa4ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/send2trash-2.1.0-pyha191276_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/setuptools-75.8.2-pyhff2d567_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/six-1.17.0-pyhe01879c_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/snappy-1.2.2-h03e3b7b_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/snowballstemmer-3.0.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/soupsieve-2.8.3-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sphinx-8.2.3-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sphinx-design-0.6.1-pyhd8ed1ab_2.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-applehelp-2.0.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-devhelp-2.0.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-htmlhelp-2.1.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-jsmath-1.0.1-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-qthelp-2.0.0-pyhd8ed1ab_1.conda + - conda: 
https://conda.anaconda.org/conda-forge/noarch/sphinxcontrib-serializinghtml-1.1.10-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/stack_data-0.6.3-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sympy-1.14.0-pyh2585a3b_106.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/sysroot_linux-64-2.28-h4ee821c_9.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/terminado-0.18.1-pyhc90fa1f_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/tinycss2-1.4.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/tk-8.6.13-noxft_h366c992_103.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/tomli-2.4.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/tornado-6.5.3-py312h4c3975b_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/traitlets-5.14.3-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/typeguard-4.5.1-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/typing-extensions-4.15.0-h396c80c_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/typing-inspection-0.4.2-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/typing_extensions-4.15.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/typing_utils-0.1.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/tzdata-2025c-hc9c84f9_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/unicodedata2-17.0.1-py312h4c3975b_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/uri-template-1.3.0-pyhd8ed1ab_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.6.3-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/wayland-1.24.0-hd6090a7_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/wcwidth-0.6.0-pyhd8ed1ab_0.conda + - conda: 
https://conda.anaconda.org/conda-forge/noarch/webcolors-25.10.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/webencodings-0.5.1-pyhd8ed1ab_3.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/websocket-client-1.9.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/wheel-0.46.3-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xcb-util-0.4.1-h4f16b4b_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xcb-util-cursor-0.1.6-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xcb-util-image-0.4.0-hb711507_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xcb-util-keysyms-0.4.1-hb711507_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xcb-util-renderutil-0.3.10-hb711507_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xcb-util-wm-0.4.2-hb711507_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xkeyboard-config-2.47-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libice-1.1.2-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libsm-1.2.6-he73a12e_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libx11-1.8.13-he1eb515_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxau-1.0.12-hb03c661_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxcomposite-0.4.7-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxcursor-1.2.3-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxdamage-1.1.6-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxdmcp-1.1.5-hb03c661_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxext-1.3.7-hb03c661_0.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/xorg-libxfixes-6.0.2-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxi-1.8.2-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxrandr-1.5.5-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxrender-0.9.12-hb9d3cd8_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxtst-1.2.5-hb9d3cd8_3.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xorg-libxxf86vm-1.1.7-hb03c661_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/xxhash-0.8.3-hb47aa4a_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/yaml-0.2.5-h280c20c_3.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/zeromq-4.3.5-h41580af_10.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.23.0-pyhcf101f3_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-ng-2.3.3-hceb46e0_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.7-hb78ec9c_6.conda + - pypi: https://files.pythonhosted.org/packages/bc/5d/339b995273c25a79bad5144ffb4fe57f4428d9ad2603942c851a66376afd/gmsh-4.15.1-py2.py3-none-manylinux_2_24_x86_64.whl + - pypi: https://files.pythonhosted.org/packages/6e/67/9d4ac4b0d683aaa4170da59a1980740b281fd38fc253e1830fde4dac3d4f/pygmsh-7.1.17-py3-none-any.whl mpich: channels: - url: https://conda.anaconda.org/conda-forge/ @@ -14955,6 +15291,23 @@ packages: - pkg:pypi/fonttools?source=hash-mapping size: 2932702 timestamp: 1765632761555 +- conda: https://conda.anaconda.org/conda-forge/linux-64/fonttools-4.62.0-py312h8a5da7c_0.conda + sha256: 777c80a1aa0889e6b637631c31f95d0b048848c5ba710f89ed7cedd3ad318227 + md5: 526f7ffd63820e55d7992cc1cf931a36 + depends: + - __glibc >=2.17,<3.0.a0 + - brotli + - libgcc >=14 + - munkres + - python >=3.12,<3.13.0a0 + - python_abi 3.12.* *_cp312 + - unicodedata2 >=15.1.0 + license: MIT + license_family: 
MIT + purls: + - pkg:pypi/fonttools?source=compressed-mapping + size: 2935817 + timestamp: 1773137546716 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fonttools-4.61.1-py312h5748b74_0.conda sha256: d87752e84621f90e9350262200fef55f054472f7779323f51717b557208e2a16 md5: c14625bf00c41c00cea174f459287fc4 @@ -15550,6 +15903,23 @@ packages: - pkg:pypi/gmpy2?source=hash-mapping size: 214554 timestamp: 1762946924209 +- conda: https://conda.anaconda.org/conda-forge/linux-64/gmpy2-2.3.0-py312hcaba1f9_1.conda + sha256: 6fbdd686d04a0d8c48efe92795137d3bba55a4325acd7931978fd8ea5e24684d + md5: fedbe80d864debab03541e1b447fc12a + depends: + - __glibc >=2.17,<3.0.a0 + - gmp >=6.3.0,<7.0a0 + - libgcc >=14 + - mpc >=1.3.1,<2.0a0 + - mpfr >=4.2.1,<5.0a0 + - python >=3.12,<3.13.0a0 + - python_abi 3.12.* *_cp312 + license: LGPL-3.0-or-later + license_family: LGPL + purls: + - pkg:pypi/gmpy2?source=compressed-mapping + size: 253171 + timestamp: 1773245116314 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/gmpy2-2.2.1-py312hee6aa52_2.conda sha256: e2f72ddb929fcd161d68729891f25241d62ab1a9d4e37d0284f2b2fce88935fa md5: bed6eebc8d1690f205a781c993f9bc65 @@ -15815,6 +16185,23 @@ packages: - pkg:pypi/h5py?source=hash-mapping size: 1296491 timestamp: 1764016696413 +- conda: https://conda.anaconda.org/conda-forge/linux-64/h5py-3.15.1-nompi_py312ha4f8f14_101.conda + sha256: bb5cefbe5b54195a54f749189fc6797568d52e8790b2f542143c681b98a92b71 + md5: 23965cb240cb534649dfe2327ecec4fa + depends: + - __glibc >=2.17,<3.0.a0 + - cached-property + - hdf5 >=1.14.6,<1.14.7.0a0 + - libgcc >=14 + - numpy >=1.23,<3 + - python >=3.12,<3.13.0a0 + - python_abi 3.12.* *_cp312 + license: BSD-3-Clause + license_family: BSD + purls: + - pkg:pypi/h5py?source=hash-mapping + size: 1290741 + timestamp: 1764016665782 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/h5py-3.15.1-mpi_mpich_py312h15326f5_1.conda sha256: cf8f8debcf495e0b9c284e673877f2fa208b02ee5c8dfc1d96268d779299adc3 md5: 
3f7f1871038d11ee3dcae08316842e23 @@ -15895,6 +16282,26 @@ packages: purls: [] size: 2035859 timestamp: 1769445400168 +- conda: https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-13.1.0-h6083320_0.conda + sha256: 08dc098dcc5c3445331a834f46602b927cb65d2768189f3f032a6e4643f15cd9 + md5: 5baf48da05855be929c5a50f4377794d + depends: + - __glibc >=2.17,<3.0.a0 + - cairo >=1.18.4,<2.0a0 + - graphite2 >=1.3.14,<2.0a0 + - icu >=78.2,<79.0a0 + - libexpat >=2.7.4,<3.0a0 + - libfreetype >=2.14.2 + - libfreetype6 >=2.14.2 + - libgcc >=14 + - libglib >=2.86.4,<3.0a0 + - libstdcxx >=14 + - libzlib >=1.3.1,<2.0a0 + license: MIT + license_family: MIT + purls: [] + size: 2615630 + timestamp: 1773217509651 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/harfbuzz-12.2.0-haf38c7b_0.conda sha256: 2f8d95fe1cb655fe3bac114062963f08cc77b31b042027ef7a04ebde3ce21594 md5: 1c7ff9d458dd8220ac2ee71dd4af1be5 @@ -16015,6 +16422,24 @@ packages: purls: [] size: 3885031 timestamp: 1770390958500 +- conda: https://conda.anaconda.org/conda-forge/linux-64/hdf5-1.14.6-nompi_h19486de_106.conda + sha256: 1fc50ce3b86710fba3ec9c5714f1612b5ffa4230d70bfe43e2a1436eacba1621 + md5: c223ee1429ba538f3e48cfb4a0b97357 + depends: + - __glibc >=2.17,<3.0.a0 + - libaec >=1.1.5,<2.0a0 + - libcurl >=8.18.0,<9.0a0 + - libgcc >=14 + - libgfortran + - libgfortran5 >=14.3.0 + - libstdcxx >=14 + - libzlib >=1.3.1,<2.0a0 + - openssl >=3.5.5,<4.0a0 + license: BSD-3-Clause + license_family: BSD + purls: [] + size: 3708864 + timestamp: 1770390337946 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/hdf5-1.14.6-mpi_mpich_h15328f7_6.conda sha256: a041fb1de4c6e44d173c98385fb89e48254e03f22c4eeb5ffe7bffa53c05adc6 md5: be4b2eb273cb805267537126012cdd17 @@ -17115,6 +17540,21 @@ packages: - pkg:pypi/kiwisolver?source=hash-mapping size: 77682 timestamp: 1762488738724 +- conda: https://conda.anaconda.org/conda-forge/linux-64/kiwisolver-1.5.0-py312h0a2e395_0.conda + sha256: 
eec7654c2d68f06590862c6e845cc70987b6d6559222b6f0e619dea4268f5dd5 + md5: cd74a9525dc74bbbf93cf8aa2fa9eb5b + depends: + - python + - libstdcxx >=14 + - libgcc >=14 + - __glibc >=2.17,<3.0.a0 + - python_abi 3.12.* *_cp312 + license: BSD-3-Clause + license_family: BSD + purls: + - pkg:pypi/kiwisolver?source=compressed-mapping + size: 77120 + timestamp: 1773067050308 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/kiwisolver-1.4.9-py312hd8c8125_2.conda sha256: 8d68f6ec4d947902034fe9ed9d4a4c1180b5767bd9731af940f5a0e436bc3dfd md5: ddf4775023a2466ee308792ed80ca408 @@ -17568,6 +18008,18 @@ packages: purls: [] size: 264243 timestamp: 1745264221534 +- conda: https://conda.anaconda.org/conda-forge/linux-64/lerc-4.1.0-hdb68285_0.conda + sha256: f84cb54782f7e9cea95e810ea8fef186e0652d0fa73d3009914fa2c1262594e1 + md5: a752488c68f2e7c456bcbd8f16eec275 + depends: + - __glibc >=2.17,<3.0.a0 + - libgcc >=14 + - libstdcxx >=14 + license: Apache-2.0 + license_family: Apache + purls: [] + size: 261513 + timestamp: 1773113328888 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/lerc-4.0.0-hd64df32_1.conda sha256: 12361697f8ffc9968907d1a7b5830e34c670e4a59b638117a2cdfed8f63a38f8 md5: a74332d9b60b62905e3d30709df08bf1 @@ -18384,6 +18836,23 @@ packages: purls: [] size: 463621 timestamp: 1770892808818 +- conda: https://conda.anaconda.org/conda-forge/linux-64/libcurl-8.19.0-hcf29cc6_0.conda + sha256: a0390fd0536ebcd2244e243f5f00ab8e76ab62ed9aa214cd54470fe7496620f4 + md5: d50608c443a30c341c24277d28290f76 + depends: + - __glibc >=2.17,<3.0.a0 + - krb5 >=1.22.2,<1.23.0a0 + - libgcc >=14 + - libnghttp2 >=1.67.0,<2.0a0 + - libssh2 >=1.11.1,<2.0a0 + - libzlib >=1.3.1,<2.0a0 + - openssl >=3.5.5,<4.0a0 + - zstd >=1.5.7,<1.6.0a0 + license: curl + license_family: MIT + purls: [] + size: 466704 + timestamp: 1773218522665 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libcurl-8.18.0-hd5a2499_1.conda sha256: dbc34552fc6f040bbcd52b4246ec068ce8d82be0e76bfe45c6984097758d37c2 md5: 
2742a933ef07e91f38e3d33ad6fe937c @@ -19661,6 +20130,31 @@ packages: purls: [] size: 117463 timestamp: 1768753005332 +- conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.10.0-nompi_hbf2fc22_100.conda + sha256: f38b00b29c9495b71c12465397c735224ebaef71ad01278c3b9cb69dac685b65 + md5: 0eb36a09dad274e750d60b49aaec0af7 + depends: + - __glibc >=2.17,<3.0.a0 + - attr >=2.5.2,<2.6.0a0 + - blosc >=1.21.6,<2.0a0 + - bzip2 >=1.0.8,<2.0a0 + - hdf4 >=4.2.15,<4.2.16.0a0 + - hdf5 >=1.14.6,<1.14.7.0a0 + - libaec >=1.1.5,<2.0a0 + - libcurl >=8.18.0,<9.0a0 + - libgcc >=14 + - libstdcxx >=14 + - libxml2 + - libxml2-16 >=2.14.6 + - libzip >=1.11.2,<2.0a0 + - libzlib >=1.3.1,<2.0a0 + - openssl >=3.5.5,<4.0a0 + - zstd >=1.5.7,<1.6.0a0 + license: MIT + license_family: MIT + purls: [] + size: 862222 + timestamp: 1772190364667 - conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.9.3-nompi_h11f7409_103.conda sha256: e9a8668212719a91a6b0348db05188dfc59de5a21888db13ff8510918a67b258 md5: 3ccff1066c05a1e6c221356eecc40581 @@ -23243,6 +23737,32 @@ packages: - pkg:pypi/netcdf4?source=hash-mapping size: 1109790 timestamp: 1760540565753 +- conda: https://conda.anaconda.org/conda-forge/linux-64/netcdf4-1.7.4-nompi_py311ha0596eb_105.conda + noarch: python + sha256: ec2dc3171649378d1602bfd361f00a158f800d32e4092749740c4c6d288746b1 + md5: 71b833f92f41ab92b16ca9f87e8735fe + depends: + - python + - certifi + - cftime + - numpy + - packaging + - hdf5 + - libnetcdf + - libgcc >=14 + - __glibc >=2.17,<3.0.a0 + - hdf5 >=1.14.6,<1.14.7.0a0 + - libnetcdf >=4.10.0,<4.10.1.0a0 + - libzlib >=1.3.1,<2.0a0 + - numpy >=1.23,<3 + - _python_abi3_support 1.* + - cpython >=3.11 + license: MIT + license_family: MIT + purls: + - pkg:pypi/netcdf4?source=hash-mapping + size: 1094040 + timestamp: 1772794140711 - conda: https://conda.anaconda.org/conda-forge/linux-64/netcdf4-1.7.4-nompi_py312h25f8dc5_102.conda sha256: eecbf3489560510d2c7d8d73ae812b1d0d1241f667e250afdd3faad244fb3a52 md5: 
99217b58c029977345b72bb36a1f6596 @@ -24534,6 +25054,18 @@ packages: - pkg:pypi/platformdirs?source=compressed-mapping size: 25643 timestamp: 1771233827084 +- conda: https://conda.anaconda.org/conda-forge/noarch/platformdirs-4.9.4-pyhcf101f3_0.conda + sha256: 0289f0a38337ee201d984f8f31f11f6ef076cfbbfd0ab9181d12d9d1d099bf46 + md5: 82c1787f2a65c0155ef9652466ee98d6 + depends: + - python >=3.10 + - python + license: MIT + license_family: MIT + purls: + - pkg:pypi/platformdirs?source=compressed-mapping + size: 25646 + timestamp: 1773199142345 - conda: https://conda.anaconda.org/conda-forge/noarch/pluggy-1.6.0-pyhf9edf01_1.conda sha256: e14aafa63efa0528ca99ba568eaf506eb55a0371d12e6250aaaa61718d2eb62e md5: d7585b6550ad04c8c5e21097ada2888e diff --git a/pixi.toml b/pixi.toml index ddd20fbf..4b52deb9 100644 --- a/pixi.toml +++ b/pixi.toml @@ -241,6 +241,18 @@ petsc-local-clean = { cmd = "./build-petsc.sh clean", cwd = "petsc-custom" } [feature.kaiju] platforms = ["linux-64"] +# ============================================ +# GADI CLUSTER FEATURE +# ============================================ +# For NCI Gadi HPC (CentOS, module OpenMPI + HDF5, PBS Pro) +# Pure Python only — base dependencies cover all pure-Python needs. +# mpi4py, h5py, petsc, petsc4py are built from source against +# Gadi's module OpenMPI and HDF5 using gadi_install_pixi.sh. 
+# See: install-scripts/uw3-hpc-install-scripts/gadi_install_pixi.sh + +[feature.gadi] +platforms = ["linux-64"] + # ============================================ # RUNTIME FEATURE (for tutorials/examples) # ============================================ @@ -328,3 +340,7 @@ amr-openmpi-dev = { features = ["amr-openmpi", "runtime", "dev"], solve-group = # --- Kaiju Cluster Track (linux-64 only) --- # Pure Python from pixi; MPI/PETSc/h5py built from source against spack OpenMPI kaiju = { features = ["kaiju"], solve-group = "kaiju" } + +# --- Gadi Cluster Track (linux-64 only) --- +# Pure Python from pixi; MPI/PETSc/h5py built from source against Gadi modules +gadi = { features = ["gadi"], solve-group = "gadi" } From 23791c0229f6457b802e8aef1bfbb3223f8bae05 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 09:19:00 +1100 Subject: [PATCH 12/26] Added build-petsc-gadi script --- petsc-custom/build-petsc-gadi.sh | 220 +++++++++++++++++++++++++++++++ 1 file changed, 220 insertions(+) create mode 100644 petsc-custom/build-petsc-gadi.sh diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh new file mode 100644 index 00000000..b1bfbc70 --- /dev/null +++ b/petsc-custom/build-petsc-gadi.sh @@ -0,0 +1,220 @@ +#!/bin/bash +# +# Build PETSc with AMR tools for NCI Gadi (module OpenMPI + HDF5, PBS Pro) +# +# Differences from build-petsc.sh (local macOS/pixi): +# MPI auto-detected from PATH (module load puts mpicc in PATH) +# --with-hdf5-dir=$HDF5_DIR → uses Gadi's system HDF5 module (not pixi) +# No --download-fblaslapack → Gadi has system BLAS/LAPACK +# No --download-cmake → cmake loaded from Gadi module +# --with-petsc4py=1 → built during configure (not a separate step) +# +# This script builds the same AMR tool set as build-petsc.sh and build-petsc-kaiju.sh: +# pragmatic, mmg, parmmg, slepc, mumps, metis, parmetis, ptscotch, scalapack +# +# Applies the same UW3 patches as build-petsc.sh: +# 
plexfem-internal-boundary-ownership-fix.patch +# scotch-7.0.10-c23-fix.tar.gz +# +# Usage (must be inside pixi gadi env with Gadi modules loaded): +# module load openmpi/4.1.7 hdf5/1.12.2p cmake/3.31.6 +# source gadi_install_pixi.sh (activates pixi gadi env) +# ./build-petsc-gadi.sh # Full build +# ./build-petsc-gadi.sh configure # Just reconfigure +# ./build-petsc-gadi.sh build # Just make +# ./build-petsc-gadi.sh patch # Apply UW3 patches +# ./build-petsc-gadi.sh test # Run PETSc tests +# ./build-petsc-gadi.sh clean # Remove PETSc directory +# +# Build time: ~1 hour +# +set -e + +SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" +PETSC_DIR="${SCRIPT_DIR}/petsc" +PETSC_ARCH="arch-linux-c-opt" + +# Require Gadi OpenMPI to be loaded +if ! command -v mpicc &>/dev/null; then + echo "Error: mpicc not found. Load Gadi OpenMPI module first:" + echo " module load openmpi/4.1.7" + exit 1 +fi + +# Require HDF5_DIR to be set (from Gadi hdf5 module) +if [ -z "${HDF5_DIR}" ]; then + echo "Error: HDF5_DIR is not set. Load Gadi HDF5 module first:" + echo " module load hdf5/1.12.2p" + exit 1 +fi + +# Require pixi gadi environment +if ! echo "${PATH}" | tr ':' '\n' | grep -q "\.pixi/envs/gadi/bin"; then + echo "Error: must be run inside the pixi gadi environment" + echo " source gadi_install_pixi.sh (sets up env via pixi shell-hook)" + exit 1 +fi + +echo "==========================================" +echo "PETSc AMR Build Script (Gadi)" +echo "==========================================" +echo "PETSC_DIR: $PETSC_DIR" +echo "PETSC_ARCH: $PETSC_ARCH" +echo "mpicc: $(which mpicc)" +echo "HDF5_DIR: $HDF5_DIR" +echo "==========================================" + +clone_petsc() { + if [ -d "$PETSC_DIR" ]; then + echo "PETSc directory already exists. Skipping clone." + echo "To force fresh clone, run: ./build-petsc-gadi.sh clean" + return 0 + fi + + echo "Cloning PETSc release branch..." 
+ git clone -b release https://gitlab.com/petsc/petsc.git "$PETSC_DIR" + echo "Clone complete." +} + +apply_patches() { + echo "Applying UW3 patches to PETSc..." + cd "$PETSC_DIR" + + # Fix ghost facet ownership + part-consistent assembly in boundary + # residual/integral/Jacobian paths (plexfem.c). Without this, internal + # boundary natural BCs produce rank-dependent results in parallel. + local patch="${SCRIPT_DIR}/patches/plexfem-internal-boundary-ownership-fix.patch" + if [ -f "$patch" ]; then + if git apply --check "$patch" 2>/dev/null; then + git apply "$patch" + echo " Applied: plexfem-internal-boundary-ownership-fix.patch" + else + echo " Skipped: plexfem-internal-boundary-ownership-fix.patch (already applied or conflict)" + fi + fi + + echo "Patches complete." +} + +configure_petsc() { + echo "Configuring PETSc with AMR tools..." + cd "$PETSC_DIR" + + # Downloads and builds: + # AMR: mmg, parmmg, pragmatic, eigen + # Solvers: mumps, scalapack, slepc, superlu, superlu_dist, hypre + # Partitions: metis, parmetis, ptscotch (patched for C23) + # Mesh: ctetgen, triangle, zlib + # HDF5: from Gadi module (not downloaded) + # cmake: from Gadi module (not downloaded) + # BLAS/LAPACK: from Gadi system (not downloaded) + # MPI: from Gadi module (not downloaded) + # petsc4py: built during configure + python3 ./configure \ + --with-petsc-arch="${PETSC_ARCH}" \ + --with-debugging=0 \ + --COPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" \ + --with-shared-libraries=1 \ + --with-cxx-dialect=C++11 \ + --with-make-np=40 \ + --with-hdf5-dir="${HDF5_DIR}" \ + --with-hdf5=1 \ + --with-pragmatic=1 \ + --with-petsc4py=1 \ + --with-x=0 \ + --download-zlib=1 \ + --download-eigen=1 \ + --download-metis=1 \ + --download-parmetis=1 \ + --download-mumps=1 \ + --download-scalapack=1 \ + --download-slepc=1 \ + --download-ptscotch="${SCRIPT_DIR}/patches/scotch-7.0.10-c23-fix.tar.gz" \ + --download-mmg=1 \ + --download-mmg-cmake-arguments="-DMMG_INSTALL_PRIVATE_HEADERS=ON 
-DUSE_SCOTCH=OFF" \ + --download-parmmg=1 \ + --download-pragmatic=1 \ + --download-superlu=1 \ + --download-superlu_dist=1 \ + --download-hypre=1 \ + --download-ctetgen=1 \ + --download-triangle=1 \ + --useThreads=0 + + echo "Configure complete." +} + +build_petsc() { + echo "Building PETSc..." + cd "$PETSC_DIR" + + export PETSC_DIR + export PETSC_ARCH + + make all + echo "PETSc build complete." +} + +test_petsc() { + echo "Testing PETSc..." + cd "$PETSC_DIR" + + export PETSC_DIR + export PETSC_ARCH + + make check + echo "PETSc tests complete." +} + +clean_petsc() { + echo "Removing PETSc directory..." + if [ -d "$PETSC_DIR" ]; then + rm -rf "$PETSC_DIR" + echo "Cleaned." + else + echo "Nothing to clean." + fi +} + +show_help() { + echo "Usage: $0 [command]" + echo "" + echo "Commands:" + echo " (none) Full build: clone, patch, configure, build" + echo " clone Clone PETSc repository" + echo " patch Apply UW3 patches to PETSc source" + echo " configure Configure PETSc with AMR tools" + echo " build Build PETSc" + echo " test Run PETSc tests" + echo " clean Remove PETSc directory" + echo " help Show this help" +} + +case "${1:-all}" in + all) + clone_petsc + apply_patches + configure_petsc + build_petsc + echo "" + echo "==========================================" + echo "PETSc AMR build complete! 
(Gadi)" + echo "Set these environment variables:" + echo " export PETSC_DIR=$PETSC_DIR" + echo " export PETSC_ARCH=$PETSC_ARCH" + echo " export PYTHONPATH=\$PETSC_DIR/\$PETSC_ARCH/lib:\$PYTHONPATH" + echo "==========================================" + ;; + clone) clone_petsc ;; + patch) apply_patches ;; + configure) configure_petsc ;; + build) build_petsc ;; + test) test_petsc ;; + clean) clean_petsc ;; + help|--help|-h) show_help ;; + *) + echo "Unknown command: $1" + show_help + exit 1 + ;; +esac From 330560d1672ee215e6f4e11c38ecf52daf994f32 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 09:29:58 +1100 Subject: [PATCH 13/26] Specified MPI_DIR --- petsc-custom/build-petsc-gadi.sh | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index b1bfbc70..5f1dbec7 100644 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -108,10 +108,12 @@ configure_petsc() { # HDF5: from Gadi module (not downloaded) # cmake: from Gadi module (not downloaded) # BLAS/LAPACK: from Gadi system (not downloaded) - # MPI: from Gadi module (not downloaded) + # MPI: from Gadi module — MPI_DIR derived from which mpicc # petsc4py: built during configure + MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")" python3 ./configure \ --with-petsc-arch="${PETSC_ARCH}" \ + --with-mpi-dir="${MPI_DIR}" \ --with-debugging=0 \ --COPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" \ --with-shared-libraries=1 \ From 566ecc9b7ed308d4ed8ec520a34992698302590d Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 10:10:48 +1100 Subject: [PATCH 14/26] Unset conda/pixi compiler variables that interfere with mpicc --- petsc-custom/build-petsc-gadi.sh | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index 5f1dbec7..ff87a83b 100644 --- a/petsc-custom/build-petsc-gadi.sh +++ 
b/petsc-custom/build-petsc-gadi.sh @@ -111,6 +111,11 @@ configure_petsc() { # MPI: from Gadi module — MPI_DIR derived from which mpicc # petsc4py: built during configure MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")" + export LD_LIBRARY_PATH="${MPI_DIR}/lib:${LD_LIBRARY_PATH}" + + # Unset conda/pixi compiler variables that interfere with mpicc + unset CC CXX FC F77 AR CFLAGS CXXFLAGS FFLAGS LDFLAGS + python3 ./configure \ --with-petsc-arch="${PETSC_ARCH}" \ --with-mpi-dir="${MPI_DIR}" \ From 98d84f64b604e29283a4bc147fde5e594bbb3047 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 10:24:49 +1100 Subject: [PATCH 15/26] Removed explicit setting of MPI_DIR --- petsc-custom/build-petsc-gadi.sh | 26 ++++++++++++++++++++++---- 1 file changed, 22 insertions(+), 4 deletions(-) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index ff87a83b..977e99c2 100644 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -110,15 +110,33 @@ configure_petsc() { # BLAS/LAPACK: from Gadi system (not downloaded) # MPI: from Gadi module — MPI_DIR derived from which mpicc # petsc4py: built during configure - MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")" + # MPI_DIR is set by load_env() via module load — trust it directly. + # Do NOT recompute from 'which mpicc': conda's PATH may resolve to the + # pixi env's bin instead of the Gadi OpenMPI bin. + if [ -z "${MPI_DIR}" ]; then + echo "Error: MPI_DIR is not set. Source gadi_install_pixi.sh first." + exit 1 + fi export LD_LIBRARY_PATH="${MPI_DIR}/lib:${LD_LIBRARY_PATH}" - # Unset conda/pixi compiler variables that interfere with mpicc - unset CC CXX FC F77 AR CFLAGS CXXFLAGS FFLAGS LDFLAGS + # Unset ALL conda/pixi compiler and build variables. + # The pixi gadi env ships a full conda toolchain (x86_64-conda-linux-gnu-*) + # that interferes with OpenMPI wrappers and PETSc configure. 
+ unset CC CXX FC F77 F90 CPP AR RANLIB + unset CFLAGS CXXFLAGS FFLAGS CPPFLAGS LDFLAGS + + # Force mpicc/mpicxx/mpifort to use the system compilers, not conda's gcc. + # Conda's gcc uses conda's bundled linker which cannot find Gadi-specific + # libs that OpenMPI depends on (libucc from UCX, libnl_3 from netlink). + export OMPI_CC=/usr/bin/gcc + export OMPI_CXX=/usr/bin/g++ + export OMPI_FC=/usr/bin/gfortran python3 ./configure \ --with-petsc-arch="${PETSC_ARCH}" \ - --with-mpi-dir="${MPI_DIR}" \ + --with-cc="${MPI_DIR}/bin/mpicc" \ + --with-cxx="${MPI_DIR}/bin/mpicxx" \ + --with-fc="${MPI_DIR}/bin/mpifort" \ --with-debugging=0 \ --COPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" \ --with-shared-libraries=1 \ From 9637ed9cb5a145ac120e5ca176f7ff95fd151114 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 10:34:58 +1100 Subject: [PATCH 16/26] Added ignoreLinkOutput --- petsc-custom/build-petsc-gadi.sh | 1 + 1 file changed, 1 insertion(+) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index 977e99c2..ed240874 100644 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -137,6 +137,7 @@ configure_petsc() { --with-cc="${MPI_DIR}/bin/mpicc" \ --with-cxx="${MPI_DIR}/bin/mpicxx" \ --with-fc="${MPI_DIR}/bin/mpifort" \ + --ignoreLinkOutput=1 \ --with-debugging=0 \ --COPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" \ --with-shared-libraries=1 \ From 6952d4242b003f4236ccca0e0a0f93341539bab5 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 10:46:58 +1100 Subject: [PATCH 17/26] Added missing libraries: libucc and libnl_3 --- petsc-custom/build-petsc-gadi.sh | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index ed240874..ae775cd0 100644 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -117,7 +117,7 @@ configure_petsc() { 
echo "Error: MPI_DIR is not set. Source gadi_install_pixi.sh first." exit 1 fi - export LD_LIBRARY_PATH="${MPI_DIR}/lib:${LD_LIBRARY_PATH}" + export LD_LIBRARY_PATH="${MPI_DIR}/lib:/apps/ucc/1.3.0/lib:/usr/lib64:${LD_LIBRARY_PATH}" # Unset ALL conda/pixi compiler and build variables. # The pixi gadi env ships a full conda toolchain (x86_64-conda-linux-gnu-*) @@ -137,7 +137,6 @@ configure_petsc() { --with-cc="${MPI_DIR}/bin/mpicc" \ --with-cxx="${MPI_DIR}/bin/mpicxx" \ --with-fc="${MPI_DIR}/bin/mpifort" \ - --ignoreLinkOutput=1 \ --with-debugging=0 \ --COPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" \ --with-shared-libraries=1 \ From 4e09ec6f3d9866561d744239a4af1cf34879d3d4 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 10:52:42 +1100 Subject: [PATCH 18/26] Reordering PATH --- petsc-custom/build-petsc-gadi.sh | 11 ++++++++--- 1 file changed, 8 insertions(+), 3 deletions(-) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index ae775cd0..43e75342 100644 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -126,13 +126,18 @@ configure_petsc() { unset CFLAGS CXXFLAGS FFLAGS CPPFLAGS LDFLAGS # Force mpicc/mpicxx/mpifort to use the system compilers, not conda's gcc. - # Conda's gcc uses conda's bundled linker which cannot find Gadi-specific - # libs that OpenMPI depends on (libucc from UCX, libnl_3 from netlink). export OMPI_CC=/usr/bin/gcc export OMPI_CXX=/usr/bin/g++ export OMPI_FC=/usr/bin/gfortran - python3 ./configure \ + # Capture pixi's python3 BEFORE reordering PATH. + # Then put system bin dirs first so the system linker (/usr/bin/ld) is + # found before conda's ld — conda's ld cannot find Gadi-specific libs + # (hcoll, ucc, libnl) that OpenMPI was built against. 
+ _PIXI_PYTHON="$(which python3)" + export PATH="/usr/bin:/usr/local/bin:${MPI_DIR}/bin:${PATH}" + + "${_PIXI_PYTHON}" ./configure \ --with-petsc-arch="${PETSC_ARCH}" \ --with-cc="${MPI_DIR}/bin/mpicc" \ --with-cxx="${MPI_DIR}/bin/mpicxx" \ From edacae7eea8529bed09e4ba473180b2b62511ada Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 11:17:43 +1100 Subject: [PATCH 19/26] Added symlink and setting LD_LIBRARY_PATH before configure runs --- petsc-custom/build-petsc-gadi.sh | 12 +++++++++++- 1 file changed, 11 insertions(+), 1 deletion(-) mode change 100644 => 100755 petsc-custom/build-petsc-gadi.sh diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh old mode 100644 new mode 100755 index 43e75342..29d75994 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -117,7 +117,17 @@ configure_petsc() { echo "Error: MPI_DIR is not set. Source gadi_install_pixi.sh first." exit 1 fi - export LD_LIBRARY_PATH="${MPI_DIR}/lib:/apps/ucc/1.3.0/lib:/usr/lib64:${LD_LIBRARY_PATH}" + # Create symlinks for Gadi's compiler-tagged Fortran MPI libs. + # mpifort --showme refers to libmpi_usempif08 etc. (no compiler tag), + # but Gadi only ships _GNU, _Intel, _nvidia variants. Symlink GNU → untagged. + local _mpi_gnu_dir="${SCRIPT_DIR}/mpi-gadi-gnu-libs" + mkdir -p "${_mpi_gnu_dir}" + for _lib in usempif08 usempi_ignore_tkr mpifh; do + [ ! -f "${_mpi_gnu_dir}/libmpi_${_lib}.so" ] && \ + ln -sf "${MPI_DIR}/lib/libmpi_${_lib}_GNU.so" "${_mpi_gnu_dir}/libmpi_${_lib}.so" + done + + export LD_LIBRARY_PATH="${_mpi_gnu_dir}:${MPI_DIR}/lib:/apps/ucc/1.3.0/lib:/usr/lib64:${LD_LIBRARY_PATH}" # Unset ALL conda/pixi compiler and build variables. 
# The pixi gadi env ships a full conda toolchain (x86_64-conda-linux-gnu-*) From 201acdb9a0cfb114dcb6006941c7c9f895c8d5ac Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 11:21:51 +1100 Subject: [PATCH 20/26] fix linking --- petsc-custom/build-petsc-gadi.sh | 3 +++ 1 file changed, 3 insertions(+) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index 29d75994..fbfb34b6 100755 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -127,7 +127,10 @@ configure_petsc() { ln -sf "${MPI_DIR}/lib/libmpi_${_lib}_GNU.so" "${_mpi_gnu_dir}/libmpi_${_lib}.so" done + # LD_LIBRARY_PATH = runtime search path (dynamic loader) + # LIBRARY_PATH = link-time search path (ld resolves -lmpi_usempif08 etc.) export LD_LIBRARY_PATH="${_mpi_gnu_dir}:${MPI_DIR}/lib:/apps/ucc/1.3.0/lib:/usr/lib64:${LD_LIBRARY_PATH}" + export LIBRARY_PATH="${_mpi_gnu_dir}:${LIBRARY_PATH}" # Unset ALL conda/pixi compiler and build variables. # The pixi gadi env ships a full conda toolchain (x86_64-conda-linux-gnu-*) From c82aa588a922d09451b35bbd2603b5949b618916 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 11:29:51 +1100 Subject: [PATCH 21/26] Added OMPI_FCFLAGS --- petsc-custom/build-petsc-gadi.sh | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index fbfb34b6..de3c8021 100755 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -142,6 +142,10 @@ configure_petsc() { export OMPI_CC=/usr/bin/gcc export OMPI_CXX=/usr/bin/g++ export OMPI_FC=/usr/bin/gfortran + # Gadi's OpenMPI puts Fortran headers in a compiler-tagged subdirectory + # (include/GNU/) rather than include/ directly. OMPI_FCFLAGS adds extra + # flags to the mpifort wrapper so mpif.h and .mod files are found. + export OMPI_FCFLAGS="-I${MPI_DIR}/include/GNU" # Capture pixi's python3 BEFORE reordering PATH. 
# Then put system bin dirs first so the system linker (/usr/bin/ld) is From 1d888c36a4c06364114f887ce413ce35f2ef0f93 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 14:03:24 +1100 Subject: [PATCH 22/26] Added symlink to scratch for petsc; download fblaslapack --- petsc-custom/build-petsc-gadi.sh | 17 +++++++++++++++-- 1 file changed, 15 insertions(+), 2 deletions(-) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index de3c8021..fd619f00 100755 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -65,14 +65,24 @@ echo "HDF5_DIR: $HDF5_DIR" echo "==========================================" clone_petsc() { - if [ -d "$PETSC_DIR" ]; then + # Resolve symlink so git clone always writes to the real path. + # git clone can replace a symlink-to-empty-dir with a real directory, + # which defeats the gdata→scratch symlink approach. + local _clone_target + if [ -L "$PETSC_DIR" ]; then + _clone_target="$(readlink -f "$PETSC_DIR")" + else + _clone_target="$PETSC_DIR" + fi + + if [ -f "${_clone_target}/configure" ]; then echo "PETSc directory already exists. Skipping clone." echo "To force fresh clone, run: ./build-petsc-gadi.sh clean" return 0 fi echo "Cloning PETSc release branch..." - git clone -b release https://gitlab.com/petsc/petsc.git "$PETSC_DIR" + git clone -b release https://gitlab.com/petsc/petsc.git "${_clone_target}" echo "Clone complete." 
} @@ -105,6 +115,8 @@ configure_petsc() { # Solvers: mumps, scalapack, slepc, superlu, superlu_dist, hypre # Partitions: metis, parmetis, ptscotch (patched for C23) # Mesh: ctetgen, triangle, zlib + # BLAS/LAPACK: fblaslapack (Gadi has system BLAS/LAPACK but auto-detection + # fails due to PATH/env manipulation required for OpenMPI) # HDF5: from Gadi module (not downloaded) # cmake: from Gadi module (not downloaded) # BLAS/LAPACK: from Gadi system (not downloaded) @@ -169,6 +181,7 @@ configure_petsc() { --with-pragmatic=1 \ --with-petsc4py=1 \ --with-x=0 \ + --download-fblaslapack=1 \ --download-zlib=1 \ --download-eigen=1 \ --download-metis=1 \ From e81feff3dcd1b973916793aa714b965d92f4967f Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Wed, 18 Mar 2026 15:52:53 +1100 Subject: [PATCH 23/26] Updated env vars for build --- petsc-custom/build-petsc-gadi.sh | 54 ++++++++++++++++++-------------- 1 file changed, 31 insertions(+), 23 deletions(-) diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh index fd619f00..69f01784 100755 --- a/petsc-custom/build-petsc-gadi.sh +++ b/petsc-custom/build-petsc-gadi.sh @@ -106,29 +106,16 @@ apply_patches() { echo "Patches complete." } -configure_petsc() { - echo "Configuring PETSc with AMR tools..." 
- cd "$PETSC_DIR" - - # Downloads and builds: - # AMR: mmg, parmmg, pragmatic, eigen - # Solvers: mumps, scalapack, slepc, superlu, superlu_dist, hypre - # Partitions: metis, parmetis, ptscotch (patched for C23) - # Mesh: ctetgen, triangle, zlib - # BLAS/LAPACK: fblaslapack (Gadi has system BLAS/LAPACK but auto-detection - # fails due to PATH/env manipulation required for OpenMPI) - # HDF5: from Gadi module (not downloaded) - # cmake: from Gadi module (not downloaded) - # BLAS/LAPACK: from Gadi system (not downloaded) - # MPI: from Gadi module — MPI_DIR derived from which mpicc - # petsc4py: built during configure +setup_gadi_build_env() { + # Shared environment setup required for both configure and build. + # Must be called before any compile/link step. + # # MPI_DIR is set by load_env() via module load — trust it directly. - # Do NOT recompute from 'which mpicc': conda's PATH may resolve to the - # pixi env's bin instead of the Gadi OpenMPI bin. if [ -z "${MPI_DIR}" ]; then echo "Error: MPI_DIR is not set. Source gadi_install_pixi.sh first." exit 1 fi + # Create symlinks for Gadi's compiler-tagged Fortran MPI libs. # mpifort --showme refers to libmpi_usempif08 etc. (no compiler tag), # but Gadi only ships _GNU, _Intel, _nvidia variants. Symlink GNU → untagged. @@ -146,7 +133,7 @@ configure_petsc() { # Unset ALL conda/pixi compiler and build variables. # The pixi gadi env ships a full conda toolchain (x86_64-conda-linux-gnu-*) - # that interferes with OpenMPI wrappers and PETSc configure. + # that interferes with OpenMPI wrappers and PETSc configure/build. unset CC CXX FC F77 F90 CPP AR RANLIB unset CFLAGS CXXFLAGS FFLAGS CPPFLAGS LDFLAGS @@ -156,15 +143,34 @@ configure_petsc() { export OMPI_FC=/usr/bin/gfortran # Gadi's OpenMPI puts Fortran headers in a compiler-tagged subdirectory # (include/GNU/) rather than include/ directly. OMPI_FCFLAGS adds extra - # flags to the mpifort wrapper so mpif.h and .mod files are found. 
+    # flags to the mpifort wrapper so mpif.h and mpi.mod are found.
     export OMPI_FCFLAGS="-I${MPI_DIR}/include/GNU"
 
-    # Capture pixi's python3 BEFORE reordering PATH.
-    # Then put system bin dirs first so the system linker (/usr/bin/ld) is
+    # Put system bin dirs first so the system linker (/usr/bin/ld) is
     # found before conda's ld — conda's ld cannot find Gadi-specific libs
     # (hcoll, ucc, libnl) that OpenMPI was built against.
-    _PIXI_PYTHON="$(which python3)"
     export PATH="/usr/bin:/usr/local/bin:${MPI_DIR}/bin:${PATH}"
+}
+
+configure_petsc() {
+    echo "Configuring PETSc with AMR tools..."
+    cd "$PETSC_DIR"
+
+    # Downloads and builds:
+    # AMR: mmg, parmmg, pragmatic, eigen
+    # Solvers: mumps, scalapack, slepc, superlu, superlu_dist, hypre
+    # Partitions: metis, parmetis, ptscotch (patched for C23)
+    # Mesh: ctetgen, triangle, zlib
+    # BLAS/LAPACK: fblaslapack (Gadi has system BLAS/LAPACK but auto-detection
+    # fails due to PATH/env manipulation required for OpenMPI)
+    # HDF5: from Gadi module (not downloaded)
+    # cmake: from Gadi module (not downloaded)
+    # MPI: from Gadi module (MPI_DIR set by load_env, not derived from which mpicc)
+    # petsc4py: built during configure
+
+    # Capture pixi's python3 BEFORE setup_gadi_build_env reorders PATH.
+    _PIXI_PYTHON="$(which python3)"
+    setup_gadi_build_env
 
     "${_PIXI_PYTHON}" ./configure \
         --with-petsc-arch="${PETSC_ARCH}" \
@@ -210,6 +216,7 @@ build_petsc() {
 
     export PETSC_DIR
     export PETSC_ARCH
+    setup_gadi_build_env
 
     make all
     echo "PETSc build complete."
@@ -221,6 +228,7 @@ test_petsc() {
 
     export PETSC_DIR
     export PETSC_ARCH
+    setup_gadi_build_env
 
     make check
     echo "PETSc tests complete."
From f11e3754ec5a66b189211d3d7a83dd4245611db8 Mon Sep 17 00:00:00 2001
From: Juan Carlos Graciosa
Date: Wed, 18 Mar 2026 23:28:36 +1100
Subject: [PATCH 24/26] Added patchelf to reorder h5py RPATH after source build

---
 pixi.toml | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/pixi.toml b/pixi.toml
index 4b52deb9..8c1b9efe 100644
--- a/pixi.toml
+++ b/pixi.toml
@@ -253,6 +253,12 @@ platforms = ["linux-64"]
 [feature.gadi]
 platforms = ["linux-64"]
 
+[feature.gadi.dependencies]
+# patchelf is needed to fix h5py RPATH order after source build:
+# meson embeds the conda env lib dir before Gadi's HDF5 in RPATH,
+# so we use patchelf post-install to move HDF5_DIR to the front.
+patchelf = "*"
+
 # ============================================
 # RUNTIME FEATURE (for tutorials/examples)
 # ============================================

From f73469e1844ca14a94f9fc8fb645a32c575890f6 Mon Sep 17 00:00:00 2001
From: Juan Carlos Graciosa
Date: Mon, 23 Mar 2026 15:55:07 +1100
Subject: [PATCH 25/26] Changes according to PR feedback

---
 docs/developer/guides/kaiju-cluster-setup.md |  12 +-
 petsc-custom/build-petsc-gadi.sh             | 288 -------------
 petsc-custom/build-petsc-kaiju.sh            | 181 --------
 petsc-custom/build-petsc.sh                  | 426 +++++++++++++------
 pixi.lock                                    | 376 ++++++++++++++--
 pixi.toml                                    |  37 +-
 6 files changed, 653 insertions(+), 667 deletions(-)
 delete mode 100755 petsc-custom/build-petsc-gadi.sh
 delete mode 100644 petsc-custom/build-petsc-kaiju.sh

diff --git a/docs/developer/guides/kaiju-cluster-setup.md b/docs/developer/guides/kaiju-cluster-setup.md
index 760f7435..53ef20a0 100644
--- a/docs/developer/guides/kaiju-cluster-setup.md
+++ b/docs/developer/guides/kaiju-cluster-setup.md
@@ -24,9 +24,9 @@ Pixi manages the Python environment consistently with the developer's local mach
 The key constraint is that **anything linked against MPI must use the same MPI as Slurm**. This means `mpi4py`, `h5py`, PETSc, and `petsc4py` are built from source against Spack's OpenMPI — not from conda-forge (which bundles MPICH).
 
 ```
-pixi kaiju env → Python 3.12, sympy, scipy, pint, pydantic, ... (conda-forge, no MPI)
-spack          → openmpi@4.1.6 (cluster MPI)
-source build   → mpi4py, PETSc+AMR+petsc4py, h5py (linked to spack MPI)
+pixi hpc env   → Python 3.12, sympy, scipy, pint, pydantic, ... (conda-forge, no MPI)
+spack          → openmpi@4.1.6 (cluster MPI)
+source build   → mpi4py, PETSc+AMR+petsc4py, h5py (linked to spack MPI)
 ```
@@ -78,7 +78,7 @@ This runs the following steps in order:
 |------|----------|------|
 | Install pixi | `setup_pixi` | ~1 min |
 | Clone Underworld3 | `clone_uw3` | ~1 min |
-| Install pixi kaiju env | `install_pixi_env` | ~3 min |
+| Install pixi hpc env | `install_pixi_env` | ~3 min |
 | Build mpi4py from source | `install_mpi4py` | ~2 min |
 | Build PETSc + AMR tools | `install_petsc` | ~1 hour |
 | Build MPI-enabled h5py | `install_h5py` | ~2 min |
@@ -94,7 +94,7 @@ install_petsc   # run just one step
 
 ### What PETSc builds
 
-PETSc is compiled from source (`petsc-custom/build-petsc-kaiju.sh`) with:
+PETSc is compiled from source (`petsc-custom/build-petsc.sh` with `UW_CLUSTER=kaiju`) with:
 
 - **AMR tools**: mmg, parmmg, pragmatic, eigen, bison
 - **Solvers**: mumps, scalapack, slepc
@@ -117,7 +117,7 @@ source ~/install_scripts/uw3_install_kaiju_amr.sh
 This:
 1. Loads `spack openmpi@4.1.6`
-2. Activates the pixi `kaiju` environment via `pixi shell-hook`
+2. Activates the pixi `hpc` environment via `pixi shell-hook`
 3. Sets `PETSC_DIR`, `PETSC_ARCH`, and `PYTHONPATH` for petsc4py
 4. Sets `PMIX_MCA_psec=native` and `OMPI_MCA_btl_tcp_if_include=eno1`
diff --git a/petsc-custom/build-petsc-gadi.sh b/petsc-custom/build-petsc-gadi.sh
deleted file mode 100755
index 69f01784..00000000
--- a/petsc-custom/build-petsc-gadi.sh
+++ /dev/null
@@ -1,288 +0,0 @@
-#!/bin/bash
-#
-# Build PETSc with AMR tools for NCI Gadi (module OpenMPI + HDF5, PBS Pro)
-#
-# Differences from build-petsc.sh (local macOS/pixi):
-#   MPI auto-detected from PATH (module load puts mpicc in PATH)
-#   --with-hdf5-dir=$HDF5_DIR → uses Gadi's system HDF5 module (not pixi)
-#   No --download-fblaslapack → Gadi has system BLAS/LAPACK
-#   No --download-cmake       → cmake loaded from Gadi module
-#   --with-petsc4py=1         → built during configure (not a separate step)
-#
-# This script builds the same AMR tool set as build-petsc.sh and build-petsc-kaiju.sh:
-#   pragmatic, mmg, parmmg, slepc, mumps, metis, parmetis, ptscotch, scalapack
-#
-# Applies the same UW3 patches as build-petsc.sh:
-#   plexfem-internal-boundary-ownership-fix.patch
-#   scotch-7.0.10-c23-fix.tar.gz
-#
-# Usage (must be inside pixi gadi env with Gadi modules loaded):
-#   module load openmpi/4.1.7 hdf5/1.12.2p cmake/3.31.6
-#   source gadi_install_pixi.sh   (activates pixi gadi env)
-#   ./build-petsc-gadi.sh            # Full build
-#   ./build-petsc-gadi.sh configure  # Just reconfigure
-#   ./build-petsc-gadi.sh build      # Just make
-#   ./build-petsc-gadi.sh patch      # Apply UW3 patches
-#   ./build-petsc-gadi.sh test       # Run PETSc tests
-#   ./build-petsc-gadi.sh clean      # Remove PETSc directory
-#
-# Build time: ~1 hour
-#
-set -e
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-PETSC_DIR="${SCRIPT_DIR}/petsc"
-PETSC_ARCH="arch-linux-c-opt"
-
-# Require Gadi OpenMPI to be loaded
-if ! command -v mpicc &>/dev/null; then
-    echo "Error: mpicc not found. Load Gadi OpenMPI module first:"
-    echo "    module load openmpi/4.1.7"
-    exit 1
-fi
-
-# Require HDF5_DIR to be set (from Gadi hdf5 module)
-if [ -z "${HDF5_DIR}" ]; then
-    echo "Error: HDF5_DIR is not set. Load Gadi HDF5 module first:"
-    echo "    module load hdf5/1.12.2p"
-    exit 1
-fi
-
-# Require pixi gadi environment
-if ! echo "${PATH}" | tr ':' '\n' | grep -q "\.pixi/envs/gadi/bin"; then
-    echo "Error: must be run inside the pixi gadi environment"
-    echo "    source gadi_install_pixi.sh   (sets up env via pixi shell-hook)"
-    exit 1
-fi
-
-echo "=========================================="
-echo "PETSc AMR Build Script (Gadi)"
-echo "=========================================="
-echo "PETSC_DIR:  $PETSC_DIR"
-echo "PETSC_ARCH: $PETSC_ARCH"
-echo "mpicc:      $(which mpicc)"
-echo "HDF5_DIR:   $HDF5_DIR"
-echo "=========================================="
-
-clone_petsc() {
-    # Resolve symlink so git clone always writes to the real path.
-    # git clone can replace a symlink-to-empty-dir with a real directory,
-    # which defeats the gdata→scratch symlink approach.
-    local _clone_target
-    if [ -L "$PETSC_DIR" ]; then
-        _clone_target="$(readlink -f "$PETSC_DIR")"
-    else
-        _clone_target="$PETSC_DIR"
-    fi
-
-    if [ -f "${_clone_target}/configure" ]; then
-        echo "PETSc directory already exists. Skipping clone."
-        echo "To force fresh clone, run: ./build-petsc-gadi.sh clean"
-        return 0
-    fi
-
-    echo "Cloning PETSc release branch..."
-    git clone -b release https://gitlab.com/petsc/petsc.git "${_clone_target}"
-    echo "Clone complete."
-}
-
-apply_patches() {
-    echo "Applying UW3 patches to PETSc..."
-    cd "$PETSC_DIR"
-
-    # Fix ghost facet ownership + part-consistent assembly in boundary
-    # residual/integral/Jacobian paths (plexfem.c). Without this, internal
-    # boundary natural BCs produce rank-dependent results in parallel.
-    local patch="${SCRIPT_DIR}/patches/plexfem-internal-boundary-ownership-fix.patch"
-    if [ -f "$patch" ]; then
-        if git apply --check "$patch" 2>/dev/null; then
-            git apply "$patch"
-            echo "  Applied: plexfem-internal-boundary-ownership-fix.patch"
-        else
-            echo "  Skipped: plexfem-internal-boundary-ownership-fix.patch (already applied or conflict)"
-        fi
-    fi
-
-    echo "Patches complete."
-}
-
-setup_gadi_build_env() {
-    # Shared environment setup required for both configure and build.
-    # Must be called before any compile/link step.
-    #
-    # MPI_DIR is set by load_env() via module load — trust it directly.
-    if [ -z "${MPI_DIR}" ]; then
-        echo "Error: MPI_DIR is not set. Source gadi_install_pixi.sh first."
-        exit 1
-    fi
-
-    # Create symlinks for Gadi's compiler-tagged Fortran MPI libs.
-    # mpifort --showme refers to libmpi_usempif08 etc. (no compiler tag),
-    # but Gadi only ships _GNU, _Intel, _nvidia variants. Symlink GNU → untagged.
-    local _mpi_gnu_dir="${SCRIPT_DIR}/mpi-gadi-gnu-libs"
-    mkdir -p "${_mpi_gnu_dir}"
-    for _lib in usempif08 usempi_ignore_tkr mpifh; do
-        [ ! -f "${_mpi_gnu_dir}/libmpi_${_lib}.so" ] && \
-            ln -sf "${MPI_DIR}/lib/libmpi_${_lib}_GNU.so" "${_mpi_gnu_dir}/libmpi_${_lib}.so"
-    done
-
-    # LD_LIBRARY_PATH = runtime search path (dynamic loader)
-    # LIBRARY_PATH    = link-time search path (ld resolves -lmpi_usempif08 etc.)
-    export LD_LIBRARY_PATH="${_mpi_gnu_dir}:${MPI_DIR}/lib:/apps/ucc/1.3.0/lib:/usr/lib64:${LD_LIBRARY_PATH}"
-    export LIBRARY_PATH="${_mpi_gnu_dir}:${LIBRARY_PATH}"
-
-    # Unset ALL conda/pixi compiler and build variables.
-    # The pixi gadi env ships a full conda toolchain (x86_64-conda-linux-gnu-*)
-    # that interferes with OpenMPI wrappers and PETSc configure/build.
-    unset CC CXX FC F77 F90 CPP AR RANLIB
-    unset CFLAGS CXXFLAGS FFLAGS CPPFLAGS LDFLAGS
-
-    # Force mpicc/mpicxx/mpifort to use the system compilers, not conda's gcc.
-    export OMPI_CC=/usr/bin/gcc
-    export OMPI_CXX=/usr/bin/g++
-    export OMPI_FC=/usr/bin/gfortran
-    # Gadi's OpenMPI puts Fortran headers in a compiler-tagged subdirectory
-    # (include/GNU/) rather than include/ directly. OMPI_FCFLAGS adds extra
-    # flags to the mpifort wrapper so mpif.h and mpi.mod are found.
-    export OMPI_FCFLAGS="-I${MPI_DIR}/include/GNU"
-
-    # Put system bin dirs first so the system linker (/usr/bin/ld) is
-    # found before conda's ld — conda's ld cannot find Gadi-specific libs
-    # (hcoll, ucc, libnl) that OpenMPI was built against.
-    export PATH="/usr/bin:/usr/local/bin:${MPI_DIR}/bin:${PATH}"
-}
-
-configure_petsc() {
-    echo "Configuring PETSc with AMR tools..."
-    cd "$PETSC_DIR"
-
-    # Downloads and builds:
-    #   AMR:         mmg, parmmg, pragmatic, eigen
-    #   Solvers:     mumps, scalapack, slepc, superlu, superlu_dist, hypre
-    #   Partitions:  metis, parmetis, ptscotch (patched for C23)
-    #   Mesh:        ctetgen, triangle, zlib
-    #   BLAS/LAPACK: fblaslapack (Gadi has system BLAS/LAPACK but auto-detection
-    #                fails due to PATH/env manipulation required for OpenMPI)
-    #   HDF5:        from Gadi module (not downloaded)
-    #   cmake:       from Gadi module (not downloaded)
-    #   MPI:         from Gadi module — MPI_DIR derived from which mpicc
-    #   petsc4py:    built during configure
-
-    # Capture pixi's python3 BEFORE setup_gadi_build_env reorders PATH.
-    _PIXI_PYTHON="$(which python3)"
-    setup_gadi_build_env
-
-    "${_PIXI_PYTHON}" ./configure \
-        --with-petsc-arch="${PETSC_ARCH}" \
-        --with-cc="${MPI_DIR}/bin/mpicc" \
-        --with-cxx="${MPI_DIR}/bin/mpicxx" \
-        --with-fc="${MPI_DIR}/bin/mpifort" \
-        --with-debugging=0 \
-        --COPTFLAGS="-g -O3" --CXXOPTFLAGS="-g -O3" --FOPTFLAGS="-g -O3" \
-        --with-shared-libraries=1 \
-        --with-cxx-dialect=C++11 \
-        --with-make-np=40 \
-        --with-hdf5-dir="${HDF5_DIR}" \
-        --with-hdf5=1 \
-        --with-pragmatic=1 \
-        --with-petsc4py=1 \
-        --with-x=0 \
-        --download-fblaslapack=1 \
-        --download-zlib=1 \
-        --download-eigen=1 \
-        --download-metis=1 \
-        --download-parmetis=1 \
-        --download-mumps=1 \
-        --download-scalapack=1 \
-        --download-slepc=1 \
-        --download-ptscotch="${SCRIPT_DIR}/patches/scotch-7.0.10-c23-fix.tar.gz" \
-        --download-mmg=1 \
-        --download-mmg-cmake-arguments="-DMMG_INSTALL_PRIVATE_HEADERS=ON -DUSE_SCOTCH=OFF" \
-        --download-parmmg=1 \
-        --download-pragmatic=1 \
-        --download-superlu=1 \
-        --download-superlu_dist=1 \
-        --download-hypre=1 \
-        --download-ctetgen=1 \
-        --download-triangle=1 \
-        --useThreads=0
-
-    echo "Configure complete."
-}
-
-build_petsc() {
-    echo "Building PETSc..."
-    cd "$PETSC_DIR"
-
-    export PETSC_DIR
-    export PETSC_ARCH
-    setup_gadi_build_env
-
-    make all
-    echo "PETSc build complete."
-}
-
-test_petsc() {
-    echo "Testing PETSc..."
-    cd "$PETSC_DIR"
-
-    export PETSC_DIR
-    export PETSC_ARCH
-    setup_gadi_build_env
-
-    make check
-    echo "PETSc tests complete."
-}
-
-clean_petsc() {
-    echo "Removing PETSc directory..."
-    if [ -d "$PETSC_DIR" ]; then
-        rm -rf "$PETSC_DIR"
-        echo "Cleaned."
-    else
-        echo "Nothing to clean."
-    fi
-}
-
-show_help() {
-    echo "Usage: $0 [command]"
-    echo ""
-    echo "Commands:"
-    echo "  (none)      Full build: clone, patch, configure, build"
-    echo "  clone       Clone PETSc repository"
-    echo "  patch       Apply UW3 patches to PETSc source"
-    echo "  configure   Configure PETSc with AMR tools"
-    echo "  build       Build PETSc"
-    echo "  test        Run PETSc tests"
-    echo "  clean       Remove PETSc directory"
-    echo "  help        Show this help"
-}
-
-case "${1:-all}" in
-    all)
-        clone_petsc
-        apply_patches
-        configure_petsc
-        build_petsc
-        echo ""
-        echo "=========================================="
-        echo "PETSc AMR build complete! (Gadi)"
-        echo "Set these environment variables:"
-        echo "  export PETSC_DIR=$PETSC_DIR"
-        echo "  export PETSC_ARCH=$PETSC_ARCH"
-        echo "  export PYTHONPATH=\$PETSC_DIR/\$PETSC_ARCH/lib:\$PYTHONPATH"
-        echo "=========================================="
-        ;;
-    clone)     clone_petsc ;;
-    patch)     apply_patches ;;
-    configure) configure_petsc ;;
-    build)     build_petsc ;;
-    test)      test_petsc ;;
-    clean)     clean_petsc ;;
-    help|--help|-h) show_help ;;
-    *)
-        echo "Unknown command: $1"
-        show_help
-        exit 1
-        ;;
-esac
diff --git a/petsc-custom/build-petsc-kaiju.sh b/petsc-custom/build-petsc-kaiju.sh
deleted file mode 100644
index d46852ba..00000000
--- a/petsc-custom/build-petsc-kaiju.sh
+++ /dev/null
@@ -1,181 +0,0 @@
-#!/bin/bash
-#
-# Build PETSc with AMR tools for the Kaiju cluster (Rocky Linux 8, Spack OpenMPI)
-#
-# Differences from build-petsc.sh (local macOS/pixi):
-#   MPI auto-detected from PATH (spack load puts mpicc in PATH; no --with-mpi-dir needed)
-#   --download-hdf5        → PETSc downloads HDF5 (not provided by pixi)
-#   --download-fblaslapack → no guaranteed system BLAS on Rocky Linux 8
-#   --download-cmake       → spack does not have cmake
-#   --with-petsc4py        → built during configure (not a separate step)
-#
-# This script builds the same AMR tool set as build-petsc.sh:
-#   pragmatic, mmg, parmmg, slepc, mumps, metis, parmetis, ptscotch, scalapack
-#
-# Usage (must be inside a pixi kaiju shell with spack OpenMPI loaded):
-#   spack load openmpi@4.1.6
-#   pixi shell -e kaiju
-#   ./build-petsc-kaiju.sh            # Full build
-#   ./build-petsc-kaiju.sh configure  # Just reconfigure
-#   ./build-petsc-kaiju.sh build      # Just make
-#   ./build-petsc-kaiju.sh clean      # Remove PETSc directory
-#
-# Build time: ~1 hour
-#
-set -e
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-PETSC_DIR="${SCRIPT_DIR}/petsc"
-PETSC_ARCH="petsc-4-uw"
-
-# Require spack OpenMPI to be loaded
-if ! command -v mpicc &>/dev/null; then
-    echo "Error: mpicc not found. Load spack OpenMPI first:"
-    echo "    spack load openmpi@4.1.6"
-    exit 1
-fi
-
-# Require pixi kaiju environment
-# Check PATH since PIXI_ENVIRONMENT is not set by pixi shell-hook (only by pixi shell)
-if ! echo "$PATH" | tr ':' '\n' | grep -q "\.pixi/envs/kaiju/bin"; then
-    echo "Error: must be run inside the pixi kaiju environment"
-    echo "    source uw3_install_kaiju_amr.sh   (sets up env via pixi shell-hook)"
-    exit 1
-fi
-
-echo "=========================================="
-echo "PETSc AMR Build Script (Kaiju)"
-echo "=========================================="
-echo "PETSC_DIR:  $PETSC_DIR"
-echo "PETSC_ARCH: $PETSC_ARCH"
-echo "mpicc:      $(which mpicc)"
-echo "=========================================="
-
-clone_petsc() {
-    if [ -d "$PETSC_DIR" ]; then
-        echo "PETSc directory already exists. Skipping clone."
-        echo "To force fresh clone, run: ./build-petsc-kaiju.sh clean"
-        return 0
-    fi
-
-    echo "Cloning PETSc release branch..."
-    git clone -b release https://gitlab.com/petsc/petsc.git "$PETSC_DIR"
-    echo "Clone complete."
-}
-
-configure_petsc() {
-    echo "Configuring PETSc with AMR tools..."
-    cd "$PETSC_DIR"
-
-    # Downloads and builds:
-    #   AMR:         mmg, parmmg, pragmatic, eigen, bison
-    #   Solvers:     mumps, scalapack, slepc
-    #   Partitions:  metis, parmetis, ptscotch
-    #   BLAS/LAPACK: fblaslapack (Rocky Linux 8 has no guaranteed system BLAS)
-    #   HDF5:        downloaded (not provided by pixi in kaiju env)
-    #   cmake:       downloaded (spack does not have cmake)
-    #   MPI:         spack OpenMPI (not downloaded)
-    #   petsc4py:    built during configure
-    # MPI_DIR is computed from `which mpicc` (spack OpenMPI in PATH).
-    # LD_LIBRARY_PATH must include $MPI_DIR/lib so PETSc configure test binaries
-    # can find libmpi.so at runtime (spack uses RPATH for its own binaries but
-    # does not set LD_LIBRARY_PATH — load_env in uw3_install_kaiju_amr.sh sets it).
-    MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")"
-    python3 ./configure \
-        --with-petsc-arch="$PETSC_ARCH" \
-        --with-debugging=0 \
-        --with-mpi-dir="$MPI_DIR" \
-        --download-hdf5=1 \
-        --download-fblaslapack=1 \
-        --download-cmake=1 \
-        --download-bison=1 \
-        --download-eigen=1 \
-        --download-metis=1 \
-        --download-parmetis=1 \
-        --download-mumps=1 \
-        --download-scalapack=1 \
-        --download-slepc=1 \
-        --download-ptscotch=1 \
-        --download-mmg=1 \
-        --download-mmg-cmake-arguments="-DMMG_INSTALL_PRIVATE_HEADERS=ON -DUSE_SCOTCH=OFF" \
-        --download-parmmg=1 \
-        --download-pragmatic=1 \
-        --with-pragmatic=1 \
-        --with-petsc4py=1 \
-        --with-x=0 \
-        --with-make-np=40
-
-    echo "Configure complete."
-}
-
-build_petsc() {
-    echo "Building PETSc..."
-    cd "$PETSC_DIR"
-
-    export PETSC_DIR
-    export PETSC_ARCH
-
-    make all
-    echo "PETSc build complete."
-}
-
-test_petsc() {
-    echo "Testing PETSc..."
-    cd "$PETSC_DIR"
-
-    export PETSC_DIR
-    export PETSC_ARCH
-
-    make check
-    echo "PETSc tests complete."
-}
-
-clean_petsc() {
-    echo "Removing PETSc directory..."
-    if [ -d "$PETSC_DIR" ]; then
-        rm -rf "$PETSC_DIR"
-        echo "Cleaned."
-    else
-        echo "Nothing to clean."
-    fi
-}
-
-show_help() {
-    echo "Usage: $0 [command]"
-    echo ""
-    echo "Commands:"
-    echo "  (none)      Full build: clone, configure, build"
-    echo "  clone       Clone PETSc repository"
-    echo "  configure   Configure PETSc with AMR tools"
-    echo "  build       Build PETSc"
-    echo "  test        Run PETSc tests"
-    echo "  clean       Remove PETSc directory"
-    echo "  help        Show this help"
-}
-
-case "${1:-all}" in
-    all)
-        clone_petsc
-        configure_petsc
-        build_petsc
-        echo ""
-        echo "=========================================="
-        echo "PETSc AMR build complete!"
-        echo "Set these environment variables:"
-        echo "  export PETSC_DIR=$PETSC_DIR"
-        echo "  export PETSC_ARCH=$PETSC_ARCH"
-        echo "  export PYTHONPATH=\$PETSC_DIR/\$PETSC_ARCH/lib:\$PYTHONPATH"
-        echo "=========================================="
-        ;;
-    clone)     clone_petsc ;;
-    configure) configure_petsc ;;
-    build)     build_petsc ;;
-    test)      test_petsc ;;
-    clean)     clean_petsc ;;
-    help|--help|-h) show_help ;;
-    *)
-        echo "Unknown command: $1"
-        show_help
-        exit 1
-        ;;
-esac
diff --git a/petsc-custom/build-petsc.sh b/petsc-custom/build-petsc.sh
index 135c9aac..c18f2409 100755
--- a/petsc-custom/build-petsc.sh
+++ b/petsc-custom/build-petsc.sh
@@ -2,26 +2,34 @@
 #
 # Build PETSc with adaptive mesh refinement (AMR) tools
 #
-# This script builds a custom PETSc installation with:
-#   - pragmatic: anisotropic mesh adaptation
-#   - mmg: surface/volume mesh adaptation
-#   - parmmg: parallel mesh adaptation
-#   - slepc: eigenvalue solvers
-#   - mumps: direct solver
+# Supports three build targets, auto-detected from hostname or UW_CLUSTER env var:
+#   local — macOS/Linux developer machine (pixi env for MPI and HDF5)
+#   kaiju — Kaiju cluster (Rocky Linux 8, Spack OpenMPI, no system HDF5/cmake/BLAS)
+#   gadi  — NCI Gadi (CentOS, module OpenMPI + HDF5, PBS Pro)
 #
-# MPI is auto-detected from the active pixi environment:
-#   - MPICH   → PETSC_ARCH = petsc-4-uw-mpich
-#   - OpenMPI → PETSC_ARCH = petsc-4-uw-openmpi
+# Cluster-specific differences:
 #
-# Both builds co-exist under the same PETSc source tree.
-# Build time: ~1 hour on Apple Silicon
+#   Aspect        local                kaiju            gadi
+#   PETSC_ARCH    petsc-4-uw-{mpich,   petsc-4-uw       arch-linux-c-opt
+#                 openmpi}
+#   MPI           pixi env             spack (PATH)     module (PATH)
+#   HDF5          pixi env             download         module ($HDF5_DIR)
+#   BLAS/LAPACK   auto                 download         download (auto fails)
+#   cmake         pixi env             download         module
+#   bison         download             download         system
+#   petsc4py      separate step        with-petsc4py=1  with-petsc4py=1
+#   extra flags   —                    —                superlu, hypre, ...
+#
+# Override auto-detection: export UW_CLUSTER=local|kaiju|gadi
 #
 # Usage:
-#   ./build-petsc.sh            # Full build (clone, configure, build)
+#   ./build-petsc.sh            # Full build (clone, patch, configure, build)
 #   ./build-petsc.sh configure  # Just reconfigure
 #   ./build-petsc.sh build      # Just build (after configure)
-#   ./build-petsc.sh petsc4py   # Just build petsc4py
-#   ./build-petsc.sh clean      # Remove build for detected MPI
+#   ./build-petsc.sh petsc4py   # Build petsc4py separately (local only)
+#   ./build-petsc.sh patch      # Apply UW3 patches
+#   ./build-petsc.sh test       # Run PETSc tests
+#   ./build-petsc.sh clean      # Remove build for current arch
 #   ./build-petsc.sh clean-all  # Remove entire PETSc directory
 #   ./build-petsc.sh help       # Show this help
 #
@@ -30,64 +38,180 @@ set -e
 
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 PETSC_DIR="${SCRIPT_DIR}/petsc"
 
-# Detect active pixi environment (robust)
-if [ -z "$PIXI_PROJECT_ROOT" ]; then
-    echo "Error: This script must be run from within a pixi environment"
-    echo "Use: pixi run -e ./build-petsc.sh"
-    exit 1
-fi
+# ── Cluster detection ─────────────────────────────────────────────────────────
+detect_cluster() {
+    if [ -n "${UW_CLUSTER}" ]; then
+        echo "${UW_CLUSTER}"; return
+    fi
+    local hn
+    hn="$(hostname -f 2>/dev/null || hostname)"
+    case "${hn}" in
+        *.gadi.nci.org.au|gadi-*) echo "gadi" ;;
+        kaiju*)                   echo "kaiju" ;;
+        *)                        echo "local" ;;
+    esac
+}
+
+CLUSTER="$(detect_cluster)"
 
-PIXI_ENV="$(python3 - <<'EOF'
+# ── Cluster-specific configuration ───────────────────────────────────────────
+# Sets PETSC_ARCH, MPI_IMPL, and cluster-specific variables (PIXI_ENV, MPI_DIR,
+# HDF5_DIR). Also validates that the required environment is active.
+
+case "${CLUSTER}" in
+    local)
+        if [ -z "$PIXI_PROJECT_ROOT" ]; then
+            echo "Error: This script must be run from within a pixi environment"
+            echo "Use: pixi run -e ./build-petsc.sh"
+            exit 1
+        fi
+
+        PIXI_ENV="$(python3 - <<'EOF'
 import sys, pathlib
 print(pathlib.Path(sys.executable).resolve().parents[1])
 EOF
 )"
 
-# ── MPI auto-detection ──────────────────────────────────────────────
-# Detect which MPI implementation is available in the pixi environment.
-# Sets MPI_IMPL ("mpich" or "openmpi") and PETSC_ARCH accordingly.
+        _detect_local_mpi() {
+            local mpi_version
+            mpi_version=$(python3 -c "from mpi4py import MPI; print(MPI.Get_library_version())" 2>/dev/null || echo "")
+            if echo "$mpi_version" | grep -qi "open mpi"; then
+                echo "openmpi"
+            elif echo "$mpi_version" | grep -qi "mpich"; then
+                echo "mpich"
+            else
+                local mpicc_out
+                mpicc_out=$("$PIXI_ENV/bin/mpicc" --version 2>&1 || echo "")
+                if echo "$mpicc_out" | grep -qi "open mpi"; then
+                    echo "openmpi"
+                else
+                    echo "mpich"
+                fi
+            fi
+        }
 
-detect_mpi() {
-    local mpi_version
-    mpi_version=$(python3 -c "from mpi4py import MPI; print(MPI.Get_library_version())" 2>/dev/null || echo "")
+        MPI_IMPL=$(_detect_local_mpi)
+        PETSC_ARCH="petsc-4-uw-${MPI_IMPL}"
+        ;;
 
-    if echo "$mpi_version" | grep -qi "open mpi"; then
-        echo "openmpi"
-    elif echo "$mpi_version" | grep -qi "mpich"; then
-        echo "mpich"
-    else
-        # Fallback: check for mpicc --version
-        local mpicc_out
-        mpicc_out=$("$PIXI_ENV/bin/mpicc" --version 2>&1 || echo "")
-        if echo "$mpicc_out" | grep -qi "open mpi"; then
-            echo "openmpi"
-        else
-            echo "mpich"  # default fallback
+    kaiju)
+        if ! command -v mpicc &>/dev/null; then
+            echo "Error: mpicc not found. Load spack OpenMPI first:"
+            echo "    spack load openmpi@4.1.6"
+            exit 1
         fi
-    fi
-}
+        if ! echo "${PATH}" | tr ':' '\n' | grep -q "\.pixi/envs/hpc/bin"; then
+            echo "Error: must be run inside the pixi hpc environment"
+            echo "    source kaiju_install_user.sh   (activates env via pixi shell-hook)"
+            exit 1
+        fi
+        MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")"
+        MPI_IMPL="openmpi"
+        PETSC_ARCH="petsc-4-uw"
+        ;;
 
-MPI_IMPL=$(detect_mpi)
-PETSC_ARCH="petsc-4-uw-${MPI_IMPL}"
+    gadi)
+        if ! command -v mpicc &>/dev/null; then
+            echo "Error: mpicc not found. Load Gadi OpenMPI module first:"
+            echo "    module load openmpi/4.1.7"
+            exit 1
+        fi
+        if [ -z "${HDF5_DIR}" ]; then
+            echo "Error: HDF5_DIR is not set. Load Gadi HDF5 module first:"
+            echo "    module load hdf5/1.12.2p"
+            exit 1
+        fi
+        if ! echo "${PATH}" | tr ':' '\n' | grep -q "\.pixi/envs/hpc/bin"; then
+            echo "Error: must be run inside the pixi hpc environment"
+            echo "    source gadi_install_shared.sh   (activates env via pixi shell-hook)"
+            exit 1
+        fi
+        MPI_DIR="$(dirname "$(dirname "$(which mpicc)")")"
+        MPI_IMPL="openmpi"
+        PETSC_ARCH="arch-linux-c-opt"
+        ;;
 
+    *)
+        echo "Unknown cluster: ${CLUSTER}"
+        echo "Set UW_CLUSTER=local|kaiju|gadi to override auto-detection"
+        exit 1
+        ;;
+esac
 
 echo "=========================================="
 echo "PETSc AMR Build Script"
 echo "=========================================="
-echo "PETSC_DIR:  $PETSC_DIR"
-echo "PETSC_ARCH: $PETSC_ARCH"
-echo "MPI:        $MPI_IMPL"
-echo "PIXI_ENV:   $PIXI_ENV"
+echo "CLUSTER:    ${CLUSTER}"
+echo "PETSC_DIR:  ${PETSC_DIR}"
+echo "PETSC_ARCH: ${PETSC_ARCH}"
+echo "MPI:        ${MPI_IMPL}"
+if [ "${CLUSTER}" = "local" ]; then
+    echo "PIXI_ENV:   ${PIXI_ENV}"
+else
+    echo "MPI_DIR:    ${MPI_DIR}"
+fi
+[ "${CLUSTER}" = "gadi" ] && echo "HDF5_DIR:   ${HDF5_DIR}"
 echo "=========================================="
 
+# ── Gadi-specific: build environment setup ───────────────────────────────────
+# Handles compiler-tagged Fortran MPI libs and conda toolchain interference.
+# Must be called before any compile/link step on Gadi.
+setup_gadi_build_env() {
+    if [ -z "${MPI_DIR}" ]; then
+        echo "Error: MPI_DIR is not set. Source gadi_install_shared.sh first."
+        exit 1
+    fi
+
+    # Create symlinks for Gadi's compiler-tagged Fortran MPI libs.
+    # mpifort --showme refers to libmpi_usempif08 etc. (no compiler tag),
+    # but Gadi only ships _GNU, _Intel, _nvidia variants.
+    local _mpi_gnu_dir="${SCRIPT_DIR}/mpi-gadi-gnu-libs"
+    mkdir -p "${_mpi_gnu_dir}"
+    for _lib in usempif08 usempi_ignore_tkr mpifh; do
+        [ ! -f "${_mpi_gnu_dir}/libmpi_${_lib}.so" ] && \
+            ln -sf "${MPI_DIR}/lib/libmpi_${_lib}_GNU.so" "${_mpi_gnu_dir}/libmpi_${_lib}.so"
+    done
+
+    # LD_LIBRARY_PATH = runtime search path (dynamic loader)
+    # LIBRARY_PATH    = link-time search path (ld resolves -lmpi_usempif08 etc.)
+    export LD_LIBRARY_PATH="${_mpi_gnu_dir}:${MPI_DIR}/lib:/apps/ucc/1.3.0/lib:/usr/lib64:${LD_LIBRARY_PATH}"
+    export LIBRARY_PATH="${_mpi_gnu_dir}:${LIBRARY_PATH}"
+
+    # Unset conda/pixi compiler vars that interfere with OpenMPI wrappers.
+    # The pixi hpc env ships a full conda toolchain (x86_64-conda-linux-gnu-*)
+    # that conflicts with system compilers required by Gadi's OpenMPI.
+    unset CC CXX FC F77 F90 CPP AR RANLIB
+    unset CFLAGS CXXFLAGS FFLAGS CPPFLAGS LDFLAGS
+
+    # Force MPI wrappers to use system compilers, not conda's gcc
+    export OMPI_CC=/usr/bin/gcc
+    export OMPI_CXX=/usr/bin/g++
+    export OMPI_FC=/usr/bin/gfortran
+    # Gadi puts Fortran MPI headers in a compiler-tagged subdirectory (include/GNU/)
+    export OMPI_FCFLAGS="-I${MPI_DIR}/include/GNU"
+
+    # Put system bin dirs first so the system linker (/usr/bin/ld) wins over
+    # conda's ld — conda's ld cannot find Gadi-specific libs (hcoll, ucc, libnl).
+    export PATH="/usr/bin:/usr/local/bin:${MPI_DIR}/bin:${PATH}"
+}
+
 clone_petsc() {
-    if [ -d "$PETSC_DIR" ]; then
+    # For Gadi: resolve symlink before cloning. git clone replaces a
+    # symlink-to-empty-dir with a real directory, defeating the
+    # gdata→scratch symlink approach used to avoid inode quota limits.
+    local _clone_target="${PETSC_DIR}"
+    if [ "${CLUSTER}" = "gadi" ] && [ -L "${PETSC_DIR}" ]; then
+        _clone_target="$(readlink -f "${PETSC_DIR}")"
+    fi
+
+    if [ -f "${_clone_target}/configure" ]; then
         echo "PETSc directory already exists. Skipping clone."
-        echo "To force fresh clone, run: ./build-petsc.sh clean-all"
+        echo "To force fresh clone, run: $0 clean-all"
         return 0
     fi
 
     echo "Cloning PETSc release branch..."
-    git clone -b release https://gitlab.com/petsc/petsc.git "$PETSC_DIR"
+    git clone -b release https://gitlab.com/petsc/petsc.git "${_clone_target}"
     echo "Clone complete."
 }
 
@@ -112,49 +236,100 @@ apply_patches() {
 }
 
 configure_petsc() {
-    echo "Configuring PETSc with AMR tools ($MPI_IMPL)..."
+    echo "Configuring PETSc with AMR tools (${CLUSTER})..."
     cd "$PETSC_DIR"
 
-    # Configure with adaptive mesh refinement tools
-    # Downloads: bison, eigen, metis, mmg, mumps, parmetis, parmmg,
-    #            pragmatic, ptscotch, scalapack, slepc
-    # Uses system: MPI (from pixi), HDF5 (from pixi)
-    python3 ./configure \
-        --with-petsc-arch="$PETSC_ARCH" \
-        --download-bison \
-        --download-eigen \
-        --download-metis \
-        --download-mmg \
-        --download-mumps \
-        --download-parmetis \
-        --download-parmmg \
-        --download-pragmatic \
-        --download-ptscotch="${SCRIPT_DIR}/patches/scotch-7.0.10-c23-fix.tar.gz" \
-        --download-scalapack \
-        --download-slepc \
-        --with-debugging=0 \
-        --with-hdf5=1 \
-        --with-pragmatic=1 \
-        --with-x=0 \
-        --with-mpi-dir="$PIXI_ENV" \
-        --with-hdf5-dir="$PIXI_ENV" \
-        --download-hdf5=0 \
-        --download-mpich=0 \
-        --download-openmpi=0 \
-        --download-mpi4py=0 \
-        --with-petsc4py=0
+    # Capture pixi's python3 BEFORE setup_gadi_build_env reorders PATH.
+    local _python="python3"
+    if [ "${CLUSTER}" = "gadi" ]; then
+        _python="$(which python3)"
+        setup_gadi_build_env
+    fi
+
+    # Flags shared across all clusters.
+    # Downloads and builds:
+    #   AMR:          mmg, parmmg, pragmatic, eigen
+    #   Solvers:      mumps, scalapack, slepc
+    #   Partitioners: metis, parmetis, ptscotch (patched for C23)
+    # Uses pixi env (local):  MPI (amr/amr-mpich/amr-openmpi), HDF5
+    # Downloads (kaiju):      HDF5, BLAS/LAPACK, cmake, bison
+    # Uses module (gadi):     MPI (openmpi/4.1.7), HDF5 ($HDF5_DIR)
+    local -a _common=(
+        --with-petsc-arch="${PETSC_ARCH}"
+        --with-debugging=0
+        --with-pragmatic=1
+        --with-x=0
+        --download-eigen=1
+        --download-metis=1
+        --download-mmg=1
+        "--download-mmg-cmake-arguments=-DMMG_INSTALL_PRIVATE_HEADERS=ON -DUSE_SCOTCH=OFF"
+        --download-mumps=1
+        --download-parmetis=1
+        --download-parmmg=1
+        --download-pragmatic=1
+        "--download-ptscotch=${SCRIPT_DIR}/patches/scotch-7.0.10-c23-fix.tar.gz"
+        --download-scalapack=1
+        --download-slepc=1
+    )
+
+    case "${CLUSTER}" in
+        local)
+            "${_python}" ./configure "${_common[@]}" \
+                --with-mpi-dir="${PIXI_ENV}" \
+                --with-hdf5=1 \
+                --with-hdf5-dir="${PIXI_ENV}" \
+                --download-hdf5=0 \
+                --download-mpich=0 \
+                --download-openmpi=0 \
+                --download-mpi4py=0 \
+                --download-bison \
+                --with-petsc4py=0
+            ;;
+        kaiju)
+            "${_python}" ./configure "${_common[@]}" \
+                --with-mpi-dir="${MPI_DIR}" \
+                --download-hdf5=1 \
+                --download-fblaslapack=1 \
+                --download-cmake=1 \
+                --download-bison=1 \
+                --with-petsc4py=1 \
+                --with-make-np=40
+            ;;
+        gadi)
+            "${_python}" ./configure "${_common[@]}" \
+                --with-cc="${MPI_DIR}/bin/mpicc" \
+                --with-cxx="${MPI_DIR}/bin/mpicxx" \
+                --with-fc="${MPI_DIR}/bin/mpifort" \
+                --with-hdf5=1 \
+                --with-hdf5-dir="${HDF5_DIR}" \
+                --download-fblaslapack=1 \
+                --with-petsc4py=1 \
+                --with-make-np=40 \
+                --with-shared-libraries=1 \
+                --with-cxx-dialect=C++11 \
+                "--COPTFLAGS=-g -O3" "--CXXOPTFLAGS=-g -O3" "--FOPTFLAGS=-g -O3" \
+                --useThreads=0 \
+                --download-zlib=1 \
+                --download-superlu=1 \
+                --download-superlu_dist=1 \
+                --download-hypre=1 \
+                --download-ctetgen=1 \
+                --download-triangle=1
+            ;;
+    esac
 
     echo "Configure complete."
 }
 
 build_petsc() {
-    echo "Building PETSc ($MPI_IMPL)..."
+    echo "Building PETSc (${CLUSTER})..."
     cd "$PETSC_DIR"
 
-    # Set environment for build
     export PETSC_DIR
     export PETSC_ARCH
 
+    [ "${CLUSTER}" = "gadi" ] && setup_gadi_build_env
+
     make all
     echo "PETSc build complete."
 }
@@ -166,12 +341,19 @@ test_petsc() {
     export PETSC_DIR
     export PETSC_ARCH
 
+    [ "${CLUSTER}" = "gadi" ] && setup_gadi_build_env
+
     make check
     echo "PETSc tests complete."
 }
 
 build_petsc4py() {
-    echo "Building petsc4py ($MPI_IMPL)..."
+    if [ "${CLUSTER}" != "local" ]; then
+        echo "Note: petsc4py is built during configure on HPC clusters (--with-petsc4py=1). Skipping."
+        return 0
+    fi
+
+    echo "Building petsc4py..."
     cd "$PETSC_DIR/src/binding/petsc4py"
 
     export PETSC_DIR
@@ -183,9 +365,8 @@ build_petsc4py() {
 }
 
 clean_petsc() {
-    # Clean just the arch-specific build
     local arch_dir="$PETSC_DIR/$PETSC_ARCH"
-    echo "Removing PETSc build for $MPI_IMPL ($arch_dir)..."
+    echo "Removing PETSc build for $PETSC_ARCH ($arch_dir)..."
     if [ -d "$arch_dir" ]; then
         rm -rf "$arch_dir"
         echo "Cleaned $PETSC_ARCH."
@@ -207,69 +388,58 @@ clean_all() {
 show_help() {
     echo "Usage: $0 [command]"
     echo ""
-    echo "MPI auto-detected from pixi environment: $MPI_IMPL"
-    echo "PETSC_ARCH: $PETSC_ARCH"
+    echo "Cluster: ${CLUSTER}  (override: export UW_CLUSTER=local|kaiju|gadi)"
+    echo "PETSC_ARCH: ${PETSC_ARCH}"
     echo ""
     echo "Commands:"
-    echo "  (none)      Full build: clone, configure, build, petsc4py"
+    echo "  (none)      Full build: clone, patch, configure, build"
+    [ "${CLUSTER}" = "local" ] && echo "              (local: also runs petsc4py separately)"
     echo "  clone       Clone PETSc repository"
+    echo "  patch       Apply UW3 patches to PETSc source"
     echo "  configure   Configure PETSc with AMR tools"
     echo "  build       Build PETSc"
     echo "  test        Run PETSc tests"
-    echo "  petsc4py    Build and install petsc4py"
-    echo "  patch       Apply UW3 patches to PETSc source"
-    echo "  clean       Remove build for current MPI ($PETSC_ARCH)"
-    echo "  clean-all   Remove entire PETSc directory (all MPI builds)"
+    echo "  petsc4py    Build and install petsc4py (local only)"
+    echo "  clean       Remove build for current arch (${PETSC_ARCH})"
+    echo "  clean-all   Remove entire PETSc directory (all builds)"
     echo "  help        Show this help"
-    echo ""
-    echo "MPICH and OpenMPI builds co-exist. To build both:"
-    echo "  pixi run -e amr ./petsc-custom/build-petsc.sh"
-    echo "  pixi run -e amr-openmpi ./petsc-custom/build-petsc.sh"
+    if [ "${CLUSTER}" = "local" ]; then
+        echo ""
+        echo "MPICH and OpenMPI builds co-exist. To build both:"
+        echo "  pixi run -e amr ./petsc-custom/build-petsc.sh"
+        echo "  pixi run -e amr-openmpi ./petsc-custom/build-petsc.sh"
+    fi
 }
 
-# Main entry point
+# ── Main entry point ──────────────────────────────────────────────────────────
 case "${1:-all}" in
     all)
         clone_petsc
         apply_patches
         configure_petsc
         build_petsc
-        build_petsc4py
+        if [ "${CLUSTER}" = "local" ]; then
+            build_petsc4py
+        fi
         echo ""
         echo "=========================================="
-        echo "PETSc AMR build complete! ($MPI_IMPL)"
-        echo "Set these environment variables:"
-        echo "  export PETSC_DIR=$PETSC_DIR"
-        echo "  export PETSC_ARCH=$PETSC_ARCH"
+        echo "PETSc AMR build complete! (${CLUSTER}, ${MPI_IMPL})"
+        echo "  PETSC_DIR=${PETSC_DIR}"
+        echo "  PETSC_ARCH=${PETSC_ARCH}"
+        if [ "${CLUSTER}" != "local" ]; then
+            echo "  export PYTHONPATH=\$PETSC_DIR/\$PETSC_ARCH/lib:\$PYTHONPATH"
+        fi
         echo "=========================================="
         ;;
-    clone)
-        clone_petsc
-        ;;
-    configure)
-        configure_petsc
-        ;;
-    build)
-        build_petsc
-        ;;
-    patch)
-        apply_patches
-        ;;
-    test)
-        test_petsc
-        ;;
-    petsc4py)
-        build_petsc4py
-        ;;
-    clean)
-        clean_petsc
-        ;;
-    clean-all)
-        clean_all
-        ;;
-    help|--help|-h)
-        show_help
-        ;;
+    clone)     clone_petsc ;;
+    patch)     apply_patches ;;
+    configure) configure_petsc ;;
+    build)     build_petsc ;;
+    test)      test_petsc ;;
+    petsc4py)  build_petsc4py ;;
+    clean)     clean_petsc ;;
+    clean-all) clean_all ;;
+    help|--help|-h) show_help ;;
     *)
         echo "Unknown command: $1"
         show_help
diff --git a/pixi.lock b/pixi.lock
index 1f79a946..ffd0bc62 100644
--- a/pixi.lock
+++ b/pixi.lock
@@ -7606,7 +7606,7 @@ environments:
       - pypi: https://files.pythonhosted.org/packages/6e/67/9d4ac4b0d683aaa4170da59a1980740b281fd38fc253e1830fde4dac3d4f/pygmsh-7.1.17-py3-none-any.whl
       - pypi: https://files.pythonhosted.org/packages/b1/09/0ab0853d6d634455fe70d90a306162160ead7592eceaca194168a16d3beb/sphinx_math_dollar-1.3-py3-none-any.whl
       - pypi: https://files.pythonhosted.org/packages/03/46/25d64bcd7821c8d6f1080e1c43d5fcdfc442a18f759a230b5ccdc891093e/sphinxcontrib_mermaid-2.0.1-py3-none-any.whl
-  kaiju:
+  hpc:
     channels:
     - url: https://conda.anaconda.org/conda-forge/
     indexes:
@@ -7625,14 +7625,13 @@
       - conda: https://conda.anaconda.org/conda-forge/linux-64/argon2-cffi-bindings-25.1.0-py312h4c3975b_2.conda
       - conda: https://conda.anaconda.org/conda-forge/noarch/arrow-1.4.0-pyhcf101f3_0.conda
      - conda: https://conda.anaconda.org/conda-forge/noarch/asttokens-3.0.1-pyhd8ed1ab_0.conda
-      - conda: https://conda.anaconda.org/conda-forge/linux-64/attr-2.5.2-h39aace5_0.conda
-      - conda: https://conda.anaconda.org/conda-forge/noarch/attrs-25.4.0-pyhcf101f3_1.conda
+      - conda: https://conda.anaconda.org/conda-forge/noarch/attrs-26.1.0-pyhcf101f3_0.conda
       - conda: https://conda.anaconda.org/conda-forge/noarch/babel-2.18.0-pyhcf101f3_1.conda
       - conda: https://conda.anaconda.org/conda-forge/linux-64/backports.zstd-1.3.0-py312h90b7ffd_0.conda
       - conda: https://conda.anaconda.org/conda-forge/noarch/beautifulsoup4-4.14.3-pyha770c72_0.conda
-      - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils-2.45.1-default_h4852527_101.conda
-      - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_impl_linux-64-2.45.1-default_hfdba357_101.conda
-      - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_linux-64-2.45.1-default_h4852527_101.conda
+      - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils-2.45.1-default_h4852527_102.conda
+      - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_impl_linux-64-2.45.1-default_hfdba357_102.conda
+      - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_linux-64-2.45.1-default_h4852527_102.conda
       - conda: https://conda.anaconda.org/conda-forge/noarch/bleach-6.3.0-pyhcf101f3_1.conda
       - conda: https://conda.anaconda.org/conda-forge/noarch/bleach-with-css-6.3.0-hbca2aae_1.conda
       - conda: https://conda.anaconda.org/conda-forge/linux-64/blosc-1.21.6-he440d0b_1.conda
@@ -7649,7 +7648,7 @@
       - conda: https://conda.anaconda.org/conda-forge/noarch/certifi-2026.2.25-pyhd8ed1ab_0.conda
       - conda: https://conda.anaconda.org/conda-forge/linux-64/cffi-2.0.0-py312h460c074_1.conda
       - conda: https://conda.anaconda.org/conda-forge/linux-64/cftime-1.6.5-py312h4f23490_1.conda
-      - conda: https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.4.5-pyhd8ed1ab_0.conda
+      - conda: https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.4.6-pyhd8ed1ab_0.conda
      - conda:
https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/comm-0.2.3-pyhe01879c_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/compilers-1.11.0-ha770c72_0.conda @@ -7696,16 +7695,16 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/gxx_linux-64-14.3.0-he467f4b_21.conda - conda: https://conda.anaconda.org/conda-forge/noarch/h2-4.3.0-pyhcf101f3_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/h5py-3.15.1-nompi_py312ha4f8f14_101.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-13.1.0-h6083320_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-13.2.0-h6083320_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/hdf4-4.2.15-h2a13503_7.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/hdf5-1.14.6-nompi_h19486de_106.conda - conda: https://conda.anaconda.org/conda-forge/noarch/hpack-4.1.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.1.0-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/icu-78.2-h33c6efd_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/icu-78.3-h33c6efd_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.11-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-1.4.1-pyhd8ed1ab_0.tar.bz2 - - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.7.0-pyhe01879c_1.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.7.0-h40b2b14_1.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-2.0.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.8.0-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.8.0-h8f7a5dd_0.conda - conda: 
https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.3.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/ipykernel-6.31.0-pyha191276_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/ipython-9.11.0-pyhecfbec7_0.conda @@ -7729,7 +7728,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/krb5-1.22.2-ha1258a1_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/lark-1.3.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/lcms2-2.18-h0c24ade_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.45.1-default_hbd61a6d_101.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.45.1-default_hbd61a6d_102.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/lerc-4.1.0-hdb68285_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libaec-1.1.5-h088129d_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libblas-3.11.0-5_h4a7cf45_openblas.conda @@ -7763,10 +7762,10 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.18-h3b78370_2.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libjpeg-turbo-3.1.2-hb03c661_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.11.0-5_h47877c9_openblas.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libllvm22-22.1.0-hf7376ad_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libllvm22-22.1.1-hf7376ad_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/liblzma-5.8.2-hb03c661_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.10.0-nompi_hbf2fc22_100.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libnghttp2-1.67.0-had1ee68_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.10.0-nompi_hb6f1874_101.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/libnghttp2-1.68.1-h877daf1_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.1-hb9d3cd8_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libntlm-1.8-hb9d3cd8_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libopenblas-0.3.30-pthreads_h94d23a6_4.conda @@ -7792,7 +7791,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.15.2-he237659_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libxslt-1.1.43-h711ed8c_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/libzip-1.11.2-h6991a6a_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.3.1-hb9d3cd8_2.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.3.2-h25fd6f3_2.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.10.0-h5888daf_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/markdown-it-py-4.0.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/markupsafe-3.0.3-py312h8a5da7c_1.conda @@ -7803,8 +7802,8 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/meshio-5.3.5-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/mistune-3.2.0-pyhcf101f3_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/mpc-1.3.1-h24ddda3_1.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/mpfr-4.2.1-h90cbb55_3.conda - - conda: https://conda.anaconda.org/conda-forge/noarch/mpmath-1.4.0-pyhd8ed1ab_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/mpfr-4.2.2-he0a73b1_0.conda + - conda: https://conda.anaconda.org/conda-forge/noarch/mpmath-1.4.1-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/msgspec-0.20.0-py312h4c3975b_2.conda - conda: https://conda.anaconda.org/conda-forge/noarch/munkres-1.1.4-pyhd8ed1ab_1.conda - conda: 
https://conda.anaconda.org/conda-forge/noarch/nbclient-0.10.4-pyhd8ed1ab_0.conda @@ -7821,6 +7820,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/packaging-26.0-pyhcf101f3_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pandocfilters-1.5.0-pyhd8ed1ab_0.tar.bz2 - conda: https://conda.anaconda.org/conda-forge/noarch/parso-0.8.6-pyhcf101f3_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/patchelf-0.17.2-h58526e2_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/pcre2-10.47-haa7fec5_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pexpect-4.9.0-pyhd8ed1ab_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/pillow-12.1.1-py312h50c33e8_0.conda @@ -7843,7 +7843,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/noarch/pygments-2.19.2-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/pykdtree-1.4.3-py312h4f23490_2.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.3.2-pyhcf101f3_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/pyside6-6.10.2-py312h9da60e5_0.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/pyside6-6.10.2-py312h50ac2ff_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pysocks-1.7.1-pyha55dd90_7.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-8.4.2-pyhcf101f3_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/pytest-forked-1.6.0-pyhd8ed1ab_1.conda @@ -7861,7 +7861,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/pyyaml-6.0.3-py312h8a5da7c_1.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/pyzmq-27.1.0-py312hda471dd_2.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/qhull-2020.2-h434a139_5.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/qt6-main-6.10.2-h17e89b9_5.conda + - conda: 
https://conda.anaconda.org/conda-forge/linux-64/qt6-main-6.10.2-pl5321h16c4a6b_6.conda - conda: https://conda.anaconda.org/conda-forge/linux-64/readline-8.3-h853b02a_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/referencing-0.37.0-pyhcf101f3_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/requests-2.32.5-pyhcf101f3_1.conda @@ -7906,7 +7906,7 @@ environments: - conda: https://conda.anaconda.org/conda-forge/linux-64/unicodedata2-17.0.1-py312h4c3975b_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/uri-template-1.3.0-pyhd8ed1ab_1.conda - conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.6.3-pyhd8ed1ab_0.conda - - conda: https://conda.anaconda.org/conda-forge/linux-64/wayland-1.24.0-hd6090a7_1.conda + - conda: https://conda.anaconda.org/conda-forge/linux-64/wayland-1.25.0-hd6090a7_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/wcwidth-0.6.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/webcolors-25.10.0-pyhd8ed1ab_0.conda - conda: https://conda.anaconda.org/conda-forge/noarch/webencodings-0.5.1-pyhd8ed1ab_3.conda @@ -12817,6 +12817,18 @@ packages: - pkg:pypi/attrs?source=compressed-mapping size: 64759 timestamp: 1764875182184 +- conda: https://conda.anaconda.org/conda-forge/noarch/attrs-26.1.0-pyhcf101f3_0.conda + sha256: 1b6124230bb4e571b1b9401537ecff575b7b109cc3a21ee019f65e083b8399ab + md5: c6b0543676ecb1fb2d7643941fe375f2 + depends: + - python >=3.10 + - python + license: MIT + license_family: MIT + purls: + - pkg:pypi/attrs?source=compressed-mapping + size: 64927 + timestamp: 1773935801332 - conda: https://conda.anaconda.org/conda-forge/noarch/babel-2.17.0-pyhd8ed1ab_0.conda sha256: 1c656a35800b7f57f7371605bc6507c8d3ad60fbaaec65876fce7f73df1fc8ac md5: 0a01c169f0ab0f91b26e77a3301fbfe4 @@ -12932,6 +12944,15 @@ packages: purls: [] size: 35128 timestamp: 1770267175160 +- conda: https://conda.anaconda.org/conda-forge/linux-64/binutils-2.45.1-default_h4852527_102.conda 
+ sha256: 3c7c5580c1720206f28b7fa3d60d17986b3f32465e63009c14c9ae1ea64f926c + md5: 212fe5f1067445544c99dc1c847d032c + depends: + - binutils_impl_linux-64 >=2.45.1,<2.45.2.0a0 + license: GPL-3.0-only + purls: [] + size: 35436 + timestamp: 1774197482571 - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_impl_linux-64-2.45-default_hfdba357_104.conda sha256: 054a77ccab631071a803737ea8e5d04b5b18e57db5b0826a04495bd3fdf39a7c md5: a7a67bf132a4a2dea92a7cb498cdc5b1 @@ -12956,6 +12977,17 @@ packages: purls: [] size: 3744895 timestamp: 1770267152681 +- conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_impl_linux-64-2.45.1-default_hfdba357_102.conda + sha256: 0a7d405064f53b9d91d92515f1460f7906ee5e8523f3cd8973430e81219f4917 + md5: 8165352fdce2d2025bf884dc0ee85700 + depends: + - ld_impl_linux-64 2.45.1 default_hbd61a6d_102 + - sysroot_linux-64 + - zstd >=1.5.7,<1.6.0a0 + license: GPL-3.0-only + purls: [] + size: 3661455 + timestamp: 1774197460085 - conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_linux-64-2.45-default_h4852527_104.conda sha256: ed23fee4db69ad82320cca400fc77404c3874cd866606651a20bf743acd1a9b1 md5: e30e71d685e23cc1e5ac1c1990ba1f81 @@ -12976,6 +13008,15 @@ packages: purls: [] size: 36060 timestamp: 1770267177798 +- conda: https://conda.anaconda.org/conda-forge/linux-64/binutils_linux-64-2.45.1-default_h4852527_102.conda + sha256: 78a58d523d072b7f8e591b8f8572822e044b31764ed7e8d170392e7bc6d58339 + md5: 2a307a17309d358c9b42afdd3199ddcc + depends: + - binutils_impl_linux-64 2.45.1 default_hfdba357_102 + license: GPL-3.0-only + purls: [] + size: 36304 + timestamp: 1774197485247 - conda: https://conda.anaconda.org/conda-forge/linux-64/black-24.10.0-py312h7900ff3_0.conda sha256: 2b4344d18328b3e8fd9b5356f4ee15556779766db8cb21ecf2ff818809773df6 md5: 2daba153b913b1b901cf61440ad5e019 @@ -13622,6 +13663,17 @@ packages: - pkg:pypi/charset-normalizer?source=compressed-mapping size: 53210 timestamp: 1772816516728 +- conda: 
https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.4.6-pyhd8ed1ab_0.conda + sha256: d86dfd428b2e3c364fa90e07437c8405d635aa4ef54b25ab51d9c712be4112a5 + md5: 49ee13eb9b8f44d63879c69b8a40a74b + depends: + - python >=3.10 + license: MIT + license_family: MIT + purls: + - pkg:pypi/charset-normalizer?source=hash-mapping + size: 58510 + timestamp: 1773660086450 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/clang-19.1.7-default_hf9bcbb7_5.conda sha256: 6e9cb7e80a41dbbfd95e86d87c8e5dafc3171aadda16ca33a1e2136748267318 md5: 6773a2b7d7d1b0a8d0e0f3bf4e928936 @@ -15305,7 +15357,7 @@ packages: license: MIT license_family: MIT purls: - - pkg:pypi/fonttools?source=compressed-mapping + - pkg:pypi/fonttools?source=hash-mapping size: 2935817 timestamp: 1773137546716 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/fonttools-4.61.1-py312h5748b74_0.conda @@ -16282,14 +16334,14 @@ packages: purls: [] size: 2035859 timestamp: 1769445400168 -- conda: https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-13.1.0-h6083320_0.conda - sha256: 08dc098dcc5c3445331a834f46602b927cb65d2768189f3f032a6e4643f15cd9 - md5: 5baf48da05855be929c5a50f4377794d +- conda: https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-13.2.0-h6083320_0.conda + sha256: 2b6958ab30b2ce330b0166e51fc5f20f761f71e09510d62f03f9729882707497 + md5: 71c2c966e17a65b08b995f571310fb9f depends: - __glibc >=2.17,<3.0.a0 - cairo >=1.18.4,<2.0a0 - graphite2 >=1.3.14,<2.0a0 - - icu >=78.2,<79.0a0 + - icu >=78.3,<79.0a0 - libexpat >=2.7.4,<3.0a0 - libfreetype >=2.14.2 - libfreetype6 >=2.14.2 @@ -16300,8 +16352,8 @@ packages: license: MIT license_family: MIT purls: [] - size: 2615630 - timestamp: 1773217509651 + size: 2342310 + timestamp: 1773909324136 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/harfbuzz-12.2.0-haf38c7b_0.conda sha256: 2f8d95fe1cb655fe3bac114062963f08cc77b31b042027ef7a04ebde3ce21594 md5: 1c7ff9d458dd8220ac2ee71dd4af1be5 @@ -16639,6 +16691,18 @@ packages: purls: [] size: 
12728445 timestamp: 1767969922681 +- conda: https://conda.anaconda.org/conda-forge/linux-64/icu-78.3-h33c6efd_0.conda + sha256: fbf86c4a59c2ed05bbffb2ba25c7ed94f6185ec30ecb691615d42342baa1a16a + md5: c80d8a3b84358cb967fa81e7075fbc8a + depends: + - __glibc >=2.17,<3.0.a0 + - libgcc >=14 + - libstdcxx >=14 + license: MIT + license_family: MIT + purls: [] + size: 12723451 + timestamp: 1773822285671 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/icu-75.1-hfee45f7_0.conda sha256: 9ba12c93406f3df5ab0a43db8a4b4ef67a5871dfd401010fbe29b218b2cbe620 md5: 5eb22c1d7b3fc4abb50d92d621583137 @@ -16681,6 +16745,17 @@ packages: - pkg:pypi/imagesize?source=hash-mapping size: 10164 timestamp: 1656939625410 +- conda: https://conda.anaconda.org/conda-forge/noarch/imagesize-2.0.0-pyhd8ed1ab_0.conda + sha256: 5a047f9eac290e679b4e6f6f4cbfcc5acdfbf031a4f06824d4ddb590cdbb850b + md5: 92617c2ba2847cca7a6ed813b6f4ab79 + depends: + - python >=3.10 + license: MIT + license_family: MIT + purls: + - pkg:pypi/imagesize?source=hash-mapping + size: 15729 + timestamp: 1773752188889 - conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.7.0-pyhe01879c_1.conda sha256: c18ab120a0613ada4391b15981d86ff777b5690ca461ea7e9e49531e8f374745 md5: 63ccfdc3a3ce25b027b8767eb722fca8 @@ -16694,6 +16769,19 @@ packages: - pkg:pypi/importlib-metadata?source=hash-mapping size: 34641 timestamp: 1747934053147 +- conda: https://conda.anaconda.org/conda-forge/noarch/importlib-metadata-8.8.0-pyhcf101f3_0.conda + sha256: 82ab2a0d91ca1e7e63ab6a4939356667ef683905dea631bc2121aa534d347b16 + md5: 080594bf4493e6bae2607e65390c520a + depends: + - python >=3.10 + - zipp >=3.20 + - python + license: Apache-2.0 + license_family: APACHE + purls: + - pkg:pypi/importlib-metadata?source=compressed-mapping + size: 34387 + timestamp: 1773931568510 - conda: https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.7.0-h40b2b14_1.conda sha256: 46b11943767eece9df0dc9fba787996e4f22cc4c067f5e264969cfdfcb982c39 
md5: 8a77895fb29728b736a1a6c75906ea1a @@ -16704,6 +16792,16 @@ packages: purls: [] size: 22143 timestamp: 1747934053147 +- conda: https://conda.anaconda.org/conda-forge/noarch/importlib_metadata-8.8.0-h8f7a5dd_0.conda + sha256: 09f2b26f8c727fd2138fd4846b91708c32d5684120b59d5c8d38472c0eefbf33 + md5: 12e7a110add59a05b337484568a83a4d + depends: + - importlib-metadata ==8.8.0 pyhcf101f3_0 + license: Apache-2.0 + license_family: APACHE + purls: [] + size: 21425 + timestamp: 1773931568510 - conda: https://conda.anaconda.org/conda-forge/noarch/iniconfig-2.3.0-pyhd8ed1ab_0.conda sha256: e1a9e3b1c8fe62dc3932a616c284b5d8cbe3124bbfbedcf4ce5c828cb166ee19 md5: 9614359868482abba1bd15ce465e3c42 @@ -17996,6 +18094,18 @@ packages: purls: [] size: 725507 timestamp: 1770267139900 +- conda: https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.45.1-default_hbd61a6d_102.conda + sha256: 3d584956604909ff5df353767f3a2a2f60e07d070b328d109f30ac40cd62df6c + md5: 18335a698559cdbcd86150a48bf54ba6 + depends: + - __glibc >=2.17,<3.0.a0 + - zstd >=1.5.7,<1.6.0a0 + constrains: + - binutils_impl_linux-64 2.45.1 + license: GPL-3.0-only + purls: [] + size: 728002 + timestamp: 1774197446916 - conda: https://conda.anaconda.org/conda-forge/linux-64/lerc-4.0.0-h0aef613_1.conda sha256: 412381a43d5ff9bbed82cd52a0bbca5b90623f62e41007c9c42d3870c60945ff md5: 9344155d33912347b37f0ae6c410a835 @@ -20027,6 +20137,22 @@ packages: purls: [] size: 44236214 timestamp: 1772009776202 +- conda: https://conda.anaconda.org/conda-forge/linux-64/libllvm22-22.1.1-hf7376ad_0.conda + sha256: 1145f9e85f0fbbdba88f1da5c8c48672bee7702e2f40c563b2dd48350ab4d413 + md5: 97cc6dad22677304846a798c8a65064d + depends: + - __glibc >=2.17,<3.0.a0 + - libgcc >=14 + - libstdcxx >=14 + - libxml2 + - libxml2-16 >=2.14.6 + - libzlib >=1.3.1,<2.0a0 + - zstd >=1.5.7,<1.6.0a0 + license: Apache-2.0 WITH LLVM-exception + license_family: Apache + purls: [] + size: 44256563 + timestamp: 1773371774629 - conda: 
https://conda.anaconda.org/conda-forge/osx-arm64/libllvm22-22.1.0-h89af1be_0.conda sha256: 19e2c69bd90cffc66a9fd9feff2bfe6093cda8bf69aa01a6e1c41cbc0a5c24a0 md5: 620fe27ebf89177446fb7cc3c26c9cc0 @@ -20130,31 +20256,29 @@ packages: purls: [] size: 117463 timestamp: 1768753005332 -- conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.10.0-nompi_hbf2fc22_100.conda - sha256: f38b00b29c9495b71c12465397c735224ebaef71ad01278c3b9cb69dac685b65 - md5: 0eb36a09dad274e750d60b49aaec0af7 +- conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.10.0-nompi_hb6f1874_101.conda + sha256: 68c1ae8f2f327a248a837f86f14953eb0d7ee115e8c3efb35422971e40033b71 + md5: e5c8fe120a617a1d7a749baa1f0cdf1d depends: - __glibc >=2.17,<3.0.a0 - - attr >=2.5.2,<2.6.0a0 - blosc >=1.21.6,<2.0a0 - bzip2 >=1.0.8,<2.0a0 - hdf4 >=4.2.15,<4.2.16.0a0 - hdf5 >=1.14.6,<1.14.7.0a0 - libaec >=1.1.5,<2.0a0 - - libcurl >=8.18.0,<9.0a0 + - libcurl >=8.19.0,<9.0a0 - libgcc >=14 - libstdcxx >=14 - libxml2 - libxml2-16 >=2.14.6 - libzip >=1.11.2,<2.0a0 - - libzlib >=1.3.1,<2.0a0 + - libzlib >=1.3.2,<2.0a0 - openssl >=3.5.5,<4.0a0 - zstd >=1.5.7,<1.6.0a0 license: MIT - license_family: MIT purls: [] - size: 862222 - timestamp: 1772190364667 + size: 860696 + timestamp: 1774182701778 - conda: https://conda.anaconda.org/conda-forge/linux-64/libnetcdf-4.9.3-nompi_h11f7409_103.conda sha256: e9a8668212719a91a6b0348db05188dfc59de5a21888db13ff8510918a67b258 md5: 3ccff1066c05a1e6c221356eecc40581 @@ -20270,6 +20394,23 @@ packages: purls: [] size: 666600 timestamp: 1756834976695 +- conda: https://conda.anaconda.org/conda-forge/linux-64/libnghttp2-1.68.1-h877daf1_0.conda + sha256: 663444d77a42f2265f54fb8b48c5450bfff4388d9c0f8253dd7855f0d993153f + md5: 2a45e7f8af083626f009645a6481f12d + depends: + - __glibc >=2.17,<3.0.a0 + - c-ares >=1.34.6,<2.0a0 + - libev >=4.33,<4.34.0a0 + - libev >=4.33,<5.0a0 + - libgcc >=14 + - libstdcxx >=14 + - libzlib >=1.3.1,<2.0a0 + - openssl >=3.5.5,<4.0a0 + license: MIT + 
license_family: MIT + purls: [] + size: 663344 + timestamp: 1773854035739 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libnghttp2-1.67.0-hc438710_0.conda sha256: a07cb53b5ffa2d5a18afc6fd5a526a5a53dd9523fbc022148bd2f9395697c46d md5: a4b4dd73c67df470d091312ab87bf6ae @@ -22422,6 +22563,18 @@ packages: purls: [] size: 60963 timestamp: 1727963148474 +- conda: https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.3.2-h25fd6f3_2.conda + sha256: 55044c403570f0dc26e6364de4dc5368e5f3fc7ff103e867c487e2b5ab2bcda9 + md5: d87ff7921124eccd67248aa483c23fec + depends: + - __glibc >=2.17,<3.0.a0 + constrains: + - zlib 1.3.2 *_2 + license: Zlib + license_family: Other + purls: [] + size: 63629 + timestamp: 1774072609062 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/libzlib-1.3.1-h8359307_2.conda sha256: ce34669eadaba351cd54910743e6a2261b67009624dbc7daeeafdef93616711b md5: 369964e85dc26bfe78f41399b366c435 @@ -22950,6 +23103,18 @@ packages: purls: [] size: 634751 timestamp: 1725746740014 +- conda: https://conda.anaconda.org/conda-forge/linux-64/mpfr-4.2.2-he0a73b1_0.conda + sha256: 8690f550a780f75d9c47f7ffc15f5ff1c149d36ac17208e50eda101ca16611b9 + md5: 85ce2ffa51ab21da5efa4a9edc5946aa + depends: + - __glibc >=2.17,<3.0.a0 + - gmp >=6.3.0,<7.0a0 + - libgcc >=14 + license: LGPL-3.0-only + license_family: LGPL + purls: [] + size: 730422 + timestamp: 1773413915171 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/mpfr-4.2.1-hb693164_3.conda sha256: 4463e4e2aba7668e37a1b8532859191b4477a6f3602a5d6b4d64ad4c4baaeac5 md5: 4e4ea852d54cc2b869842de5044662fb @@ -23139,6 +23304,17 @@ packages: - pkg:pypi/mpmath?source=compressed-mapping size: 464419 timestamp: 1771870721583 +- conda: https://conda.anaconda.org/conda-forge/noarch/mpmath-1.4.1-pyhd8ed1ab_0.conda + sha256: 5bbf2f8179ec43d34d67ca8e4989d216c1bdb4b749fe6cb40e86ebf88c1b5300 + md5: 2e81b32b805f406d23ba61938a184081 + depends: + - python >=3.10 + license: BSD-3-Clause + license_family: BSD + purls: + - 
pkg:pypi/mpmath?source=compressed-mapping + size: 464918 + timestamp: 1773662068273 - conda: https://conda.anaconda.org/conda-forge/linux-64/msgpack-python-1.1.2-py312hd9148b4_1.conda sha256: 94068fd39d1a672f8799e3146a18ba4ef553f0fcccefddb3c07fbdabfd73667a md5: 2e489969e38f0b428c39492619b5e6e5 @@ -24502,6 +24678,17 @@ packages: - pkg:pypi/partd?source=hash-mapping size: 20884 timestamp: 1715026639309 +- conda: https://conda.anaconda.org/conda-forge/linux-64/patchelf-0.17.2-h58526e2_0.conda + sha256: eb355ac225be2f698e19dba4dcab7cb0748225677a9799e9cc8e4cadc3cb738f + md5: ba76a6a448819560b5f8b08a9c74f415 + depends: + - libgcc-ng >=7.5.0 + - libstdcxx-ng >=7.5.0 + license: GPL-3.0-or-later + license_family: GPL + purls: [] + size: 94048 + timestamp: 1673473024463 - conda: https://conda.anaconda.org/conda-forge/noarch/pathspec-0.12.1-pyhd8ed1ab_1.conda sha256: 9f64009cdf5b8e529995f18e03665b03f5d07c0b17445b8badef45bde76249ee md5: 617f15191456cc6a13db418a275435e5 @@ -25633,6 +25820,31 @@ packages: - pkg:pypi/shiboken6?source=hash-mapping size: 11606305 timestamp: 1765811838817 +- conda: https://conda.anaconda.org/conda-forge/linux-64/pyside6-6.10.2-py312h50ac2ff_1.conda + sha256: 18c8ffaca3d33e8617d600a79a1781b6de8e022039827254772a85651f4cebe6 + md5: 08452854f86c3190c3b0d4df1ae28555 + depends: + - python + - qt6-main 6.10.2.* + - __glibc >=2.17,<3.0.a0 + - libgcc >=14 + - libstdcxx >=14 + - libopengl >=1.7.0,<2.0a0 + - qt6-main >=6.10.2,<6.11.0a0 + - libclang13 >=21.1.8 + - libxslt >=1.1.43,<2.0a0 + - libxml2 + - libxml2-16 >=2.14.6 + - libegl >=1.7.0,<2.0a0 + - libgl >=1.7.0,<2.0a0 + - python_abi 3.12.* *_cp312 + - libvulkan-loader >=1.4.341.0,<2.0a0 + license: LGPL-3.0-only + license_family: LGPL + purls: + - pkg:pypi/pyside6?source=compressed-mapping + size: 13096913 + timestamp: 1773742520312 - conda: https://conda.anaconda.org/conda-forge/linux-64/pyside6-6.10.2-py312h9da60e5_0.conda sha256: 5e00f3e0a273e18b4ec0ae7f4fff507333a354dd5ecc31dd9f3b02ab1ee77163 md5: 
52412f1ae11e89b721784f2118188902 @@ -26360,6 +26572,80 @@ packages: purls: [] size: 56550801 timestamp: 1772121854618 +- conda: https://conda.anaconda.org/conda-forge/linux-64/qt6-main-6.10.2-pl5321h16c4a6b_6.conda + sha256: dd2fdde2cfecd29d4acd2bacbb341f00500d8b3b1c0583a8d92e07fc1e4b1106 + md5: 3a00bff44c15ee37bfd5eb435e1b2a51 + depends: + - libxcb + - xcb-util + - xcb-util-wm + - xcb-util-keysyms + - xcb-util-image + - xcb-util-renderutil + - xcb-util-cursor + - __glibc >=2.17,<3.0.a0 + - libgcc >=14 + - libstdcxx >=14 + - xorg-libice >=1.1.2,<2.0a0 + - icu >=78.3,<79.0a0 + - libllvm22 >=22.1.0,<22.2.0a0 + - krb5 >=1.22.2,<1.23.0a0 + - xorg-libx11 >=1.8.13,<2.0a0 + - xorg-libxtst >=1.2.5,<2.0a0 + - libfreetype >=2.14.2 + - libfreetype6 >=2.14.2 + - libxml2 + - libxml2-16 >=2.14.6 + - libtiff >=4.7.1,<4.8.0a0 + - libegl >=1.7.0,<2.0a0 + - xorg-libxxf86vm >=1.1.7,<2.0a0 + - libdrm >=2.4.125,<2.5.0a0 + - xcb-util >=0.4.1,<0.5.0a0 + - libbrotlicommon >=1.2.0,<1.3.0a0 + - libbrotlienc >=1.2.0,<1.3.0a0 + - libbrotlidec >=1.2.0,<1.3.0a0 + - libvulkan-loader >=1.4.341.0,<2.0a0 + - libclang-cpp22.1 >=22.1.0,<22.2.0a0 + - double-conversion >=3.4.0,<3.5.0a0 + - dbus >=1.16.2,<2.0a0 + - xcb-util-renderutil >=0.3.10,<0.4.0a0 + - alsa-lib >=1.2.15.3,<1.3.0a0 + - wayland >=1.24.0,<2.0a0 + - xcb-util-cursor >=0.1.6,<0.2.0a0 + - libpng >=1.6.55,<1.7.0a0 + - libclang13 >=22.1.0 + - libwebp-base >=1.6.0,<2.0a0 + - zstd >=1.5.7,<1.6.0a0 + - pcre2 >=10.47,<10.48.0a0 + - xorg-libxrandr >=1.5.5,<2.0a0 + - libcups >=2.3.3,<2.4.0a0 + - libpq >=18.3,<19.0a0 + - libjpeg-turbo >=3.1.2,<4.0a0 + - xorg-libxcomposite >=0.4.7,<1.0a0 + - xcb-util-keysyms >=0.4.1,<0.5.0a0 + - xorg-libxcursor >=1.2.3,<2.0a0 + - harfbuzz >=13.1.1 + - openssl >=3.5.5,<4.0a0 + - fontconfig >=2.17.1,<3.0a0 + - fonts-conda-ecosystem + - libxcb >=1.17.0,<2.0a0 + - libzlib >=1.3.1,<2.0a0 + - libsqlite >=3.52.0,<4.0a0 + - xorg-libsm >=1.2.6,<2.0a0 + - libgl >=1.7.0,<2.0a0 + - libglib >=2.86.4,<3.0a0 + - xorg-libxext 
>=1.3.7,<2.0a0 + - libxkbcommon >=1.13.1,<2.0a0 + - xorg-libxdamage >=1.1.6,<2.0a0 + - xcb-util-image >=0.4.0,<0.5.0a0 + - xcb-util-wm >=0.4.2,<0.5.0a0 + constrains: + - qt ==6.10.2 + license: LGPL-3.0-only + license_family: LGPL + purls: [] + size: 58118322 + timestamp: 1773865930316 - conda: https://conda.anaconda.org/conda-forge/osx-arm64/qt6-main-6.10.1-h478b344_2.conda sha256: b489baadcdb702f14e453a3fe3c22af782e374c2f4e4b9d6e1ab540d029681c7 md5: 7440806d0eb80dccd17cb3c67593ac72 @@ -28989,6 +29275,20 @@ packages: purls: [] size: 329779 timestamp: 1761174273487 +- conda: https://conda.anaconda.org/conda-forge/linux-64/wayland-1.25.0-hd6090a7_0.conda + sha256: ea374d57a8fcda281a0a89af0ee49a2c2e99cc4ac97cf2e2db7064e74e764bdb + md5: 996583ea9c796e5b915f7d7580b51ea6 + depends: + - __glibc >=2.17,<3.0.a0 + - libexpat >=2.7.4,<3.0a0 + - libffi >=3.5.2,<3.6.0a0 + - libgcc >=14 + - libstdcxx >=14 + license: MIT + license_family: MIT + purls: [] + size: 334139 + timestamp: 1773959575393 - conda: https://conda.anaconda.org/conda-forge/noarch/wayland-protocols-1.47-hd8ed1ab_0.conda sha256: 9ab2c12053ea8984228dd573114ffc6d63df42c501d59fda3bf3aeb1eaa1d23e md5: 7da1571f560d4ba3343f7f4c48a79c76 diff --git a/pixi.toml b/pixi.toml index 8c1b9efe..fee1757f 100644 --- a/pixi.toml +++ b/pixi.toml @@ -230,31 +230,19 @@ petsc-local-build = { cmd = "./build-petsc.sh", cwd = "petsc-custom" } petsc-local-clean = { cmd = "./build-petsc.sh clean", cwd = "petsc-custom" } # ============================================ -# KAIJU CLUSTER FEATURE +# HPC CLUSTER FEATURE # ============================================ -# For the Kaiju HPC cluster (Rocky Linux 8, Spack OpenMPI, Slurm) +# For HPC clusters (Kaiju, Gadi, etc.) running linux-64. # Pure Python only — base dependencies cover all pure-Python needs. 
-# mpi4py, h5py, petsc, petsc4py are built from source against -# spack's OpenMPI using petsc-custom/build-petsc-kaiju.sh +# mpi4py, h5py, petsc, petsc4py are built from source against the +# cluster's MPI using petsc-custom/build-petsc.sh (UW_CLUSTER auto-detected). # See: docs/developer/guides/kaiju-cluster-setup.md -[feature.kaiju] +[feature.hpc] platforms = ["linux-64"] -# ============================================ -# GADI CLUSTER FEATURE -# ============================================ -# For NCI Gadi HPC (CentOS, module OpenMPI + HDF5, PBS Pro) -# Pure Python only — base dependencies cover all pure-Python needs. -# mpi4py, h5py, petsc, petsc4py are built from source against -# Gadi's module OpenMPI and HDF5 using gadi_install_pixi.sh. -# See: install-scripts/uw3-hpc-install-scripts/gadi_install_pixi.sh - -[feature.gadi] -platforms = ["linux-64"] - -[feature.gadi.dependencies] -# patchelf is needed to fix h5py RPATH order after source build: +[feature.hpc.dependencies] +# patchelf is needed on Gadi to fix h5py RPATH order after source build: # meson embeds the conda env lib dir before Gadi's HDF5 in RPATH, # so we use patchelf post-install to move HDF5_DIR to the front. patchelf = "*" @@ -343,10 +331,7 @@ openmpi-dev = { features = ["conda-petsc-openmpi", "runtime", "dev"], solve-gr amr-openmpi = { features = ["amr-openmpi"], solve-group = "amr-openmpi" } amr-openmpi-dev = { features = ["amr-openmpi", "runtime", "dev"], solve-group = "amr-openmpi" } -# --- Kaiju Cluster Track (linux-64 only) --- -# Pure Python from pixi; MPI/PETSc/h5py built from source against spack OpenMPI -kaiju = { features = ["kaiju"], solve-group = "kaiju" } - -# --- Gadi Cluster Track (linux-64 only) --- -# Pure Python from pixi; MPI/PETSc/h5py built from source against Gadi modules -gadi = { features = ["gadi"], solve-group = "gadi" } +# --- HPC Cluster Track (linux-64 only) --- +# Pure Python from pixi; MPI/PETSc/h5py built from source against cluster MPI. 
+# UW_CLUSTER (or hostname) selects kaiju/gadi config in build-petsc.sh. +hpc = { features = ["hpc"], solve-group = "hpc" } From 026a9d11cec85bfb280a61048e7647358852e051 Mon Sep 17 00:00:00 2001 From: Juan Carlos Graciosa Date: Mon, 23 Mar 2026 21:42:04 +1100 Subject: [PATCH 26/26] Changed documentation to contain info on the clusters closely supported. --- docs/developer/guides/hpc-cluster-setup.md | 302 +++++++++++++++++++ docs/developer/guides/kaiju-cluster-setup.md | 302 ------------------- 2 files changed, 302 insertions(+), 302 deletions(-) create mode 100644 docs/developer/guides/hpc-cluster-setup.md delete mode 100644 docs/developer/guides/kaiju-cluster-setup.md diff --git a/docs/developer/guides/hpc-cluster-setup.md b/docs/developer/guides/hpc-cluster-setup.md new file mode 100644 index 00000000..898a1447 --- /dev/null +++ b/docs/developer/guides/hpc-cluster-setup.md @@ -0,0 +1,302 @@ +# HPC Cluster Setup + +This guide covers installing and running Underworld3 on HPC clusters. Install scripts are maintained in the [uw3-hpc-baremetal-install-run](https://github.com/jcgraciosa/uw3-hpc-baremetal-install-run) repository. + +--- + +## Architecture + +All supported clusters use the same architecture: + +``` +pixi hpc env → Python 3.12, sympy, scipy, pint, pydantic, ... (conda-forge, no MPI) +cluster MPI → OpenMPI (spack or module) (cluster MPI) +source build → mpi4py, PETSc+AMR+petsc4py, h5py (linked to cluster MPI) +``` + +**Why source builds?** Anything linked against MPI must use the same MPI as the cluster scheduler. conda-forge bundles its own MPI (MPICH), which is incompatible with Slurm/PBS. Building from source ensures the correct linkage. + +**Why pixi?** Pixi manages the Python environment consistently with local development — same `pixi.toml`, same package versions. The `hpc` environment is pure Python (no MPI packages from conda-forge). 
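The per-cluster source-build configuration is selected by the `UW_CLUSTER` variable, falling back to hostname matching (as noted in the `pixi.toml` comments above). A minimal sketch of that selection pattern, illustrative only and not the actual `build-petsc.sh` code:

```bash
# Illustrative sketch only (NOT the real build-petsc.sh logic): honour an
# explicit UW_CLUSTER override first, then fall back to the hostname.
detect_cluster() {
    local name="${UW_CLUSTER:-$(hostname)}"
    case "$name" in
        *kaiju*) echo "kaiju" ;;    # Kaiju login/compute nodes
        *gadi*)  echo "gadi" ;;     # NCI Gadi nodes
        *)       echo "unknown" ;;  # no closely supported config
    esac
}

UW_CLUSTER=kaiju detect_cluster   # prints: kaiju
```

Setting `UW_CLUSTER` explicitly is useful when building on a node whose hostname does not match either pattern.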
+ +**PETSc build:** `petsc-custom/build-petsc.sh` auto-detects the cluster from hostname, or can be overridden with `UW_CLUSTER=kaiju|gadi`. Cluster-specific differences (HDF5 source, BLAS, cmake, compiler flags) are handled internally. + +--- + +## Kaiju + +### Hardware + +| Resource | Specification | +|----------|--------------| +| Head node | 1× Intel Xeon Silver 4210R, 40 CPUs @ 2.4 GHz | +| Compute nodes | 8× Intel Xeon Gold 6230R, 104 CPUs @ 2.1 GHz each | +| Shared storage | `/opt/cluster` via NFS | +| Scheduler | Slurm with Munge authentication | +| MPI | Spack `openmpi@4.1.6` | + +### Prerequisites + +Spack must have OpenMPI available: + +```bash +spack find openmpi +# openmpi@4.1.6 +``` + +Pixi must be installed in your user space: + +```bash +pixi --version # check +curl -fsSL https://pixi.sh/install.sh | bash # install if missing +``` + +### Installation + +Copy `kaiju_install_user.sh` (per-user) or `kaiju_install_shared.sh` (admin) from [uw3-hpc-baremetal-install-run](https://github.com/jcgraciosa/uw3-hpc-baremetal-install-run) to a convenient location, edit the variables at the top, then: + +```bash +source kaiju_install_user.sh install +``` + +| Step | Function | Time | +|------|----------|------| +| Install pixi | `setup_pixi` | ~1 min | +| Clone Underworld3 | `clone_uw3` | ~1 min | +| Install pixi hpc env | `install_pixi_env` | ~3 min | +| Build mpi4py | `install_mpi4py` | ~2 min | +| Build PETSc + AMR tools | `install_petsc` | ~1 hour | +| Build h5py | `install_h5py` | ~2 min | +| Install Underworld3 | `install_uw3` | ~2 min | +| Verify | `verify_install` | ~1 min | + +Individual steps can be run after sourcing: + +```bash +source kaiju_install_user.sh +install_petsc # run just one step +``` + +#### What PETSc builds on Kaiju + +- **AMR tools**: mmg, parmmg, pragmatic, eigen, bison +- **Solvers**: mumps, scalapack, slepc +- **Partitioners**: metis, parmetis, ptscotch +- **MPI**: Spack's OpenMPI (`--with-mpi-dir`) +- **HDF5**: downloaded (not in 
Spack) +- **BLAS/LAPACK**: fblaslapack (no guaranteed system BLAS on Rocky Linux 8) +- **cmake**: downloaded (not in Spack) +- **petsc4py**: built during configure (`--with-petsc4py=1`) + +### Activating the Environment + +Source the install script at the start of every session or job: + +```bash +source kaiju_install_user.sh +``` + +This loads `spack openmpi@4.1.6`, activates the pixi `hpc` environment via `pixi shell-hook`, and sets `PETSC_DIR`, `PETSC_ARCH`, and `PYTHONPATH`. + +> `pixi shell-hook` is used instead of `pixi shell` because it activates the environment in the current shell without spawning a new one — required for Slurm batch jobs. + +### Running with Slurm + +Use `kaiju_slurm_job.sh` from [uw3-hpc-baremetal-install-run](https://github.com/jcgraciosa/uw3-hpc-baremetal-install-run). Edit the variables at the top, then: + +```bash +sbatch kaiju_slurm_job.sh +``` + +`--mpi=pmix` is **required** on Kaiju (Spack has `pmix@5.0.3`): + +```bash +srun --mpi=pmix python3 my_model.py +``` + +Monitor progress: + +```bash +squeue -u $USER +tail -f uw3_.out +``` + +### Shared Installation (Admin) + +Deploys to `/opt/cluster/software/underworld3/` so all users access it via Environment Modules: + +```bash +source kaiju_install_shared.sh install +module load underworld3/development-12Mar26 +``` + +The shared script adds `fix_permissions()` and `install_modulefile()` on top of the per-user steps. The TCL modulefile hardcodes the Spack OpenMPI and pixi env paths — if Spack is rebuilt (hash changes), update `mpi_root` in `modulefiles/underworld3/development.tcl`. + +### Troubleshooting (Kaiju) + +#### `import underworld3` fails on compute nodes + +Source the install script inside the job script (not the login shell) so all paths propagate to compute nodes. The `kaiju_slurm_job.sh` template does this correctly. + +#### PETSc needs rebuilding after Spack module update + +PETSc links against Spack's OpenMPI at build time. 
If `openmpi@4.1.6` is reinstalled: + +```bash +source kaiju_install_user.sh +rm -rf ~/uw3-installation/underworld3/petsc-custom/petsc +install_petsc +install_h5py +``` + +#### h5py replaces source-built mpi4py + +`pip install h5py` without `--no-deps` silently replaces the source-built mpi4py with a wheel linked to a different MPI. The install script uses `--no-deps` to prevent this. If mpi4py was accidentally replaced: + +```bash +pip install --no-binary :all: --no-cache-dir --force-reinstall "mpi4py>=4,<5" +``` + +#### PARMMG configure failure + +pixi's conda linker requires transitive shared library dependencies to be explicitly linked. `libmmg.so` built with SCOTCH support causes PARMMG's link test to fail. This is fixed in `build-petsc.sh` by building MMG without SCOTCH (`-DUSE_SCOTCH=OFF`). + +--- + +## Gadi + +### Hardware + +| Resource | Specification | +|----------|--------------| +| System | NCI Gadi (CentOS, Lustre filesystem) | +| Compute | Multiple node types (normal, hugemem, gpuvolta) | +| Shared storage | `/g/data` (project quota), `/scratch` (temporary) | +| Scheduler | PBS Pro | +| MPI | Module `openmpi/4.1.7` | + +### Prerequisites + +The following Gadi modules must be available: + +```bash +module load openmpi/4.1.7 hdf5/1.12.2p gmsh/4.13.1 cmake/3.31.6 +``` + +Pixi must be installed: + +```bash +pixi --version # check +curl -fsSL https://pixi.sh/install.sh | bash # install if missing +``` + +> **Inode quota:** Gadi's `/g/data` has strict inode limits. PETSc (which creates many files during build) may need to be built on `/scratch` and symlinked from `/g/data`. The install script handles this if you set `PETSC_DIR` to a `/scratch` path. 
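The `/scratch` build with a `/g/data` symlink described above can be sketched as follows. The paths are illustrative stand-ins (the `BASE` default makes the commands safe to run anywhere); on Gadi, substitute your real `/scratch` and `/g/data` project directories:

```bash
# Illustrative layout sketch: keep PETSc's many build files on /scratch,
# leave only a single symlink (one inode) on quota-limited /g/data.
BASE="${BASE:-/tmp/uw3-petsc-demo}"       # stand-in; real roots would be
SCRATCH_PETSC="${BASE}/scratch/petsc"     #   /scratch/<project>/$USER/petsc
GDATA_LINK="${BASE}/gdata/petsc"          #   /g/data/<project>/$USER/petsc

mkdir -p "${SCRATCH_PETSC}" "$(dirname "${GDATA_LINK}")"
ln -sfn "${SCRATCH_PETSC}" "${GDATA_LINK}"   # one inode on /g/data
export PETSC_DIR="${SCRATCH_PETSC}"          # install script builds here
```

Note that `/scratch` on Gadi is purged of idle files, so a PETSc tree kept there may need rebuilding after long inactivity.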
+
+### Installation
+
+Copy `gadi_install_user.sh` (per-user) or `gadi_install_shared.sh` (admin) from [uw3-hpc-baremetal-install-run](https://github.com/jcgraciosa/uw3-hpc-baremetal-install-run) to a convenient location, edit the variables at the top, then:
+
+```bash
+source gadi_install_user.sh install   # or gadi_install_shared.sh install (admin)
+```
+
+| Step | Function | Time |
+|------|----------|------|
+| Install pixi | `setup_pixi` | ~1 min |
+| Clone Underworld3 | `clone_uw3` | ~1 min |
+| Install pixi hpc env | `install_pixi_env` | ~3 min |
+| Build mpi4py | `install_mpi4py` | ~2 min |
+| Build PETSc + AMR tools | `install_petsc` | ~1 hour |
+| Build h5py | `install_h5py` | ~2 min |
+| Install Underworld3 | `install_uw3` | ~2 min |
+| Verify | `verify_install` | ~1 min |
+
+#### What PETSc builds on Gadi
+
+- **AMR tools**: mmg, parmmg, pragmatic, eigen
+- **Solvers**: mumps, scalapack, slepc, superlu, superlu_dist, hypre
+- **Partitioners**: metis, parmetis, ptscotch
+- **MPI**: Gadi's OpenMPI module (`--with-cc/cxx/fc`)
+- **HDF5**: Gadi's `hdf5/1.12.2p` module (`--with-hdf5-dir`)
+- **BLAS/LAPACK**: fblaslapack (auto-detection fails due to compiler env manipulation)
+- **petsc4py**: built during configure (`--with-petsc4py=1`)
+
+### Activating the Environment
+
+Source the install script at the start of every session or job:
+
+```bash
+source gadi_install_shared.sh
+```
+
+This loads Gadi modules, activates the pixi `hpc` environment via `pixi shell-hook`, and sets `PETSC_DIR`, `PETSC_ARCH`, and `PYTHONPATH`. Gadi's HDF5 lib dir is prepended to `LD_LIBRARY_PATH` to ensure the parallel HDF5 1.12.2p is loaded at runtime (not conda's serial HDF5 1.14).
+
+### Running with PBS
+
+Use `gadi_pbs_job.sh` from [uw3-hpc-baremetal-install-run](https://github.com/jcgraciosa/uw3-hpc-baremetal-install-run).
Edit the variables at the top, then: + +```bash +qsub gadi_pbs_job.sh +``` + +Monitor progress: + +```bash +qstat -u $USER +tail -f .o* +``` + +### Shared Installation (Admin) + +Deploys to `/g/data/m18/software/uw3-pixi/` so all m18 project members can use it: + +```bash +source gadi_install_shared.sh install +``` + +The install script is then copied to the install directory so users can source it directly: + +```bash +source /g/data/m18/software/uw3-pixi/gadi_install_shared.sh +``` + +### Troubleshooting (Gadi) + +#### h5py undefined symbol: H5E_BADATOM_g + +The pixi `hpc` env ships a serial HDF5 1.14 (transitive conda-forge dependency). If h5py links against it instead of Gadi's parallel HDF5 1.12.2p, this symbol (removed in 1.14) is missing at runtime. The install script fixes this by temporarily hiding conda's HDF5 during the h5py build so meson can only find Gadi's. If you see this error, re-run: + +```bash +source gadi_install_shared.sh +install_h5py +``` + +#### Compiler interference during PETSc build + +The pixi `hpc` env ships a full conda toolchain (`x86_64-conda-linux-gnu-*`) that interferes with Gadi's OpenMPI wrappers. `build-petsc.sh` handles this via `setup_gadi_build_env()`, which unsets conda compiler variables and forces the MPI wrappers to use system compilers (`/usr/bin/gcc`). + +#### Fortran MPI library not found + +Gadi ships compiler-tagged Fortran MPI libraries (`libmpi_usempif08_GNU.so`) rather than the standard untagged names. `build-petsc.sh` creates symlinks in `petsc-custom/mpi-gadi-gnu-libs/` to bridge this. + +#### `import underworld3` fails in PBS job + +Ensure the install script is sourced inside the job script (not just in the login shell). The `gadi_pbs_job.sh` template does this correctly. + +--- + +## Rebuilding Underworld3 after source changes + +```bash +source kaiju_install_user.sh # or gadi_install_shared.sh +cd +git pull +pip install -e . 
+``` + +--- + +## Related + +- [Development Setup](development-setup.md) — local development with pixi +- [Branching Strategy](branching-strategy.md) — git workflow +- [Parallel Computing](../../advanced/parallel-computing.md) — writing parallel-safe UW3 code diff --git a/docs/developer/guides/kaiju-cluster-setup.md b/docs/developer/guides/kaiju-cluster-setup.md deleted file mode 100644 index 53ef20a0..00000000 --- a/docs/developer/guides/kaiju-cluster-setup.md +++ /dev/null @@ -1,302 +0,0 @@ -# Kaiju Cluster Setup - -This guide covers installing and running Underworld3 on the **Kaiju** cluster — a Rocky Linux 8.10 HPC system using Spack for module management and Slurm for job scheduling. - -Python packages are managed by **pixi** (the same tool used for local development). MPI-dependent packages — `mpi4py`, PETSc+AMR tools, `petsc4py`, and `h5py` — are built from source against Spack's OpenMPI to ensure compatibility with Slurm's parallel interconnect. - ---- - -## Hardware Overview - -| Resource | Specification | -|----------|--------------| -| Head node | 1× Intel Xeon Silver 4210R, 40 CPUs @ 2.4 GHz | -| Compute nodes | 8× Intel Xeon Gold 6230R, 104 CPUs @ 2.1 GHz each | -| Shared storage | `/opt/cluster` via NFS (cluster-wide) | -| Scheduler | Slurm with Munge authentication | - ---- - -## Why pixi + spack? - -Pixi manages the Python environment consistently with the developer's local machine (same `pixi.toml`, same package versions). Spack provides the cluster's OpenMPI, which is what Slurm uses for inter-node communication. - -The key constraint is that **anything linked against MPI must use the same MPI as Slurm**. This means `mpi4py`, `h5py`, PETSc, and `petsc4py` are built from source against Spack's OpenMPI — not from conda-forge (which bundles MPICH). - -``` -pixi hpc env → Python 3.12, sympy, scipy, pint, pydantic, ... 
(conda-forge, no MPI) -spack → openmpi@4.1.6 (cluster MPI) -source build → mpi4py, PETSc+AMR+petsc4py, h5py (linked to spack MPI) -``` - ---- - -## Prerequisites - -Spack must have OpenMPI available: - -```bash -spack find openmpi -# openmpi@4.1.6 -``` - -Pixi must be installed in your user space (no root needed): - -```bash -# Check if already installed -pixi --version - -# Install if missing -curl -fsSL https://pixi.sh/install.sh | bash -``` - ---- - -## Installation - -Use the install script at `uw3_install_kaiju_amr.sh` from the [kaiju-admin-notes](https://github.com/jcgraciosa/kaiju-admin-notes) repo. - -### Step 1: Edit configuration - -Open the script and set the variables at the top: - -```bash -SPACK_MPI_VERSION="openmpi@4.1.6" # Spack MPI module to load -INSTALL_PATH="${HOME}/uw3-installation" # Root directory for everything -UW3_BRANCH="development" # UW3 git branch -``` - -### Step 2: Run the full install - -```bash -source uw3_install_kaiju_amr.sh install -``` - -This runs the following steps in order: - -| Step | Function | Time | -|------|----------|------| -| Install pixi | `setup_pixi` | ~1 min | -| Clone Underworld3 | `clone_uw3` | ~1 min | -| Install pixi hpc env | `install_pixi_env` | ~3 min | -| Build mpi4py from source | `install_mpi4py` | ~2 min | -| Build PETSc + AMR tools | `install_petsc` | ~1 hour | -| Build MPI-enabled h5py | `install_h5py` | ~2 min | -| Install Underworld3 | `install_uw3` | ~2 min | -| Verify | `verify_install` | ~1 min | - -You can also run individual steps after sourcing: - -```bash -source uw3_install_kaiju_amr.sh -install_petsc # run just one step -``` - -### What PETSc builds - -PETSc is compiled from source (`petsc-custom/build-petsc.sh` with `UW_CLUSTER=kaiju`) with: - -- **AMR tools**: mmg, parmmg, pragmatic, eigen, bison -- **Solvers**: mumps, scalapack, slepc -- **Partitioners**: metis, parmetis, ptscotch -- **MPI**: Spack's OpenMPI (`--with-mpi-dir`) -- **HDF5**: downloaded and built with MPI support -- 
**BLAS/LAPACK**: fblaslapack (Rocky Linux 8 has no guaranteed system BLAS) -- **cmake**: downloaded (not in Spack) -- **petsc4py**: built during configure (`--with-petsc4py=1`) - ---- - -## Activating the Environment - -In every new session (interactive or job), source the install script: - -```bash -source ~/install_scripts/uw3_install_kaiju_amr.sh -``` - -This: -1. Loads `spack openmpi@4.1.6` -2. Activates the pixi `hpc` environment via `pixi shell-hook` -3. Sets `PETSC_DIR`, `PETSC_ARCH`, and `PYTHONPATH` for petsc4py -4. Sets `PMIX_MCA_psec=native` and `OMPI_MCA_btl_tcp_if_include=eno1` - -{note} -`pixi shell-hook` is used instead of `pixi shell` because it activates the environment in the current shell without spawning a new one. This is required for Slurm batch jobs. -{/note} - ---- - -## Running with Slurm - -Two job script templates are available in the [kaiju-admin-notes](https://github.com/jcgraciosa/kaiju-admin-notes) repo: - -| Script | Use when | -|--------|----------| -| `uw3_slurm_job.sh` | Per-user install (sources `uw3_install_kaiju_amr.sh`) | -| `uw3_slurm_job_shared.sh` | Shared install (`module load underworld3/...`) | - -### Submitting a job - -```bash -sbatch uw3_slurm_job.sh # per-user install -sbatch uw3_slurm_job_shared.sh # shared install -``` - -Monitor progress: - -```bash -squeue -u $USER -tail -f uw3_.out -``` - -### The `srun` invocation - -`--mpi=pmix` is **required** on Kaiju (Spack has `pmix@5.0.3`): - -```bash -srun --mpi=pmix python3 my_model.py -``` - -### Scaling examples - -```bash -# 1 node, 30 ranks -sbatch --nodes=1 --ntasks-per-node=30 uw3_slurm_job.sh - -# 4 nodes, 120 ranks -sbatch --nodes=4 --ntasks-per-node=30 uw3_slurm_job.sh -``` - ---- - -## Shared Installation (Admin) - -A system-wide installation can be deployed to `/opt/cluster/software/underworld3/` so all users access it via Environment Modules: - -```bash -module load underworld3/development-12Mar26 -``` - -Run as an admin with write access to 
`/opt/cluster/software`: - -```bash -source uw3_install_kaiju_shared.sh install -``` - -This script is identical to the per-user script except: -- `INSTALL_PATH=/opt/cluster/software` -- Adds `fix_permissions()` — sets world-readable permissions after install -- Adds `install_modulefile()` — copies the TCL modulefile with a date-stamped name to `/opt/cluster/modulefiles/underworld3/` - -The modulefile (`modulefiles/underworld3/development.tcl`) hardcodes the spack OpenMPI and pixi env paths. If spack is rebuilt (hash changes), update `mpi_root` in the modulefile. - -### Slurm job script (shared install) - -Users with the shared install should use `uw3_slurm_job_shared.sh`: - -```bash -# Edit UW3_MODULE and SCRIPT at the top, then: -sbatch uw3_slurm_job_shared.sh -``` - -The key difference from the per-user job script is environment setup: - -```bash -# Shared install: load module -module load underworld3/development-12Mar26 - -# Per-user install: source install script -source ~/install_scripts/uw3_install_kaiju_amr.sh -``` - ---- - -## Troubleshooting - -### `import underworld3` fails on compute nodes - -Sourcing the install script in the job script (not the login shell) ensures all paths propagate to compute nodes. The `uw3_slurm_job.sh` template does this correctly. - -### h5py HDF5 version mismatch - -h5py must be built against the same HDF5 that PETSc built. If you see HDF5 errors, rebuild: - -```bash -source uw3_install_kaiju_amr.sh -install_h5py -``` - -### PETSc needs rebuilding after Spack module update - -PETSc links against Spack's OpenMPI at build time. If `openmpi@4.1.6` is reinstalled or updated, rebuild PETSc: - -```bash -source uw3_install_kaiju_amr.sh -rm -rf ~/uw3-installation/underworld3/petsc-custom/petsc -install_petsc -install_h5py -``` - -### h5py replaces source-built mpi4py - -`pip install h5py` without `--no-deps` silently replaces the source-built mpi4py (spack OpenMPI) with a pre-built wheel linked to a different MPI. 
Always use `--no-deps` when installing h5py. The install script handles this correctly. - -If mpi4py was accidentally replaced, rebuild it from source: -```bash -source uw3_install_kaiju_amr.sh -pip install --no-binary :all: --no-cache-dir --force-reinstall "mpi4py>=4,<5" -``` - -Verify it links to spack OpenMPI: -```bash -ldd $(python3 -c "import mpi4py; print(mpi4py.__file__.replace('__init__.py',''))") \ - MPI*.so | grep mpi -# Should show: libmpi.so.40 => /opt/cluster/spack/.../openmpi-4.1.6-.../lib/libmpi.so.40 -``` - -### numpy ABI mismatch after h5py install - -If numpy is upgraded after petsc4py is compiled, `import petsc4py` fails with: -``` -ValueError: numpy.dtype size changed, may indicate binary incompatibility. -``` - -Fix: restore the numpy version used during the PETSc build, then rebuild h5py: -```bash -pip install --force-reinstall "numpy==1.26.4" -CC=mpicc HDF5_MPI="ON" HDF5_DIR="${PETSC_DIR}/${PETSC_ARCH}" \ - pip install --no-binary=h5py --no-cache-dir --force-reinstall --no-deps h5py -``` - -### PARMMG configure failure (pixi ld + spack transitive deps) - -pixi's conda linker (`ld` 14.x) requires transitive shared library dependencies to be explicitly linked. `libmmg.so` built with SCOTCH support causes PARMMG's `MMG_WORKS` link test to fail because `libscotch.so` is not explicitly passed. This is fixed in `petsc-custom/build-petsc-kaiju.sh` by building MMG without SCOTCH (`-DUSE_SCOTCH=OFF`). PARMMG uses ptscotch separately for parallel partitioning, which is unaffected. - -### Checking what's installed - -```bash -source uw3_install_kaiju_amr.sh -verify_install -``` - ---- - -## Rebuilding Underworld3 after source changes - -After pulling new UW3 code: - -```bash -source uw3_install_kaiju_amr.sh -cd ~/uw3-installation/underworld3 -git pull -pip install -e . 
-``` - ---- - -## Related - -- [Development Setup](development-setup.md) — local development with pixi -- [Branching Strategy](branching-strategy.md) — git workflow -- [Parallel Computing](../../advanced/parallel-computing.md) — writing parallel-safe UW3 code