This should also be useful for other clusters where you want to use
components (e.g. MPI, compilers) from the module system.
Start a session for building
si -N 1 -n 16 -c 1 -t 0-02:00:00 # on iris: -C broadwell or -C skylake
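On iris the same request needs the architecture constraint from the comment above, for example (a sketch; -C skylake works the same way):
si -N 1 -n 16 -c 1 -t 0-02:00:00 -C broadwell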
Clone and set up Spack in $HOME - it has much better performance for
small files than $SCRATCH
cd $HOME
git clone --depth=2 https://github.com/spack/spack.git
cd spack
Set up Spack in the environment
source $HOME/spack/share/spack/setup-env.sh
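To confirm the setup worked, check that the spack command is available and detects the machine:
spack --version   # prints the Spack version
spack arch        # prints the detected platform-os-target triple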
Then create $HOME/.spack/packages.yaml
mkdir -p $HOME/.spack
touch $HOME/.spack/packages.yaml
with the following contents:
packages:
  gcc:
    externals:
    - spec: [email protected]+binutils languages:='c,c++,fortran'
      modules:
      - compiler/GCC/13.2.0
      extra_attributes:
        compilers:
          c: /opt/apps/easybuild/systems/aion/rhel810-20250405/2023b/epyc/software/GCCcore/13.2.0/bin/gcc
          cxx: /opt/apps/easybuild/systems/aion/rhel810-20250405/2023b/epyc/software/GCCcore/13.2.0/bin/g++
          fortran: /opt/apps/easybuild/systems/aion/rhel810-20250405/2023b/epyc/software/GCCcore/13.2.0/bin/gfortran
    buildable: false
  binutils:
    externals:
    - spec: [email protected]
      modules:
      - tools/binutils/2.40-GCCcore-13.2.0
    buildable: false
  libevent:
    externals:
    - spec: [email protected]
      modules:
      - lib/libevent/2.1.12-GCCcore-13.2.0
    buildable: false
  libfabric:
    externals:
    - spec: [email protected]
      modules:
      - lib/libfabric/1.19.0-GCCcore-13.2.0
    buildable: false
  libpciaccess:
    externals:
    - spec: [email protected]
      modules:
      - system/libpciaccess/0.17-GCCcore-13.2.0
    buildable: false
  libxml2:
    externals:
    - spec: [email protected]
      modules:
      - lib/libxml2/2.11.5-GCCcore-13.2.0
    buildable: false
  hwloc:
    externals:
    - spec: [email protected]+libxml2
      modules:
      - system/hwloc/2.9.2-GCCcore-13.2.0
    buildable: false
  mpi:
    buildable: false
  munge:
    externals:
    - spec: [email protected]
      prefix: /usr
    buildable: false
  numactl:
    externals:
    - spec: [email protected]
      modules:
      - tools/numactl/2.0.16-GCCcore-13.2.0
    buildable: false
  openmpi:
    variants: fabrics=ofi,ucx schedulers=slurm
    externals:
    - spec: [email protected]
      modules:
      - mpi/OpenMPI/4.1.6-GCC-13.2.0
    buildable: false
  pmix:
    externals:
    - spec: [email protected]
      modules:
      - lib/PMIx/4.2.6-GCCcore-13.2.0
    buildable: false
  slurm:
    externals:
    - spec: [email protected] sysconfdir=/etc/slurm
      prefix: /usr
    buildable: false
  ucx:
    externals:
    - spec: [email protected]
      modules:
      - lib/UCX/1.15.0-GCCcore-13.2.0
    buildable: false
  zlib:
    externals:
    - spec: [email protected]
      modules:
      - lib/zlib/1.2.13-GCCcore-13.2.0
    buildable: false
This tells Spack to use the system-provided GCC, binutils, and OpenMPI (with the native fabrics) instead of building them from source.
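Before building anything, it is worth confirming that Spack resolves these packages to the externals; a minimal check:
spack spec -I openmpi   # [e] in the install-status column marks external packages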
Create an environment and install FEniCS
cd ~
spack env create -d fenicsx-main-20230126/
spack env activate fenicsx-main-20230126/
spack add py-fenics-dolfinx@main fenics-dolfinx@main+adios2 adios2+python petsc+mumps
# Change @main to e.g. @0.7.2 in the above if you want a fixed version.
spack concretize
spack install -j16
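When the install finishes, you can list what ended up in the environment:
spack find   # shows the packages installed in the active environment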
Alternatively, put the same specs directly in the spack.yaml in $SPACK_ENV:
spack:
  # add package specs to the `specs` list
  specs:
  - py-fenics-dolfinx@main
  - fenics-dolfinx@main+adios2
  - petsc+mumps
  - adios2+python
  view: true
  concretizer:
    unify: true
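If you edit spack.yaml directly, re-concretize before installing (with the environment active):
spack concretize -f   # -f forces re-concretization after changing specs
spack install -j16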
The following packages are also commonly used in FEniCS scripts and may be useful:
spack add gmsh+opencascade py-gmsh py-numba py-scipy py-matplotlib
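A quick import check for these additions (a sketch, assuming the installs succeeded and the environment is active):
srun python -c "import gmsh, numba, scipy, matplotlib"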
It is also possible to build a specific version (git ref) of DOLFINx. Note that when pinning to a commit, the hash must be the full hash, and it is best to specify matching git refs on all FEniCSx components (see the sketch after the following spack.yaml).
# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  # add package specs to the `specs` list
  specs:
  - [email protected]=main+adios2
  - [email protected]=main
  - [email protected]=main
  - [email protected]=main
  - [email protected]=main
  - [email protected]=main
  - [email protected]=main
  - petsc+mumps
  - adios2+python
  view: true
  concretizer:
    unify: true
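To pin a component to an exact commit rather than a branch, Spack's git-ref syntax prefixes the full 40-character hash with git.; a sketch with a placeholder hash (substitute a real commit from the relevant repository):
# the hash below is a placeholder for illustration - use a real full commit hash
spack add fenics-dolfinx@git.0000000000000000000000000000000000000000=main+adios2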
It is also possible to build only the C++ layer using
spack add fenics-dolfinx@main+adios2 py-fenics-ffcx@main petsc+mumps
To rebuild FEniCSx from main branches inside an existing environment
spack install --overwrite -j16 fenics-basix py-fenics-basix py-fenics-ffcx fenics-ufcx py-fenics-ufl fenics-dolfinx py-fenics-dolfinx
Quickly test the build with
srun python -c "from mpi4py import MPI; import dolfinx"
See the uni.lu documentation for full details; using the environment should be as
simple as adding the following to your job script, where ... is the name/folder of your environment.
#!/bin/bash -l
source $HOME/spack/share/spack/setup-env.sh
spack env activate ...
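Putting it together, a minimal sketch of a complete batch script (the environment directory and Python script names are placeholders):
#!/bin/bash -l
#SBATCH -N 1
#SBATCH -n 16
#SBATCH -c 1
#SBATCH -t 0-01:00:00

source $HOME/spack/share/spack/setup-env.sh
spack env activate $HOME/fenicsx-main-20230126/   # placeholder: your environment folder
srun python my_solver.py                          # placeholder: your FEniCSx script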
If Python cannot find the adios2 package, the following workaround adds it to PYTHONPATH:
export PYTHONPATH=$(find $SPACK_ENV/.spack-env -type d -name 'site-packages' | grep venv):$PYTHONPATH
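With the workaround in place, this quick check should succeed (run inside an allocation):
srun python -c "import adios2; print(adios2.__file__)"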