This page discusses the platforms on which the xSDK 0.8.0 release has been tested and contains general instructions for building, as well as more specific instructions for select high-end computing systems. See also details about obtaining the xSDK.
As more information becomes available about building the xSDK 0.8.0 release on different platforms, it will be posted here. Check back for updates.
xSDK 0.8.0 general build instructions
1. After cloning the Spack git repository, set up the Spack environment.
# For bash users
$ export SPACK_ROOT=/path/to/spack
$ . $SPACK_ROOT/share/spack/setup-env.sh

# For tcsh or csh users (note you must set SPACK_ROOT)
$ setenv SPACK_ROOT /path/to/spack
$ source $SPACK_ROOT/share/spack/setup-env.csh
1.1 Make sure proxy settings are set, if needed.
If a web proxy is required for internet access on the install machine, set up the proxy settings appropriately. Otherwise, Spack will fail to fetch the packages you want to install.
# For bash users
$ export http_proxy=<your proxy URL>
$ export https_proxy=<your proxy URL>

# For tcsh or csh users
$ setenv http_proxy <your proxy URL>
$ setenv https_proxy <your proxy URL>
2. Set up Spack compilers.
spack compiler find
Spack's compiler configuration is stored in $HOME/.spack/$UNAME/compilers.yaml (where $UNAME is the platform name, e.g. linux) and can be checked with
spack compiler list
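For reference, a compilers.yaml entry recorded by `spack compiler find` has the following shape. This is an illustrative sketch only; the compiler version, paths, and OS shown here are assumptions, not values from the tested systems.

```yaml
compilers:
- compiler:
    spec: gcc@12.2.1
    paths:
      cc: /usr/bin/gcc
      cxx: /usr/bin/g++
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    # operating_system/target are detected automatically by Spack
    operating_system: fedora37
    target: x86_64
    modules: []
```

Entries can be edited by hand, for example to point `fc`/`f77` at a Fortran compiler that `spack compiler find` did not detect.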
3. Edit/update the packages.yaml file to specify any system/build tools needed for the xSDK installation.
Although Spack can install the required build tools itself, it can be convenient to use tools that are already installed on the system. Such preinstalled packages can be specified to Spack in the $HOME/.spack/packages.yaml config file. The following is an example from a Linux build.
packages:
  cmake:
    externals:
    - spec: cmake@3.24.1
      prefix: /usr
    buildable: False
  ninja:
    externals:
    - spec: ninja@1.10.2
      prefix: /usr
    buildable: False
  python:
    buildable: False
    externals:
    - spec: python@3.11.0%gcc@12.2.1
      prefix: /usr
  perl:
    externals:
    - spec: perl@5.36.0
      prefix: /usr
    buildable: False
  mpich:
    externals:
    - spec: mpich@4.0.1%gcc@12.2.1
      prefix: /software/mpich-4.0.1
    buildable: False
  all:
    providers:
      mpi: [mpich]
      blas: [netlib-lapack]
      lapack: [netlib-lapack]
4. Install xSDK
After the edit, xSDK packages and external dependencies can be installed with a single command:
spack install xsdk@0.8.0
Note: one can also install the xSDK packages with CUDA enabled on NVIDIA GPUs:

spack install xsdk@0.8.0+cuda cuda_arch=70   # cuda_arch=70 for V100, cuda_arch=80 for A100

or with ROCm enabled on AMD GPUs:

spack install xsdk@0.8.0+rocm amdgpu_target=gfx90a   # gfx90a for MI-250
5. Install environment modules.
Optionally, one can install the environment-modules package to access the installed xSDK packages as modules.
spack install environment-modules
After installation, module support can be enabled with one of the following commands.

# For bash users
$ source `spack location -i environment-modules`/init/bash

# For tcsh or csh users
$ source `spack location -i environment-modules`/init/tcsh
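The names and contents of the generated module files can be customized through Spack's modules.yaml config file. The following sketch is an assumption for illustration (the hash_length and conflict settings are examples, not part of the xSDK instructions); it drops the trailing hash from module names and marks modules of the same package as conflicting.

```yaml
modules:
  default:
    tcl:
      hash_length: 0
      all:
        conflict: ['{name}']
```

After editing modules.yaml, regenerate the module files with `spack module tcl refresh`.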
6. Load xSDK module and its sub-modules.
Now you can load the xSDK environment with Spack's load command (the dependencies are loaded automatically):
spack load xsdk
Then, module avail generates output such as the following:
alquimia-1.0.10-gcc-11.3.0-uzrhmsk amrex-22.09-gcc-11.3.0-5eqj6ef arborx-1.2-gcc-11.3.0-mmqrqht arpack-ng-3.8.0-gcc-11.3.0-7b7izxd autoconf-2.69-gcc-11.3.0-jnu6gsc autoconf-archive-2022.02.11-gcc-11.3.0-pz4vouj automake-1.16.5-gcc-11.3.0-sl45g6p berkeley-db-18.1.40-gcc-11.3.0-yihjhau blaspp-2022.07.00-gcc-11.3.0-ukfoeov blt-0.4.1-gcc-11.3.0-opfjoid boost-1.79.0-gcc-11.3.0-wgi7ruq butterflypack-2.2.2-gcc-11.3.0-xtawt6y bzip2-1.0.8-gcc-11.3.0-u3w3mwl ca-certificates-mozilla-2022-10-11-gcc-11.3.0-fqxa4ht camp-0.2.3-gcc-11.3.0-ybwhcg7 cmake-3.24.3-gcc-11.3.0-y22v2r7 cub-1.16.0-gcc-11.3.0-i2kcclp cuda-11.8.0-gcc-11.3.0-wxn46rm datatransferkit-3.1-rc3-gcc-11.3.0-g5orlrc dealii-9.4.0-gcc-11.3.0-tjkoncd diffutils-3.8-gcc-11.3.0-glvyd6h eigen-3.4.0-gcc-11.3.0-7du6qcz environment-modules-5.2.0-gcc-11.3.0-3nzidqf exago-1.5.0-gcc-11.3.0-f6i5756 expat-2.4.8-gcc-11.3.0-egcuuyh fftw-3.3.10-gcc-11.3.0-af4rsqu findutils-4.9.0-gcc-11.3.0-czfjr7j gdbm-1.23-gcc-11.3.0-wemxf45 gettext-0.21.1-gcc-11.3.0-grpswak ginkgo-1.5.0-gcc-11.3.0-rmnboln gmp-6.2.1-gcc-11.3.0-nqulgga gsl-2.7.1-gcc-11.3.0-p36i2j3 hdf5-1.12.2-gcc-11.3.0-rrx2ali heffte-2.3.0-gcc-11.3.0-fko5oiv hiop-0.7.1-gcc-11.3.0-egy3td4 hwloc-2.8.0-gcc-11.3.0-o2uecob hypre-2.26.0-gcc-11.3.0-czh6dum intel-tbb-2020.3-gcc-11.3.0-bja6z5t kokkos-3.7.00-gcc-11.3.0-ylleey6 lapackpp-2022.07.00-gcc-11.3.0-qzsiud3 libbsd-0.11.5-gcc-11.3.0-sgyibnm libfabric-1.16.1-gcc-11.3.0-j7bcgjq libffi-3.4.2-gcc-11.3.0-4jekx2t libiconv-1.16-gcc-11.3.0-otjckeb libmd-1.0.4-gcc-11.3.0-ncomhro libpciaccess-0.16-gcc-11.3.0-ac6plwg libsigsegv-2.13-gcc-11.3.0-ediolnj libtool-2.4.7-gcc-11.3.0-s2izwu6 libxcrypt-4.4.31-gcc-11.3.0-lmzlvpb libxml2-2.10.1-gcc-11.3.0-fiqvxuq m4-1.4.19-gcc-11.3.0-kljksyp magma-2.7.0-gcc-11.3.0-s3x3e2k metis-5.1.0-gcc-11.3.0-nevefnv mfem-4.5.0-gcc-11.3.0-57p544s mpfr-4.1.0-gcc-11.3.0-xfna6hg mpich-4.0.2-gcc-11.3.0-f6s2v46 muparser-2.2.6.1-gcc-11.3.0-m4xuaty ncurses-6.3-gcc-11.3.0-7oaa5br netlib-scalapack-2.2.0-gcc-11.3.0-tinafjm 
ninja-1.11.1-gcc-11.3.0-z2ldst5 oce-0.18.3-gcc-11.3.0-aq6xz7w omega-h-9.34.13-gcc-11.3.0-ujitzxz openblas-0.3.21-gcc-11.3.0-dsnjuv4 openssl-1.1.1s-gcc-11.3.0-bav6wva p4est-2.8-gcc-11.3.0-62lurag parmetis-4.0.3-gcc-11.3.0-pzf3dlu perl-5.36.0-gcc-11.3.0-gk3zfll petsc-3.18.1-gcc-11.3.0-b5vadvy pflotran-4.0.1-gcc-11.3.0-jopigbe phist-1.11.2-gcc-11.3.0-ohw47rn pigz-2.7-gcc-11.3.0-rqwys4w pkgconf-1.8.0-gcc-11.3.0-a4t6fj4 plasma-22.9.29-gcc-11.3.0-tn7mxa4 precice-2.5.0-gcc-11.3.0-764ahpd pumi-2.2.7-gcc-11.3.0-crwo6vn py-attrs-22.1.0-gcc-11.3.0-rsgdt4l py-cython-0.29.32-gcc-11.3.0-w7wb77p py-flit-core-3.7.1-gcc-11.3.0-t5fsnmo py-iniconfig-1.1.1-gcc-11.3.0-a3gdde7 py-libensemble-0.9.3-gcc-11.3.0-jpf2ziz py-mpi4py-3.1.4-gcc-11.3.0-m3pvf5y py-numpy-1.23.4-gcc-11.3.0-f5b6th5 py-packaging-21.3-gcc-11.3.0-6vzauhp py-petsc4py-3.18.1-gcc-11.3.0-sr6iwkj py-pip-22.2.2-gcc-11.3.0-mwtebo5 py-pluggy-1.0.0-gcc-11.3.0-f4adzot py-psutil-5.9.2-gcc-11.3.0-7ub4mja py-py-1.11.0-gcc-11.3.0-tkzfljm py-pyparsing-3.0.9-gcc-11.3.0-gc2qt7i py-pytest-7.1.3-gcc-11.3.0-i223yku py-setuptools-59.4.0-gcc-11.3.0-72arxxe py-setuptools-scm-7.0.5-gcc-11.3.0-33nxwnv py-tomli-2.0.1-gcc-11.3.0-lsnjr6z py-typing-extensions-4.3.0-gcc-11.3.0-mmovvry py-wheel-0.37.1-gcc-11.3.0-dddeb6s python-3.10.8-gcc-11.3.0-jljaewt raja-0.14.0-gcc-11.3.0-6aq3zvi readline-8.1.2-gcc-11.3.0-fusnllm sed-4.8-gcc-11.3.0-xfcenqn slate-2022.07.00-gcc-11.3.0-v4u526b slepc-3.18.1-gcc-11.3.0-rvcpb6z sqlite-3.39.4-gcc-11.3.0-tt3qk4r strumpack-7.0.1-gcc-11.3.0-fdf7ac6 suite-sparse-5.13.0-gcc-11.3.0-ay447vc sundials-6.4.1-gcc-11.3.0-een2zhz superlu-dist-8.1.2-gcc-11.3.0-dx34bnh tar-1.34-gcc-11.3.0-qaf7bhh tasmanian-7.9-gcc-11.3.0-ya7vntk tcl-8.6.12-gcc-11.3.0-x5w6out texinfo-6.5-gcc-11.3.0-hzp7pg5 trilinos-13.4.1-gcc-11.3.0-zp3ay7p umpire-6.0.0-gcc-11.3.0-g2kaawi util-linux-uuid-2.38.1-gcc-11.3.0-uoyfzct util-macros-1.19.3-gcc-11.3.0-74bv24h xsdk-0.8.0-gcc-11.3.0-doq7ccr xz-5.2.7-gcc-11.3.0-mplymcr yaksa-0.2-gcc-11.3.0-wnrlnan 
zfp-0.5.5-gcc-11.3.0-vjm7eqp zlib-1.2.13-gcc-11.3.0-a46ggan zstd-1.5.2-gcc-11.3.0-opdjgql
xSDK 0.8.0 platform testing
xSDK 0.8.0 has been built and tested on a regular basis on a variety of workstations, including:
- darwin-catalina-skylake / apple-clang@12.0.0
- linux-centos7-cascadelake / gcc@9.2.0
- linux-centos7-cascadelake / gcc@9.2.0 [+cuda]
- linux-centos7-cascadelake / intel@19.1.1.217
- linux-fedora37-cortex_a72 / gcc@12.2.1 [ARM-64]
- linux-fedora37-skylake / clang@15.0.4
- linux-fedora37-skylake / oneapi@2022.2.0
- linux-fedora37-skylake / gcc@12.2.1
- linux-ubuntu20.04-cascadelake / gcc@9.4.0
- linux-ubuntu20.04-cascadelake / gcc@10.3.0
- linux-ubuntu20.04-skylake / oneapi@2022.2.0
- linux-ubuntu20.04-skylake / gcc@9.4.0
- linux-ubuntu22.04-zen3 / gcc@11.3.0 [+cuda]
xSDK packages are also tested on key machines at DOE computing facilities: ALCF, NERSC, OLCF, and LLNL.
- ALCF: Polaris: HPE Apollo 6500 Gen10+ with AMD EPYC Milan CPUs, NVIDIA A100 GPUs
- build packages on the front-end node with:
spack env create xsdk_polaris
[spack.lock | spack.yaml]
spack env activate -p xsdk_polaris
module load PrgEnv-gnu
# Spec(in spack.yaml): xsdk@0.8.0 +cuda cuda_arch=80
spack install
- Relevant spack config files for this build are at:
linux-sles15-zen3 / gcc@11.2.0
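The environment's spack.yaml can be as minimal as the following sketch; only the spec line comes from the build instructions above, the rest is an assumption for illustration.

```yaml
spack:
  specs:
  - xsdk@0.8.0 +cuda cuda_arch=80
  view: false
```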
- NERSC: Perlmutter: an HPE Cray EX with AMD EPYC Milan CPUs, NVIDIA A100 GPUs
- build packages on the compile/front-end node with:
spack install xsdk@0.8.0 +cuda cuda_arch=80 ^kokkos+wrapper+cuda_lambda cuda_arch=80 %gcc@11.2.0
- Relevant .spack config files for this build are at:
linux-sles15-zen3 / gcc@11.2.0
- NERSC: Cori: Cray XC40 with Intel compilers [in Haswell mode]
- build packages on the compile/front-end node with:
spack install xsdk@0.8.0%intel@19.1.2.254~exago ^netlib-lapack ^dealii~oce
- Relevant .spack config files for this build are at:
cray-cnl7-haswell / intel@19.1.2.254
- OLCF: Crusher: HPE Cray with AMD EPYC 3rd gen CPUs, AMD MI-250X GPUs
- Building with GNU compilers on the front-end node with ROCm enabled:

./bin/spack install -j64 xsdk@0.8.0%gcc@11.2.0+rocm amdgpu_target=gfx90a ^netlib-lapack
- Relevant .spack config files for this build are at:
linux-sles15-zen3 / gcc@12.1.0
- OLCF: Summit: a supercomputer whose nodes have two IBM POWER9 sockets and six NVIDIA Volta V100 GPUs connected with NVLink, running RedHat 7 Linux with IBM, GNU, and PGI compilers.
- Building with GCC 8 is possible, but job limits on the login node make only 16 GiB of main memory available to a single user, which causes some xSDK packages to fail to build there.
- Build xSDK (on compute nodes via bsub) with:
spack env create xsdk_summit [spack.lock | spack.yaml]
spack env activate -p xsdk_summit
# Spec (in spack.yaml): xsdk~dealii@0.8.0+cuda cuda_arch=70 ^netlib-lapack ^spectrum-mpi@10.4.0.3 ^hiop+shared ^python@3.8
bsub xsdk08.lsf
- Relevant .spack config files for this build are at:
linux-rhel7-power9le / gcc@8.3.1
- LLNL: Tioga: HPE Cray with AMD Trento CPUs, AMD MI-250X GPUs
- Building with GNU compilers on the front-end node with ROCm enabled:

spack install -j 64 --no-cache --fresh xsdk@0.8.0 +rocm amdgpu_target=gfx90a ^openssl@1.1.1k
- Relevant .spack config files for this build are at:
linux-rhel8-zen3 / gcc@12.1.0
- LLNL: Lassen: IBM POWER9 with IBM and GNU compilers
- Building with the IBM compilers and GCC 8 is possible, but, as on Summit, job limits on the login node make only 16 GiB available to a single user, causing some xSDK packages to fail to build. This can be circumvented by submitting a build job to the batch queue.
- For Lassen, we provide a Spack environment file instead of packages.yaml and compilers.yaml.
- Build xSDK on a Lassen compute node with CUDA enabled:

spack env create xsdk_lassen [spack.lock | spack.yaml]
spack env activate xsdk_lassen
# Spec (in spack.yaml): xsdk@0.8.0%gcc@8.3.1+cuda~dealii~raja cuda_arch=70 ^cuda@11.7.0 ^python@3.9.10
spack install
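A minimal spack.yaml for this environment might look like the following sketch; the spec is taken from the comment above, while the surrounding structure is an assumption for illustration.

```yaml
spack:
  specs:
  - xsdk@0.8.0%gcc@8.3.1 +cuda ~dealii ~raja cuda_arch=70 ^cuda@11.7.0 ^python@3.9.10
  view: false
```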
- The relevant environment files for this build are at:
linux-rhel7-power9le / gcc@8.3.1