Installing xSDK 0.7.0

This page discusses the platforms on which the xSDK 0.7.0 release has been tested and contains general instructions for building, as well as more specific instructions for select high-end computing systems. See also details about obtaining the xSDK.

As more information becomes available for building the xSDK 0.7.0 release on different platforms, that information will be posted here. Check back for updates.

xSDK 0.7.0 general build instructions

1. After cloning the Spack git repo, set up the Spack environment
# For bash users
$ export SPACK_ROOT=/path/to/spack
$ . $SPACK_ROOT/share/spack/setup-env.sh

# For tcsh or csh users (note you must set SPACK_ROOT)
$ setenv SPACK_ROOT /path/to/spack
$ source $SPACK_ROOT/share/spack/setup-env.csh
1.1 Make sure proxy settings are set, if needed.

If a web proxy is required for internet access on the install machine, please set up the proxy settings appropriately. Otherwise, Spack will fail to “fetch” the packages of interest.

# For bash users
$ export http_proxy=<your proxy URL>
$ export https_proxy=<your proxy URL>

# For tcsh or csh users
$ setenv http_proxy <your proxy URL>
$ setenv https_proxy <your proxy URL>
2. Set up Spack compilers
spack compiler find

Spack's compiler configuration is stored in $HOME/.spack/<platform>/compilers.yaml (e.g., linux) and can be checked with

spack compiler list
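For reference, each entry in compilers.yaml has roughly the following shape (the compiler version and paths below are illustrative, not from any particular machine):

```yaml
compilers:
- compiler:
    spec: gcc@8.3.1
    paths:
      cc: /usr/bin/gcc
      cxx: /usr/bin/g++
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    operating_system: rhel7
    target: x86_64
    modules: []
```

Entries can be edited by hand, e.g., to point cc/cxx/fc at wrapper scripts or to add a modules: list on systems that provide compilers via modules.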
3. Edit/update the packages.yaml file to specify any system/build tools needed for the xSDK installation.

Although Spack can install required build tools, it is often convenient to use tools that are already installed on the system. Such preinstalled packages can be specified to Spack in the $HOME/.spack/packages.yaml config file. The following is an example from a Linux/Intel/KNL build.

packages:
  perl:
    externals:
    - spec: perl@5.16.3
      prefix: /usr
    buildable: False
  python:
    externals:
    - spec: python@3.6.6
      prefix: /usr
    buildable: False
  py-numpy:
    externals:
    - spec: py-numpy@1.12.1
      prefix: /usr
    buildable: False
  py-setuptools:
    externals:
    - spec: py-setuptools@39.2.0
      prefix: /usr
    buildable: False
  intel-mpi:
    externals:
    - spec: intel-mpi@18.0.2
      prefix: /homes/intel/18u2
    buildable: False
  intel-mkl:
    externals:
    - spec: intel-mkl@18.0.2
      prefix: /homes/intel/18u2
    buildable: False
  all:
    providers:
      mpi: [intel-mpi]
      blas: [intel-mkl]
      lapack: [intel-mkl]
      mkl: [intel-mkl]
    compiler: [intel@18.0.2]
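Before running the full install, it can help to confirm that Spack actually picks up these externals; spack spec prints the concretized dependency tree without building anything (the -I flag adds install-status markers):

```shell
# Preview the concretized xsdk spec. Externally registered packages
# (perl, python, intel-mpi, intel-mkl, ...) should resolve to their
# system prefixes instead of being scheduled for a fresh build.
spack spec -I xsdk
```
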
4. Install xSDK

After the edit, xSDK packages and external dependencies can be installed with a single command:

 spack install xsdk

Note: One can also install the xSDK packages with CUDA enabled:

 spack install xsdk+cuda
5. Install environment modules.

Optionally, one can install the environment-modules package to access the xSDK packages as modules.

 spack install environment-modules

After installation, module support can be enabled with the following commands.

# For bash users
$ source `spack location -i environment-modules`/init/bash
# For tcsh or csh users
$ source `spack location -i environment-modules`/init/tcsh
6. Load xSDK module and its sub-modules.

Now you can load the xSDK environment with Spack's load command, which also loads all dependencies:

spack load xsdk

Then, module avail generates the following output, for example:

alquimia-1.0.9-apple-clang-12.0.0-cc2bl77 
arborx-1.1-apple-clang-12.0.0-66otto4 
arpack-ng-3.8.0-apple-clang-12.0.0-zleewyb 
autoconf-archive-2019.01.06-apple-clang-12.0.0-ptndtgz 
boost-1.76.0-apple-clang-12.0.0-alsbt3n 
butterflypack-2.0.0-apple-clang-12.0.0-7dowry3 
datatransferkit-3.1-rc2-apple-clang-12.0.0-wpmnitg 
dealii-9.3.2-apple-clang-12.0.0-5uoyre4 
environment-modules-5.0.1-apple-clang-12.0.0-q2nazmx 
expat-2.4.1-apple-clang-12.0.0-lsq37dw 
fftw-3.3.10-apple-clang-12.0.0-35dnr4s 
findutils-4.8.0-apple-clang-12.0.0-7fgvm2c 
gdbm-1.21-apple-clang-12.0.0-54gqk7a 
gettext-0.21-apple-clang-12.0.0-pejsfex 
ginkgo-1.4.0-apple-clang-12.0.0-6jbuuuo 
gmp-6.2.1-apple-clang-12.0.0-c3ts3i4 
gsl-2.7-apple-clang-12.0.0-zhnmpd5 
hdf5-1.10.7-apple-clang-12.0.0-jajon2x 
heffte-2.2.0-apple-clang-12.0.0-usnsdwj 
hwloc-2.6.0-apple-clang-12.0.0-dssxr37 
hypre-2.23.0-apple-clang-12.0.0-tcdy7yj 
intel-tbb-2020.3-apple-clang-12.0.0-lvy3ik2 
kokkos-3.4.01-apple-clang-12.0.0-dtyph3o 
libfabric-1.13.2-apple-clang-12.0.0-c2zcdoc 
libffi-3.3-apple-clang-12.0.0-ge3fho7 
libiconv-1.16-apple-clang-12.0.0-6suh5zo 
libxml2-2.9.12-apple-clang-12.0.0-gi3u7d2 
metis-5.1.0-apple-clang-12.0.0-an4kcff 
mfem-4.3.0-apple-clang-12.0.0-ngafyhm 
mpfr-4.1.0-apple-clang-12.0.0-oztqx2s 
mpich-3.4.2-apple-clang-12.0.0-4hryk5e 
muparser-2.2.6.1-apple-clang-12.0.0-7bz4pmw 
ncurses-6.2-apple-clang-12.0.0-tgzni2m 
netlib-lapack-3.9.1-apple-clang-12.0.0-io4uxyi 
netlib-scalapack-2.1.0-apple-clang-12.0.0-yb7j3hl 
ninja-1.10.2-apple-clang-12.0.0-3thrpor 
oce-0.18.3-apple-clang-12.0.0-ny7yluz 
omega-h-9.34.1-apple-clang-12.0.0-uyxmm5k 
openssl-1.1.1l-apple-clang-12.0.0-kvotqys 
p4est-2.8-apple-clang-12.0.0-xtcb3er 
parmetis-4.0.3-apple-clang-12.0.0-2s2afjq 
petsc-3.16.1-apple-clang-12.0.0-dslmtcz 
pflotran-3.0.2-apple-clang-12.0.0-3xieriw 
phist-1.9.5-apple-clang-12.0.0-d5zgfqo 
pkgconf-1.8.0-apple-clang-12.0.0-5u7kepi 
pumi-2.2.6-apple-clang-12.0.0-4rcjl3w 
py-cython-0.29.24-apple-clang-12.0.0-abe5vbu 
py-libensemble-0.8.0-apple-clang-12.0.0-oenn4rx 
py-mpi4py-3.1.2-apple-clang-12.0.0-ctvcqqg 
py-numpy-1.21.4-apple-clang-12.0.0-scikdkn 
py-petsc4py-3.16.1-apple-clang-12.0.0-b3sdyvs 
py-psutil-5.8.0-apple-clang-12.0.0-yiayzyw 
py-setuptools-58.2.0-apple-clang-12.0.0-yss5ppl 
python-3.8.12-apple-clang-12.0.0-yi6p4j7 
readline-8.1-apple-clang-12.0.0-mqy3neo 
slepc-3.16.0-apple-clang-12.0.0-4tjrpdb 
sqlite-3.36.0-apple-clang-12.0.0-c5t7xbq 
strumpack-6.1.0-apple-clang-12.0.0-zhtx6ap 
suite-sparse-5.10.1-apple-clang-12.0.0-l7gshqj 
sundials-5.8.0-apple-clang-12.0.0-nyjxzov 
superlu-dist-7.1.1-apple-clang-12.0.0-7cfp6mi 
tasmanian-7.7-apple-clang-12.0.0-7eqzagz 
tcl-8.6.11-apple-clang-12.0.0-x24fqob 
texinfo-6.5-apple-clang-12.0.0-2l3yc2j 
trilinos-13.2.0-apple-clang-12.0.0-k7zyvc5 
xsdk-0.7.0-apple-clang-12.0.0-r7unzde 
zfp-0.5.5-apple-clang-12.0.0-bv3jql4 
zlib-1.2.11-apple-clang-12.0.0-7r7zeqq
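Packages from this list can also be loaded individually; for example, to use just PETSc (spack load brings in the package's runtime dependencies as well):

```shell
# Load PETSc and its runtime dependencies from the xSDK installation
spack load petsc

# When finished, unload everything loaded via spack load
spack unload --all
```
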

xSDK 0.7.0 platform testing

xSDK 0.7.0 has been built and tested on a regular basis on various workstations as well as on the high-end systems listed below.

In collaboration with ALCF, NERSC, and OLCF, xSDK packages are tested on key machines at these DOE computing facilities.

  • ALCF: Theta: Cray XC40 with Intel compilers [in KNL mode]
    • Theta front-end nodes use Xeon processors, while the compute nodes use KNL processors. Because of this difference, builds on the front-end nodes are usually done in cross-compile mode, which does not work well with all xSDK packages. Hence the xSDK is built on Theta compute nodes.
    • Build the packages on a compute node by allocating a sufficiently long single-node job (say 24h, if possible) and running the following script:
      #!/bin/sh -x
      module remove darshan
      module remove xalt
      module load cce
      
      export HTTPS_PROXY=theta-proxy.tmi.alcf.anl.gov:3128
      export https_proxy=theta-proxy.tmi.alcf.anl.gov:3128
      export HTTP_PROXY=theta-proxy.tmi.alcf.anl.gov:3128
      export http_proxy=theta-proxy.tmi.alcf.anl.gov:3128
      
      aprun -cc none -n 1 python3 /home/balay/spack/bin/spack install -j24 xsdk target=mic_knl ^boost@1.70.0 ^suite-sparse@5.7.2
    • Relevant .spack config files for this build are at:
      cray-cnl7-mic_knl / intel@19.1.0.166
  • NERSC: Cori: Cray XC40 with Intel compilers [in Haswell mode]
    • Build packages on the compile/front-end node with:
      spack install --fail-fast xsdk ^petsc+batch ^dealii~oce cflags=-L/opt/cray/pe/atp/2.1.3/libApp cxxflags=-L/opt/cray/pe/atp/2.1.3/libApp
      
    • Relevant .spack config files for this build are at:
      cray-cnl7-haswell / intel@19.0.3.199
  • OLCF: Summit: a supercomputer whose nodes feature two sockets of IBM POWER9 processors and 6 NVIDIA Volta V100 GPUs connected with NVLink, running RedHat 7 Linux, with IBM, GNU, and PGI compilers.
    • Building with GCC 7 and GCC 8 is possible, but limits on login-node jobs make only 16 GiB of main memory available to a single user, so some xSDK packages fail to build there. This can be circumvented by submitting a job to the batch queue. An example LSF submission file looks like this:
      #!/bin/bash
      #BSUB -P <project_code>
      #BSUB -W 40
      #BSUB -nnodes 1
      #BSUB -alloc_flags smt4
      #BSUB -J xsdk07
      #BSUB -o xsdk07o.%J
      #BSUB -e xsdk07e.%J
      
      projroot=/ccs/proj/<project_code>/$USER
      
      SPACK_ROOT=${projroot}/spack-xsdk-07
      export SPACK_ROOT
      PATH=${PATH}:${SPACK_ROOT}/bin
      export PATH
      
      module unload darshan-runtime
      module unload spectrum-mpi
      module unload xalt
      module unload fftw
      module unload xl
      module load spectrum-mpi/10.4.0.3-20210112
      # using system-provided GCC 8.3.1-5
      
      cd $SPACK_ROOT
      
      spack install xsdk~dealii
      
      spack install xsdk~dealii+cuda cuda_arch=70 ^spectrum-mpi@10.4.0.3
    • Relevant .spack config files for this build are at:
      linux-rhel7-power9le / gcc@8.3.1
  • OLCF: Spock: Spock contains hardware and software similar to the upcoming Frontier system
  • LLNL:Lassen: IBM POWER9 with IBM, and GNU compilers
    • Building with the IBM compilers and with GCC 7 and 8 is possible, but as on Summit, limits on login-node jobs make only 16 GiB of memory available to a single user, so some xSDK packages fail to build there. This can be circumvented by submitting a job to the batch queue.
    • For Lassen, we provide a Spack environment file instead of packages.yaml and compilers.yaml.
    • Build xSDK on a Lassen compute node with CUDA enabled:
    • spack env create lassen [spack.lock | spack.yaml]
      spack env activate lassen
      spack install
    • The relevant environment files for this build are at:
      https://github.com/xsdk-project/installxSDK/tree/r-0.7.0/platformFiles/lassen
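    • For reference, a minimal spack.yaml for an environment like this has roughly the following shape; this is an illustrative sketch only, and the actual files used for Lassen are in the repository linked above:

```yaml
spack:
  # Illustrative sketch -- see the linked repository for the real files
  specs:
  - xsdk+cuda cuda_arch=70
  view: true
```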