Installation

Salvus is made up of several pieces - the core wave propagation code salvus_wave and a strong supporting cast of various Python packages. This page documents how to install everything.

Wave Propagation Code

Dependencies

salvus_wave requires a C++ compiler with C++11 support, Eigen, CMake, MPI, and PETSc (currently pinned to a specific commit; see the git checkout commands below). PETSc has to be compiled with MPI, HDF5, NetCDF, and ExodusII support. If you know how to do this, feel free to install it any way you see fit; otherwise, just follow the instructions below for your platform of choice.
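
If you are unsure whether your toolchain is recent enough, a few quick checks can help. This is just a sketch; it assumes GCC-style tools and an MPICH-style mpicc wrapper (Open MPI uses mpicc --showme instead of mpicc -show).

$ g++ --version   # C++11 support needs GCC >= 4.8 or a reasonably recent Clang
$ cmake --version
$ mpicc -show     # prints the compiler your MPI wrapper actually calls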

We know it works on Linux and OSX. There is probably some way to get it going on Windows but we have never tried.

As is the case with all non-trivial software packages, you may run into some problems installing Salvus. Please don't fret! We're building a troubleshooting section below that reports on some of the most common problems. Check it out for some advice, and as always feel free to send us an email as well.

Installation on Ubuntu 16.04

This will use system libraries wherever possible. It has been tested with Ubuntu 16.04 but will likely work with other versions as well. (You might have to adjust the names of some packages.)

# Make sure system is up-to-date.
$ sudo apt-get update && sudo apt-get upgrade

# Install dependencies.
$ sudo apt-get install git cmake mpich \
  libeigen3-dev libhdf5-mpich-dev libexodusii-dev \
  libopenblas-dev

# Install petsc.
# Make sure you have write access to the prefix directory!
$ sudo mkdir /opt/petsc
$ sudo chown `whoami` /opt/petsc
$ cd /directory/where/code/goes
$ git clone https://bitbucket.org/petsc/petsc.git
$ cd petsc
$ git checkout 0c7851c55cba8e40da5083f79ba1ff846acd45b2
# Production/performance installation. Follow the on-screen instructions
# after the configure step until the installation completes.
$ ./configure --prefix=/opt/petsc \
  --with-cc=mpicc --with-cxx=mpicxx \
  --download-exodusii=yes --download-chaco=yes \
  --with-hdf5=yes --with-hdf5-dir=/usr/lib/x86_64-linux-gnu/hdf5/mpich \
  --with-netcdf=yes --with-exodusii=yes --with-mpi=yes \
  --with-fc=0 \
  --with-blas-lib=/usr/lib/libblas.a --with-lapack-lib=/usr/lib/liblapack.a \
  --with-debugging=0 \
  COPTFLAGS="-O3 -march=native -mtune=native" \
  CXXOPTFLAGS="-O3 -march=native -mtune=native"
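
After the configure step, PETSc prints the exact make commands to run next. They typically look like the following, but the PETSC_ARCH value is machine-specific, so copy the commands from your own configure output rather than from here.

$ make PETSC_DIR=/directory/where/code/goes/petsc PETSC_ARCH=arch-linux2-c-opt all
$ make PETSC_DIR=/directory/where/code/goes/petsc PETSC_ARCH=arch-linux2-c-opt install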

# Install salvus_wave
$ cd /directory/where/code/goes
$ git clone https://gitlab.com/Salvus/salvus_wave.git
$ cd salvus_wave
$ mkdir build
$ cd build
$ cmake .. \
  -DPETSC_DIR=/opt/petsc \
  -DEIGEN_INCLUDE=/usr/include/eigen3 \
  -DMPI_C_COMPILER=mpicc \
  -DMPI_CXX_COMPILER=mpicxx \
  -DHDF5_DIR=/usr/lib/x86_64-linux-gnu/hdf5/mpich
$ make -j  # If your machine has little memory remove the -j.
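
A quick, optional sanity check after the build is to confirm that the binary links against the intended MPI and HDF5 (see the Troubleshooting section for why this matters):

$ ldd ./salvus | grep -iE "mpi|hdf5"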

Installation on OSX

This uses homebrew to install all dependencies.

# Install dependencies.
$ brew install cmake eigen
$ brew install homebrew/science/hdf5 --with-mpi --c++11
$ brew install netcdf

# Install petsc.
# Make sure you have write access to the prefix directory!
$ sudo mkdir /opt/petsc
$ sudo chown `whoami` /opt/petsc
$ cd /directory/where/code/goes
$ git clone https://bitbucket.org/petsc/petsc.git
$ cd petsc
$ git checkout 0c7851c55cba8e40da5083f79ba1ff846acd45b2
# Production/performance installation. Follow the on-screen instructions
# after the configure step until the installation completes.
$ ./configure --prefix=/opt/petsc \
  --download-exodusii=yes --download-chaco=yes \
  --with-netcdf-dir=/usr/local --with-hdf5-dir=/usr/local \
  --with-fc=0 \
  --with-debugging=0 \
  COPTFLAGS="-O3 -march=native -mtune=native" \
  CXXOPTFLAGS="-O3 -march=native -mtune=native"
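
As on Ubuntu, the configure step prints the exact follow-up commands; copy them from your own output (the PETSC_ARCH value, e.g. arch-darwin-c-opt, is machine-specific). They typically look like this:

$ make PETSC_DIR=/directory/where/code/goes/petsc PETSC_ARCH=arch-darwin-c-opt all
$ make PETSC_DIR=/directory/where/code/goes/petsc PETSC_ARCH=arch-darwin-c-opt install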

# Install salvus_wave
$ cd /directory/where/code/goes
$ git clone https://gitlab.com/Salvus/salvus_wave.git
$ cd salvus_wave
$ mkdir build
$ cd build
$ cmake .. \
  -DPETSC_DIR=/opt/petsc \
  -DEIGEN_INCLUDE=/usr/local/include/eigen3 \
  -DMPI_C_COMPILER=/usr/local/bin/mpicc \
  -DMPI_CXX_COMPILER=/usr/local/bin/mpicxx
$ make -j
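
As on Linux, an optional sanity check is to confirm that the binary links against the intended MPI and HDF5 (see the Troubleshooting section):

$ otool -L ./salvus | grep -iE "mpi|hdf5"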

Run the Test Suite

This requires the salvus_mesher package whose installation is described below.

$ cd /directory/where/code/goes/salvus_wave/tests
$ ./create_unit_test_data.sh
$ ./run_unit_tests.sh

Python Support Modules

Download Miniconda - best choose the Python 3.5 version - and install it.
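
For example, on Linux this might look as follows; installer names and URLs change over time, so check the Miniconda download page for the current ones.

$ wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
$ bash Miniconda3-latest-Linux-x86_64.sh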

# Best create a new environment.
$ conda create -n salvus python=3.6
$ source activate salvus  # Always do this or put it in your .bashrc.
$ conda config --add channels conda-forge

# Install common dependencies.
$ conda install pip h5netcdf pytest flake8 obspy pyyaml \
  jsonschema pytest-mpl
$ pip install toml
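# Optional: quickly check that the dependencies import cleanly
# (just a suggestion; extend the module list as you see fit).
$ python -c "import obspy, h5netcdf, yaml, toml, jsonschema; print('all good')"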

# salvus_wave Python API
# Requires eigen.
$ sudo apt-get install libeigen3-dev  # Ubuntu
$ brew install eigen                  # OSX
# If you don't have salvus_wave yet, get it.
$ cd /directory/where/code/goes
$ git clone https://gitlab.com/Salvus/salvus_wave.git
# Otherwise just go to the correct directory.
$ cd /directory/where/code/goes/salvus_wave/src/py
# Tell it where the eigen include dir is.
# Ubuntu
$ CFLAGS="-I/usr/include/eigen3" pip install -v -e .
# OSX Python - If this does not work, try the line below.
$ CFLAGS="-I/usr/local/include/eigen3" pip install -v -e .
# OSX Python - Alternative version.
$ CFLAGS="-I/usr/local/include/eigen3 -stdlib=libc++ -mmacosx-version-min=10.7" pip install -v -e .
# Run the tests.
$ py.test salvus_wave

# pyexodus
$ cd /directory/where/code/goes
$ git clone https://github.com/SalvusHub/pyexodus.git
$ cd pyexodus
$ pip install -v -e .
$ py.test  # All tests should pass.

# salvus_mesher
$ cd /directory/where/code/goes
$ git clone https://gitlab.com/Salvus/salvus_mesher.git
$ cd salvus_mesher
$ CFLAGS='-fopenmp' pip install -v -e .  # if your compiler supports openmp
$ pip install -v -e .                    # without openmp
# Run the tests.
$ cd ~
$ python -m pymesher.tests

# salvus_seismo
$ cd /directory/where/code/goes
$ git clone https://gitlab.com/Salvus/salvus_seismo.git
$ cd salvus_seismo/py
$ pip install -v -e .
# Run the tests.
$ py.test

Troubleshooting

Conflicts in your PATH

By far the most common issue faced when installing Salvus involves the existence of multiple MPI or HDF5 installations, with conflicting versions being loaded at runtime. Currently, this issue might exhibit itself in some fairly catastrophic ways, such as a plethora of segmentation faults when running the tests. There is no "one size fits all" solution here; it depends on which libraries are conflicting.

Salvus comes with a script that attempts to check an installation and all its dependencies for consistency.

$ cd /directory/where/code/goes/salvus_wave
$ python misc/salvus_doctor.py build/salvus --mpirun=mpirun

This analyses the installation and prints its findings; please read them carefully. It will not yet work on all platforms, but we'll work on extending it. build/salvus is the actual Salvus binary; set --mpirun to the launcher you plan on running Salvus with.

A simple way to see if this might be a problem is to inspect the dynamic libraries which Salvus actually loads at runtime, and compare these to what CMake was trying to compile against. To do this, navigate to the Salvus build directory, and run either ldd ./salvus (Linux) or otool -L ./salvus (OSX). You will see a whole bunch of dynamic libraries printed to the screen; these are what Salvus will load at runtime. In particular, look for the libraries with hdf5 or mpi in their names, and note which MPI distribution each library is compiled for (i.e. mpich, or openmpi). Then, clear the build directory of CMake build artifacts (run rm -rf CMake*), and run the cmake command again. This time, note the output listed under MPI_C_LIBRARIES, and ensure that the MPI distribution here matches the one you found earlier when inspecting the shared libraries. If it does not, this is likely the cause of some problems. A concrete version of this check is sketched below.
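
On Linux the whole check might look like this (a sketch; on OSX replace ldd with otool -L, substitute the cmake flags you normally use, and if MPI_C_LIBRARIES is not in the cache on your CMake version, look for it in the cmake output instead):

$ cd /directory/where/code/goes/salvus_wave/build
# What gets loaded at runtime:
$ ldd ./salvus | grep -iE "mpi|hdf5"
# Clear the CMake artifacts and reconfigure:
$ rm -rf CMake*
$ cmake ..  # plus your usual flags
# What CMake resolved at configure time:
$ grep MPI_C_LIBRARIES CMakeCache.txt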

In this situation, the easiest thing to do is to completely remove all conflicting MPI / HDF5 libraries (e.g. the openmpi versions), and then reinstall the mpich libraries. If you do this, the libraries reported by ldd and cmake should be the same.

Beware that similar issues occur if you have different versions of the above libraries installed via anaconda, or some other Python distribution that ends up hijacking your PATH. In this case, it might be easiest to remove the anaconda directories from your PATH before installing. In the end the goal is the same: to ensure that the libraries found via cmake and those loaded dynamically are the same.

If you're still having trouble, don't hesitate for a moment to send us an email on the mailing list. At least one of us will be around to help :).

Open MPI on OSX

Open MPI on OSX might have a funny issue with the length of the path to the temporary folder. If, for example,

$ mpirun -n 1 ls

does not work, just do

$ export TMPDIR=/tmp

and it should be good.

Development Info

Below we list some tips and commands which may help developers of the code.

Profiling

The following cmake command will configure a build with profiling enabled:

env CC=gcc-5 CXX=g++-5 cmake ../ -DPETSC_DIR=/opt/petsc \
  -DEIGEN_INCLUDE=/usr/include/eigen3 \
  -DCMAKE_CXX_FLAGS="-Wno-literal-suffix -Wno-deprecated-declarations -pg" \
  -DCMAKE_EXE_LINKER_FLAGS="-pg" \
  -DCMAKE_SHARED_LINKER_FLAGS="-pg" \
  -DCMAKE_BUILD_TYPE=Release \
  -DMPI_C_COMPILER=/opt/petsc/bin/mpicc \
  -DMPI_CXX_COMPILER=/opt/petsc/bin/mpicxx

Linux

After building and running the executable (make salvus && ./salvus --some-options), you will see a file gmon.out, which contains the profiling results. You can view the "raw" profile with the following command:

gprof path/to/salvus | less

To get a graphical representation of the profile, install the Python utility gprof2dot via Python's package manager pip, and also the dot command, which ships with the graphviz package (available via apt-get, or another system package manager like brew for OS X):

pip install gprof2dot && sudo apt-get install graphviz

To create the visualization, chain together the commands

gprof path/to/salvus | gprof2dot | dot -Tpng -o profile.png

which will create a png for viewing.

OS X

There is a visual tool called Instruments provided with Xcode that can profile the code. Note that when going this route, it is recommended to compile with just the standard Release build flags.
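
If you prefer the command line, Xcode also ships an instruments tool that can drive the same profiler. This is only a sketch; template names vary between Xcode versions, and --some-options stands in for your actual arguments.

instruments -t "Time Profiler" ./salvus --some-options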

Autocomplete

Emacs and Vim autocompletion can use a "compilation database" generated by CMake. Add -DCMAKE_EXPORT_COMPILE_COMMANDS=ON to any of the previously mentioned cmake commands.
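
For example (abbreviated; keep whatever other flags you normally pass):

cmake .. -DPETSC_DIR=/opt/petsc -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
# This writes compile_commands.json into the build directory, which
# completion backends such as irony-mode or YouCompleteMe can read.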