Installation

Salvus consists of several pieces written in C++ and Python. This page documents how to install everything.

Salvus requires a C++ compiler with C++11 support, Eigen, CMake, MPI, and PETSc (a specific commit is currently required; see the git checkout step below). PETSc has to be compiled with MPI, HDF5, NetCDF, and ExodusII support. If you know how to do this, feel free to install it any way you see fit; otherwise, just follow the instructions below for your platform of choice.
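If you are unsure whether the basic toolchain is already in place, a quick (optional) sanity check along the following lines can help; the exact version strings will of course differ from system to system.

# Check that the MPI compiler wrappers and CMake are available.
$ mpicc --version
$ mpicxx --version   # should be a C++ compiler with C++11 support
$ cmake --version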

We know it works on Linux and OSX. There is probably some way to get it going on Windows but we have never tried.

As is the case with all non-trivial software packages, you may run into some problems installing Salvus. Please don't fret! We're building a troubleshooting section below that reports on some of the most common problems. Check it out for advice, and as always feel free to send us an email as well.

Install dependencies.

Installation on Ubuntu 16.04

This will use system libraries wherever possible. It has been tested on Ubuntu 16.04 but will likely work on other releases as well (you might have to adjust the names of some packages).

# Make sure system is up-to-date.
$ sudo apt-get update && sudo apt-get upgrade

# Install dependencies.
$ sudo apt-get install git cmake mpich \
  libeigen3-dev \
  libhdf5-mpich-dev \
  libnetcdf-dev \
  libexodusii-dev \
  libopenblas-dev

Next, we install PETSc. This is the point at which you need to decide whether you want to run all computations in single or double precision. For most applications single precision is accurate enough, and it will save you a lot of CPU time and memory. Either way, you will need to replace PRECISION in the configure option

--with-precision=PRECISION

with either single or double. Of course, it is possible to use both versions side by side: just rebuild PETSc with a different precision into another prefix directory, as sketched below.
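If you do want both precisions side by side, one possible layout is to configure each build into its own prefix. The directory names here are just a suggestion, and all other configure flags stay the same as in the full command below.

# Hypothetical side-by-side layout; all other configure flags are identical.
$ ./configure --prefix=/opt/petsc-single --with-precision=single <other flags>
$ ./configure --prefix=/opt/petsc-double --with-precision=double <other flags>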

# Install petsc.
# Make sure you have write access to the prefix directory!
$ sudo mkdir /opt/petsc
$ sudo chown `whoami` /opt/petsc
$ cd /directory/where/code/goes
$ git clone https://bitbucket.org/petsc/petsc.git
$ cd petsc
$ git checkout 0c7851c55cba8e40da5083f79ba1ff846acd45b2
# Production/performance installation. Follow the on-screen instructions
# after the configure step until the installation completes.
$ ./configure --prefix=/opt/petsc \
  --with-cc=mpicc --with-cxx=mpicxx \
  --download-exodusii=yes --download-chaco=yes \
  --with-hdf5=yes --with-hdf5-dir=/usr/lib/x86_64-linux-gnu/hdf5/mpich \
  --with-netcdf=yes --with-exodusii=yes --with-mpi=yes \
  --with-fc=0 \
  --with-blas-lib=/usr/lib/libblas.a --with-lapack-lib=/usr/lib/liblapack.a \
  --with-debugging=0 \
  --with-precision=PRECISION \
  COPTFLAGS="-O3 -march=native -mtune=native" \
  CXXOPTFLAGS="-O3 -march=native -mtune=native"
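The configure step prints the exact make commands to build and install into the prefix; use those. For reference, they typically look something like the lines below, where the PETSC_ARCH value is only an example and will differ on your machine.

# Example only -- copy the exact commands that configure prints.
$ make PETSC_DIR=/directory/where/code/goes/petsc PETSC_ARCH=arch-linux2-c-opt all
$ make PETSC_DIR=/directory/where/code/goes/petsc PETSC_ARCH=arch-linux2-c-opt install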

Installation on OSX

This uses Homebrew to install all dependencies.

# Install dependencies.
$ brew install cmake eigen
$ brew install homebrew/science/hdf5 --with-mpi --c++11
$ brew install netcdf

Next, we install PETSc. As on Linux, you need to decide whether you want to run all computations in single or double precision. For most applications single precision is accurate enough, and it will save you a lot of CPU time and memory. Either way, you will need to replace PRECISION in the configure option

--with-precision=PRECISION

with either single or double. Of course, it is possible to use both versions side by side: just rebuild PETSc with a different precision into another prefix directory.

# Install petsc.
# Make sure you have write access to the prefix directory!
$ sudo mkdir /opt/petsc
$ sudo chown `whoami` /opt/petsc
$ cd /directory/where/code/goes
$ git clone https://bitbucket.org/petsc/petsc.git
$ cd petsc
$ git checkout 0c7851c55cba8e40da5083f79ba1ff846acd45b2
# Production/performance installation. Follow the on-screen instructions
# after the configure step until the installation completes.
$ ./configure --prefix=/opt/petsc \
  --download-exodusii=yes --download-chaco=yes \
  --with-netcdf-dir=/usr/local --with-hdf5-dir=/usr/local \
  --with-fc=0 \
  --with-debugging=0 \
  --with-precision=PRECISION \
  COPTFLAGS="-O3 -march=native -mtune=native" \
  CXXOPTFLAGS="-O3 -march=native -mtune=native"

Note that the configure script does not yet support Python 3; if needed, it will print a workaround that you can copy and paste. Also, don't forget to copy and paste the make commands PETSc prints after each step.
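Once the installation has finished (on either platform), you can double-check which precision a given PETSc prefix was built with by inspecting the installed petscconf.h; this assumes the usual PETSc header layout under the prefix.

# Prints PETSC_USE_REAL_SINGLE or PETSC_USE_REAL_DOUBLE depending on the build.
$ grep PETSC_USE_REAL /opt/petsc/include/petscconf.h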

Python packages

Download Miniconda for Mac OS X or Linux (preferably the Python 3.6 version) and install it.

# Best create a new environment.
$ conda create -n salvus python=3.6
$ source activate salvus  # Always do this or put it in your .bashrc.
$ conda config --add channels conda-forge

# Install common dependencies.
$ conda install pip h5netcdf pytest flake8 obspy pyyaml \
  jsonschema pytest-mpl
$ pip install toml
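As a quick check that the environment is set up correctly, the following import test (covering the packages installed above) should run without errors.

# Verify the key Python dependencies import inside the salvus environment.
$ python -c "import h5netcdf, obspy, yaml, jsonschema, toml; print('Python dependencies OK')"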

Install Salvus and run the test suite

Finally, it is time to install Salvus itself.

To get started, clone the git repository:

$ git clone https://gitlab.com/Salvus/salvus.git
$ cd salvus
$ git submodule update --init --recursive

To make sure all submodules are up-to-date, please run:

$ git submodule update --recursive --remote

Now create the build directory:

$ mkdir build
$ cd build

Make sure all dependencies and Python modules are installed. You may have to adjust the following paths if you installed the dependencies manually.

On Ubuntu 16.04 run:

$ cmake .. \
  -DPETSC_DIR=/opt/petsc \
  -DEIGEN_INCLUDE=/usr/include/eigen3 \
  -DNETCDF_DIR=/usr \
  -DHDF5_DIR=/usr/lib/x86_64-linux-gnu/hdf5/mpich

On Mac OS X run:

$ cmake .. \
  -DPETSC_DIR=/opt/petsc \
  -DEIGEN_INCLUDE=/usr/local/include/eigen3 \
  -DNETCDF_DIR=/usr/local \
  -DHDF5_DIR=/usr/local

For both operating systems, we continue with:

# Compile salvus_wave and install the whole suite of Salvus packages.
$ make -j
# Run all tests.
$ make CTEST_OUTPUT_ON_FAILURE=1 test

The last line will run all tests for all submodules. Make sure that all tests pass.
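If a single test fails, it can be convenient to re-run just that one from the build directory using ctest directly; the test name below is only a placeholder.

# Re-run one test by name (regular expression match) with verbose failure output.
$ ctest --output-on-failure -R name_of_failing_test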

If you want to use MPI you can run a few more tests with

$ make mpi_tests

That's it. You have successfully installed Salvus!

Troubleshooting

Conflicting Python dependencies

If you previously installed one of the Python dependencies from somewhere else, you might get an error like:

AssertionError: Egg-link /.../pyexodus does not match installed location of pyexodus (at /.../pyexodus)

In that case, please manually uninstall the offending package and run make -j again:

$ pip uninstall pyexodus  # or other package
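If you are unsure which copy of a package is being picked up, you can ask Python and pip directly (pyexodus is used here purely as an example):

# Show which installation Python actually imports, and where pip thinks it lives.
$ python -c "import pyexodus; print(pyexodus.__file__)"
$ pip show pyexodus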

Conflicts in your PATH

By far the most common issue faced when installing Salvus involves the existence of multiple MPI or HDF5 installations, with conflicting versions being loaded at runtime. Currently, this issue might manifest itself in some fairly catastrophic ways, such as a plethora of segmentation faults when running the tests. There is no "one size fits all" solution here -- it depends on which libraries are conflicting.


Salvus comes with a script that attempts to check an installation and all dependencies for consistency.

$ cd /directory/where/code/goes/salvus
$ python ./salvus_wave/misc/salvus_doctor.py build/salvus --mpirun=mpirun

The script analyses the installation and prints its findings; please read them carefully. It does not yet work on all platforms, but we are working on extending it. build/salvus is the actual Salvus binary, and --mpirun should point to the MPI launcher you plan to run Salvus with.


A simple way to see if this might be a problem is to inspect the dynamic libraries which Salvus actually loads at runtime, and compare these to what CMake was trying to compile against. To do this, navigate to the Salvus build directory and run either ldd ./salvus (Linux) or otool -L ./salvus (OSX). You will see a whole bunch of dynamic libraries printed to the screen -- these are what Salvus will load at runtime. In particular, look for the libraries with hdf5 or mpi in their names, and note which MPI distribution each library was compiled for (e.g. mpich or openmpi). Then, clear the build directory of CMake build artifacts (run rm -rf CMake*), and run the cmake command again. This time, note the output listed under MPI_C_LIBRARIES, and ensure that the MPI distribution here matches the one you found earlier when inspecting the shared libraries. If it does not, this is likely the cause of some problems.
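As a concrete sketch of the steps above on Linux (use otool -L instead of ldd on OSX; library names will differ from system to system):

# 1. Inspect what the binary actually loads at runtime.
$ cd /directory/where/code/goes/salvus/build
$ ldd ./salvus | grep -Ei "mpi|hdf5"
# 2. Clear the CMake artifacts and reconfigure with the same flags as before.
$ rm -rf CMake*
$ cmake ..   # plus the same -D options used earlier
# 3. Compare the MPI_C_LIBRARIES reported by cmake with the libraries from step 1.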

In this situation, the easiest thing to do is to completely remove all conflicting MPI / HDF5 libraries (e.g. the openmpi versions), and then reinstall the mpich libraries. If you do this, the libraries reported by ldd and cmake should be the same.

Beware that similar issues occur if you have different versions of the above libraries installed via Anaconda, or some other Python distribution that ends up hijacking your PATH. In this case, it might be easiest to remove the Anaconda directories from your PATH before installing. In the end the goal is the same: to ensure that the libraries found via cmake and those loaded dynamically are the same.
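To see which MPI installation your PATH currently resolves to (and whether an Anaconda copy shadows the system one), something along these lines helps:

# List every mpirun/mpicc on the PATH in the order the shell finds them.
$ which -a mpirun mpicc
$ mpirun --version
$ echo $PATH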

If you're still having trouble, don't hesitate for a moment to send us an email on the mailing list. At least one of us will be around to help :).

Open MPI on OSX

Open MPI on OSX might have a funny issue with the length of the temporary folder path. If, for example,

$ mpirun -n 1 ls

does not work, just do

$ export TMPDIR=/tmp

and it should be good.

Development Info

Below we list some tips that may help developers of the code.

Profiling

The following cmake command will configure a build with profiling enabled:

env CC=gcc-5 CXX=g++-5 cmake ../ -DPETSC_DIR=/opt/petsc \
-DEIGEN_INCLUDE=/usr/include/eigen3 \
-DCMAKE_CXX_FLAGS="-Wno-literal-suffix -Wno-deprecated-declarations -pg" \
-DCMAKE_EXE_LINKER_FLAGS="-pg" \
-DCMAKE_SHARED_LINKER_FLAGS="-pg" \
-DCMAKE_BUILD_TYPE=Release \
-DMPI_C_COMPILER=/opt/petsc/bin/mpicc \
-DMPI_CXX_COMPILER=/opt/petsc/bin/mpicxx

Linux

After running the executable (make salvus && ./salvus --some-options), you will see a file gmon.out, which contains the profiling results. You can view the "raw" profile with the following command:

gprof path/to/salvus | less

To get a graphical representation of the profile, install the Python utility gprof2dot via pip, and the dot command, which is part of the graphviz package, via your system package manager (apt-get on Ubuntu, brew on OS X):

pip install gprof2dot && sudo apt-get install graphviz

To create the visualization, chain together the commands

gprof path/to/salvus | gprof2dot | dot -Tpng -o profile.png

which will create a png for viewing.

OS X

There is a visual tool called Instruments provided with Xcode that can profile the code. Note that when going this route, it is recommended to compile with just the standard Release build flags.

Autocomplete

Emacs and Vim autocomplete can use a "compilation database" generated by CMake. Add -DCMAKE_EXPORT_COMPILE_COMMANDS=ON to the cmake command shown previously.
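For example, on Ubuntu this could look as follows (a sketch reusing the flags from the Ubuntu cmake invocation above). CMake then writes a compile_commands.json into the build directory, and many completion plugins expect it (or a symlink to it) in the source root.

$ cd build
$ cmake .. \
  -DPETSC_DIR=/opt/petsc \
  -DEIGEN_INCLUDE=/usr/include/eigen3 \
  -DNETCDF_DIR=/usr \
  -DHDF5_DIR=/usr/lib/x86_64-linux-gnu/hdf5/mpich \
  -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
# Optional: make the database visible from the source root for editor plugins.
$ ln -s "$(pwd)/compile_commands.json" ../compile_commands.json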