Interfacing deal.II to PETSc

PETSc is a software package that provides a great deal of functionality for linear algebra, among other things. For example, it includes implementations of a variety of linear, nonlinear and ordinary differential equation solvers, as well as a number of different sparse and dense matrix and vector formats. Of particular interest to deal.II is its ability to provide this functionality on both sequential and parallel (using MPI) computers.

deal.II has wrapper classes to the linear algebra parts of PETSc that provide almost the same interfaces as the built-in deal.II linear algebra classes. We use these interfaces for parallel computations based on MPI since the native deal.II linear algebra classes lack this ability. They are used, among other programs, in step-17, step-18 and step-40. See step-77 for an example that uses PETSc's nonlinear solver capabilities.

Installing deal.II with PETSc

Note: deal.II is compatible with any PETSc version newer than 3.7.0. If you encounter problems with a specific version, let us know.

When you compile and install PETSc, you need to set the environment variables PETSC_DIR and PETSC_ARCH: the first points to the path of your PETSc installation, the second names the architecture for which PETSc is compiled. PETSC_ARCH is in reality just a label you give to your installation; it is a string you can choose however you like, and its point is to let you keep several, possibly different, PETSc builds side by side. As a consequence, you need to tell deal.II's cmake scripts which of these installations you want it to use, i.e., you need to set the PETSC_ARCH variable to the same value you used when you installed PETSc, and likewise for PETSC_DIR. You can do this via environment variables. cmake will then also recognize that PETSc is to be used and enable the wrapper classes, without you having to state explicitly that you want to use PETSc.
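
For example, if PETSc lives in /path/to/petsc-x-y-z and was built under the architecture name x86_64, setting up and configuring deal.II might look like the following sketch, in which the deal.II source and build directory paths are of course only placeholders:

	export PETSC_DIR=/path/to/petsc-x-y-z
	export PETSC_ARCH=x86_64
	cd /path/to/dealii-build       # placeholder: your deal.II build directory
	cmake /path/to/dealii          # placeholder: the deal.II source directory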

Alternatively, the -DPETSC_DIR=DIR and -DPETSC_ARCH=ARCH options for cmake can be used to override the values of the PETSC_DIR and PETSC_ARCH environment variables, or to specify the installation to use if these environment variables are not set at all. If you do have a PETSc installation and have set the PETSC_DIR and PETSC_ARCH environment variables, but do not wish deal.II to be configured for PETSc use, you should specify -DDEAL_II_WITH_PETSC=OFF as a flag during configuration.
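
For example (again with placeholder paths), passing the values explicitly on the command line looks like

	cmake -DPETSC_DIR=/path/to/petsc-x-y-z -DPETSC_ARCH=x86_64 /path/to/dealii

while disabling the PETSc interfaces despite a detectable installation reads

	cmake -DDEAL_II_WITH_PETSC=OFF /path/to/dealii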

Installing PETSc

Installing PETSc correctly can be a bit of a challenge. To start, take a look at the PETSc installation instructions. We have found that the following steps generally work; note that we simply unpack and build PETSc in its final location (i.e., we do not first build it and then install it into a separate directory):


	tar xvzf petsc-x-y-z.tar.gz
	cd petsc-x-y-z
	export PETSC_DIR=`pwd`
	export PETSC_ARCH=x86_64       # or any other identifying text for your machine
	./config/configure.py --with-shared-libraries=1 --with-x=0 --with-mpi=1 --download-hypre=1
	make

This automatically builds PETSc with both MPI and the algebraic multigrid preconditioner package Hypre (which we use in step-40).

If you would like to use PETSc with MPI, then we recommend that you install MPI through your package manager instead of letting PETSc install it. Put another way, installing PETSc with the flag --download-mpich often causes problems (such as linking errors or poor performance) that can be avoided by using whatever your system provides instead.

If you want to solve problems with more than two billion unknowns with PETSc, then you should also pass the --with-64-bit-indices=on flag to configure.py. In that case, you will also want to configure deal.II with 64-bit indices by providing -DDEAL_II_WITH_64BIT_INDICES=ON to deal.II's cmake configuration run. deal.II will throw an exception if it cannot successfully convert indices between the two libraries, but as long as the supported index ranges overlap (e.g., when solving problems with fewer than two billion degrees of freedom where one of PETSc or deal.II uses 32-bit indices and the other uses 64-bit indices), all solvers will work correctly.
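
Concretely, a 64-bit-index setup would add these flags to the calls shown earlier (a sketch; the deal.II source path is a placeholder):

	./config/configure.py --with-shared-libraries=1 --with-x=0 --with-mpi=1 --download-hypre=1 --with-64-bit-indices=on
	cmake -DDEAL_II_WITH_64BIT_INDICES=ON /path/to/dealii
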
Now let PETSc check its own sanity:


	make test
      
will self-check the serial (and MPI) implementation of PETSc.
You may wish to put the export commands into your ~/.bashrc or ~/.cshrc files (using setenv instead of export in the latter), with the first one replaced by something like

	export PETSC_DIR=/path/to/petsc-x-y-z
      
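
In csh syntax for ~/.cshrc, the equivalent assignments would read

	setenv PETSC_DIR /path/to/petsc-x-y-z
	setenv PETSC_ARCH x86_64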

By default, PETSc is compiled in "debug mode". You can switch this to "optimized mode" by adding the command line parameter


	--with-debugging=0
      
to the call of ./config/configure.py above. In some cases, this has made linear solvers run up to 30% faster. As with choosing between deal.II's debug and optimized modes, you should only use optimized PETSc builds once you have tested that your program runs well in debug mode.
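
Putting this together, a complete call of ./config/configure.py for an optimized build would then read as follows, simply combining the flags shown above:

	./config/configure.py --with-shared-libraries=1 --with-x=0 --with-mpi=1 --download-hypre=1 --with-debugging=0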

