This page provides details about how to set up a testsuite in a user project similar to the one that is used to test deal.II itself.
deal.II features an extensive testsuite to ensure consistent, well-defined behavior of its building blocks during development and for releases. But the larger a user program or project becomes, the more important it is to also check the user code itself for continued correctness during development. This is mainly done via unit and regression testing.
deal.II provides a mechanism to conveniently set up unit and regression tests in a user project (very much like they are handled in the library itself). At its heart, a test consists of a small executable that is invoked and an output file used for comparison. The executable that should be run can be defined in two different ways. Either as a source file in conjunction with a comparison file:

```
my_test_1.cc
my_test_1.output
```

In this case my_test_1.cc contains a full executable (with a main function) that produces some output. The screen output is then compared against my_test_1.output. Alternatively, a parameter file together with a comparison file can be provided:

```
my_test_2.prm
my_test_2.output
```

In this case an already built executable (that is defined by a CMake variable) is invoked with the path of my_test_2.prm as its first argument. Again, its screen output is compared against my_test_2.output.
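To make the first variant concrete, here is a minimal sketch of what such a test source could look like (the file contents are invented for illustration; any executable that writes deterministic output works):

```cpp
// my_test_1.cc -- hypothetical example of a test of the first kind:
// a self-contained executable whose screen output is compared
// against my_test_1.output by the test driver.
#include <iostream>

int main()
{
  std::cout << "2 + 2 = " << 2 + 2 << std::endl;
  return 0;
}
```

The matching comparison file my_test_1.output would then consist of the single line "2 + 2 = 4".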
This section presents two different examples of how to use the testsuite facilities. Possible directory layouts together with the necessary CMake configuration are discussed.
For the purpose of an example, let us pretend that step-1 could read input files (defined on the command line) and do some computation based on their contents. Then, we can set up tests for the expected output for a given configuration file.
This can be done by creating a subdirectory tests and adding a test of the second type (i.e., parameter file and comparison file). In detail, the directory and file layout is as follows:

```
CMakeLists.txt
step-1.cc
tests/CMakeLists.txt
tests/my_test.output
tests/my_test.prm
```

In order to enable testing, the top-level CMakeLists.txt file has to be augmented by a call to ENABLE_TESTING() and a subsequent descent into the tests/ subdirectory via ADD_SUBDIRECTORY(tests). For convenience, here is the full top-level CMakeLists.txt file:
```cmake
SET(TARGET step-1)
SET(TARGET_SRC step-1.cc)

CMAKE_MINIMUM_REQUIRED(VERSION 2.8.12)
FIND_PACKAGE(deal.II 9.1.0 REQUIRED
  HINTS ${deal.II_DIR} ${DEAL_II_DIR} ../ ../../ $ENV{DEAL_II_DIR}
  )
DEAL_II_INITIALIZE_CACHED_VARIABLES()
PROJECT(${TARGET})
DEAL_II_INVOKE_AUTOPILOT()

# Enable testing and descent into tests/ subdirectory:
ENABLE_TESTING()
ADD_SUBDIRECTORY(tests)
```

The corresponding file tests/CMakeLists.txt contains only two statements:
```cmake
SET(TEST_TARGET ${TARGET})
DEAL_II_PICKUP_TESTS()
```

The first statement sets the variable TEST_TARGET to the executable that should be invoked (in our case, the contents of the variable TARGET). The second statement is a call to a deal.II macro that goes through the directory contents and defines all test targets.
Because step-1 produces only two lines of output and parses no parameters, we can set up a somewhat silly test by just providing the comparison file and an empty parameter file:
```
$ touch tests/my_test.prm
$ echo "Grid written to grid-1.eps" >  tests/my_test.output
$ echo "Grid written to grid-2.eps" >> tests/my_test.output
```

After that, reconfigure and call the test driver ctest:
```
$ cmake .
[...]
$ ctest
Test project .../examples/step-1
    Start 1: tests/my_test.debug
1/1 Test #1: tests/my_test.debug ..............   Passed    1.72 sec

100% tests passed, 0 tests failed out of 1

Total Test time (real) =   1.72 sec
```
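If a test fails, its output (including the failed comparison) can be inspected by running CTest with one of its standard verbosity options, for example:

```
$ ctest --output-on-failure
```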
Remark: If the parameter file ends in .prm.in instead of .prm, it will be configured/preprocessed to a test.prm file. This is done with the CMake macro CONFIGURE_FILE, which replaces all strings @VARIABLE@ with the contents of the corresponding CMake variable. This is useful in particular to conveniently substitute @SOURCE_DIR@ with the full source directory path of the test.
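As a hypothetical illustration, a parameter file my_test.prm.in referring to a mesh file that lives next to the test could read:

```
# my_test.prm.in -- invented example; CONFIGURE_FILE replaces
# @SOURCE_DIR@ with the source directory path of the test:
set Input file = @SOURCE_DIR@/my_mesh.msh
```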
Remark: The test driver will compare the combined output stream of stdout and stderr against the comparison file. If the test creates a file output and writes to it, the comparison file is compared against this output file instead. In this case stdout and stderr are discarded.
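For instance, a test that prefers file-based comparison could simply write its results to a file named output (a hypothetical sketch):

```cpp
// Hypothetical sketch: because a file named "output" is created,
// the test driver compares the comparison file against it and
// discards stdout/stderr.
#include <fstream>

int main()
{
  std::ofstream out("output");
  out << "solution norm: 42" << std::endl;
  return 0;
}
```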
The above setup is too inflexible for larger projects that might consist of individual libraries and an independent main program. Therefore, as a second example, a project is presented that consists of a support library "support" and an executable "step". The task shall be to provide unit tests for the library "support" and simple configuration-type tests for "step". We shall assume that all functions and classes are declared in a single header support.h that is included via #include <support.h> from both the executable "step" and the unit tests for the library.
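For the sake of concreteness, assume a hypothetical header along these lines (the helper function is invented purely for this example):

```cpp
// include/support.h -- invented contents for illustration:
#ifndef SUPPORT_H
#define SUPPORT_H

// A helper used by both the executable "step" and the unit tests;
// its definition would live in src/support.cc.
double compute_answer();

#endif
```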
In detail, the directory and file layout is as follows:
```
CMakeLists.txt
src/CMakeLists.txt
src/step.cc
src/support.cc
include/support.h
tests/step/CMakeLists.txt
tests/step/my_test_1.prm
tests/step/my_test_1.output
tests/support/CMakeLists.txt
tests/support/my_test_2.cc
tests/support/my_test_2.output
```
Again, we want to use the "autopilot" configuration for user projects (see the CMake documentation for details). The top-level CMakeLists.txt is now solely responsible for finding deal.II, enabling testing, and descending into subdirectories:
```cmake
# top-level CMakeLists.txt
CMAKE_MINIMUM_REQUIRED(VERSION 2.8.12)
FIND_PACKAGE(deal.II 9.1.0 REQUIRED
  HINTS ${deal.II_DIR} ${DEAL_II_DIR} ../ ../../ $ENV{DEAL_II_DIR}
  )
DEAL_II_INITIALIZE_CACHED_VARIABLES()
PROJECT(step)
ENABLE_TESTING()

INCLUDE_DIRECTORIES(${CMAKE_SOURCE_DIR}/include)
ADD_SUBDIRECTORY(src)
ADD_SUBDIRECTORY(tests/step)
ADD_SUBDIRECTORY(tests/support)
```

The library and executable are defined in src/CMakeLists.txt:
```cmake
# src/CMakeLists.txt

# set up shared library by hand:
ADD_LIBRARY(support SHARED support.cc)
DEAL_II_SETUP_TARGET(support)

# set up executable with autopilot macro:
SET(TARGET "step")
SET(TARGET_SRC step.cc)
DEAL_II_INVOKE_AUTOPILOT()
TARGET_LINK_LIBRARIES(${TARGET} support)
```

Similarly to the first example, setting up tests for the executable "step" is just a matter of defining a variable and calling a macro:
```cmake
# tests/step/CMakeLists.txt
SET(TEST_TARGET step)
DEAL_II_PICKUP_TESTS()
```

In contrast, the tests for the support library consist of a source file that has a main function. The object file generated from this source file will be linked against deal.II and every library listed in TEST_LIBRARIES:
```cmake
# tests/support/CMakeLists.txt
SET(TEST_LIBRARIES support)
DEAL_II_PICKUP_TESTS()
```
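A minimal sketch of such a unit test (using the invented compute_answer() declared in support.h above) might look like:

```cpp
// tests/support/my_test_2.cc -- hypothetical unit test sketch: it is
// linked against the "support" library and its screen output is
// compared against my_test_2.output.
#include <support.h>
#include <iostream>

int main()
{
  std::cout << "compute_answer() = " << compute_answer() << std::endl;
  return 0;
}
```

Again, reconfigure and run ctest: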
```
$ cmake .
$ ctest
Test project .../examples/step
    Start 1: step/my_test_1.debug
1/2 Test #1: step/my_test_1.debug .............   Passed    0.21 sec
    Start 2: support/my_test_2.debug
2/2 Test #2: support/my_test_2.debug ..........   Passed    0.22 sec

100% tests passed, 0 tests failed out of 2

Total Test time (real) =   0.43 sec
```

In some cases, additional input data has to be provided in users' tests. For example, we may have an input file
```
tests/support/my_test_2.input
```

which shall be parsed within my_test_2.cc. To read this file, one can use the SOURCE_DIR preprocessor variable, which will be equal to the parent folder of each .cc test. In other words, given the current imaginary directory structure,

```cpp
const std::string str = std::string(SOURCE_DIR) + "/my_test_2.input";
```

used from my_test_2.cc will contain the path to the auxiliary input file.
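Put together, a hypothetical version of my_test_2.cc that echoes the contents of this auxiliary file could look like:

```cpp
// Hypothetical sketch: read the auxiliary input file that lives next
// to this source file. SOURCE_DIR is defined while compiling the test.
#include <fstream>
#include <iostream>
#include <string>

int main()
{
  const std::string str = std::string(SOURCE_DIR) + "/my_test_2.input";
  std::ifstream in(str);

  // Echo the file line by line to stdout, which is then compared
  // against my_test_2.output:
  std::string line;
  while (std::getline(in, line))
    std::cout << line << std::endl;
  return 0;
}
```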
Remark: For further information, consult the testsuite documentation for the library. With the sole exception of the testsuite setup (which happens unconditionally in user testsuites), this documentation also applies to user testsuites.
Remark: The full set of configuration options for DEAL_II_PICKUP_TESTS() is:

```
TEST_LIBRARIES
TEST_LIBRARIES_DEBUG
  - additionally used for tests with debug configuration
TEST_LIBRARIES_RELEASE
  - additionally used for tests with release configuration

TEST_TARGET or
TEST_TARGET_DEBUG and TEST_TARGET_RELEASE
  - used instead of TEST_TARGET for debug/release configuration

NUMDIFF_EXECUTABLE, DIFF_EXECUTABLE
  - pointing to valid diff executables. If NUMDIFF_EXECUTABLE is not
    "numdiff" it will be ignored and DIFF_EXECUTABLE is used instead

TEST_TIME_LIMIT
  - specifying the maximal wall clock time in seconds a test is
    allowed to run
```
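For instance (an illustrative sketch), a tests/CMakeLists.txt that links an additional library into every source-file test and raises the time limit could read:

```cmake
# Hypothetical example combining some of the options above:
SET(TEST_LIBRARIES support)   # linked into every source-file test
SET(TEST_TIME_LIMIT 600)      # allow up to 600 seconds of wall clock time
DEAL_II_PICKUP_TESTS()
```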