Running tests

There are two basic categories of tests in TARDIS: 1) the unit tests, and 2) the integration tests. Unit tests check the outputs of individual functions, while integration tests check entire TARDIS runs for different setups.

The unit tests run very quickly and are thus executed for every proposed change to TARDIS. The integration tests are much more costly and are thus only executed every few days on a dedicated server.

Both are based on the excellent astropy-setup-helpers package and pytest.

Running the Unit Tests

The unit tests are straightforward to run on your own machine. The very simple ones can be run with:

> pytest tardis

Running the more advanced unit tests requires the TARDIS reference data (tardis-refdata), which can be downloaded. Git LFS is used to download the large refdata files in the tardis-refdata repository.

However, it is not necessary to download the entire repository. First, identify which refdata files are needed: a fixture reused by the current tests may itself depend on some refdata, so it is advisable to check for such indirect uses beforehand.
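One way to check is to search the test modules and the relevant conftest files for uses of the refdata fixtures. A minimal sketch, assuming the fixtures follow a tardis_ref naming pattern (an assumption; adjust the pattern and paths to your case):

> grep -rn "tardis_ref" tardis/path/to/tests tardis/conftest.py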

After identifying the refdata files needed by the unit tests, those particular files can be downloaded using git lfs:

> git lfs pull --include=filename

It is important to maintain the same directory structure as the tardis-refdata repository, i.e., the LFS files should sit in exactly the same directory tree as in tardis-refdata.
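For example, a minimal download might look like the following sketch. Cloning with GIT_LFS_SKIP_SMUDGE=1 fetches only the small LFS pointer files, which preserves the directory tree without downloading every large file; the file name passed to --include is a hypothetical example:

> GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/tardis-sn/tardis-refdata.git
> cd tardis-refdata
> git lfs pull --include=unit_test_data.h5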

Finally, the tests can be run using the following command:

> pytest tardis --tardis-refdata=/path/to/tardis-refdata/

Or, to run tests for a particular file or directory:

> pytest tardis/path/to/test_file_or_directory --tardis-refdata=/path/to/tardis-refdata/

Warning

The tests workflow runs on pull requests and on push events. To prevent leaking LFS quota, tests have been disabled on forks. If, by any chance, you need to run tests on your fork, make sure to run the tests workflow on the master branch first. The LFS cache generated on the master branch should then be available in all child branches. You can check whether the cache was generated by looking at the Restore LFS Cache step of the workflow run. The cache can also be found under the “Management” section of the “Actions” tab.

Running Syrupy Tests

The tests module is currently being restructured to use Syrupy. These tests generate an individual snapshot file per test case: HDF for Pandas objects and .npy or .npz for NumPy arrays. Other objects are serialised by the plugin and saved as .ambr files. A custom extension in the main local conftest file makes this work. The plugin currently supports only assert_allclose for NumPy, and assert_series_equal and assert_frame_equal for Pandas. Snapshots are generated using the same --generate-reference flag and are compared automatically; you do, however, need to provide the location of your snapshot directory using the --tardis-snapshot-data flag. The snapshots are also stored in tardis-sn/tardis-regressions.
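Putting these flags together, a snapshot run might look like the following sketch (the paths are placeholders, and the exact behaviour is assumed from the flag descriptions above). The first command generates or regenerates the snapshots; the second compares the current outputs against them:

> pytest tardis --tardis-snapshot-data=/path/to/tardis-regressions/ --generate-reference
> pytest tardis --tardis-snapshot-data=/path/to/tardis-regressions/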

Generating Plasma Reference

You can generate the plasma reference with the following command:

> pytest -rs tardis/plasma/tests/test_complete_plasmas.py \
  --tardis-refdata="/path/to/tardis-refdata/" --generate-reference

Running the Integration Tests

These tests require reference files against which the results of the various TARDIS runs are tested, so you first need to either download the current reference files (here) or generate new ones.

Both of these require a configuration file for the integration tests:

atom_data_path: "~/projects/tardis/integration/"

# This section holds information about mechanism of saving the HTML
# report of the tests.
# "save_mode" is mandatory: It can be either "local" or "remote".
report:
  save_mode: "local"

  # This section contains credentials for dokuwiki instance.
  # It is mandatory if "save_mode" is "remote", else can be removed.
  dokuwiki:
    url: "http://opensupernova.org/~karandesai96/integration/"
    username: "private"
    password: "private"

  # If "save_mode" is "local", a sub directory will be made in this
  # directory according to commit hash (shortened), and it will contain
  # the complete report content.
  reportpath: "~/projects/tardis/integration"


# Path to directory where reference HDF files will be generated and
# saved during the test run. Use "--generate-reference" flag in command
# line args for the purpose, for other cases this will denote path
# to the directory containing reference HDF files.
reference: "~/projects/tardis/integration/"


# Speeds up test execution by reducing amount of packets per iteration,
# useful for debugging problems in testing infrastructure itself.
# Use "--less-packets" in command line args, for other cases this will be
# simply ignored. This section is not mandatory.
less_packets:
  no_of_packets: 400
  last_no_of_packets: 1000

Inside the atomic data directory, there needs to be atomic data for each of the setups provided in the test_integration folder. If no references are given, the first step is to generate them. The --less-packets option is useful for debugging: it uses very few packets to generate the references and thus makes the process much faster. THIS IS ONLY FOR DEBUGGING PURPOSES. The -s option ensures that TARDIS prints out its progress:

> python setup.py test \
  --args="--integration=integration.yml -m integration --generate-reference --less-packets -s"

To run the tests after the references have been generated, all that is needed is:

> python setup.py test \
  --args="--integration=integration.yml -m integration --less-packets" --remote-data

Setting up the DokuWiki report

A normal DokuWiki installation is performed on the required server. Before the connection works, one is required to enable the remote access option in the settings. If this is not done, the dokuwiki Python plugin will not connect, giving the warning DokuWikiError: syntax error: line 1, column 0. One also has to enable remote access for users (the remoteuser option); otherwise, the error ProtocolError for xmlrpc.php?p=xxxxxx&u=tardistester: 403 Forbidden will appear.

Another important configuration option is to enable embedded HTML (the htmlok option); otherwise, the reports will not be rendered as nicely formatted HTML pages.
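Concretely, all three options (remote, remoteuser, and htmlok) are standard DokuWiki configuration keys that can be set through the admin Configuration Manager or directly in conf/local.php. A sketch of the latter, with the installation path as a placeholder and tardistester as an example user:

> cat >> /path/to/dokuwiki/conf/local.php <<'EOF'
$conf['remote']     = 1;               // enable the XML-RPC remote API
$conf['remoteuser'] = 'tardistester';  // users allowed to use the remote API
$conf['htmlok']     = 1;               // allow embedded HTML in pages
EOF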

Finally, one has to call python setup.py test with the --remote-data option to allow posting to an external DokuWiki server.