Running tests

In TARDIS, we focus primarily on unit tests. These tests check the outputs of individual functions, ensuring that each component behaves as expected.

Unit tests run quickly and are executed after every suggested change to TARDIS, allowing for immediate feedback and maintaining code quality.

All of our tests are based on pytest and the astropy-helpers package.

Running the Unit Tests

Running the unit tests on your own machine is straightforward. For the simple unit tests, run:

> pytest tardis

Running the more advanced unit tests requires the TARDIS reference data, which can be downloaded from the tardis-refdata repository. That repository uses Git LFS to store its large reference data files.

However, it is not necessary to download the entire repository. First, identify which refdata files are actually needed. Note that a fixture defined elsewhere but reused by the current tests may itself depend on refdata, so check for such indirect dependencies beforehand.
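To illustrate this kind of indirect dependency, here is a small sketch (the fixture names, file names, and the helper itself are invented for illustration; they are not real TARDIS fixtures) of collecting every refdata file needed by the fixtures a test uses:

```python
# Hypothetical helper: map each fixture to the refdata files it reads and
# collect everything needed by the fixtures a test uses (all names here are
# illustrative, not actual TARDIS fixtures or files).
def refdata_needed(fixture_files, fixtures_used):
    """Return the sorted union of refdata files required by fixtures_used."""
    needed = set()
    for name in fixtures_used:
        needed.update(fixture_files.get(name, ()))
    return sorted(needed)


fixture_files = {
    "simulation_fixture": ["montecarlo/output.h5"],
    "plasma_fixture": ["plasma_reference/reference.h5"],
}
# A test that reuses "plasma_fixture" implicitly needs its refdata too.
print(refdata_needed(fixture_files, ["simulation_fixture", "plasma_fixture"]))
# -> ['montecarlo/output.h5', 'plasma_reference/reference.h5']
```

The union produced this way is exactly the list of files to pass to git lfs pull.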

After identifying the refdata files needed by the unit tests, download those particular files using Git LFS:

> git lfs pull --include=filename

It is important to maintain the same directory structure as the tardis-refdata repository, i.e. the LFS files must sit in exactly the same directory tree locally as they do in tardis-refdata.
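Before running pytest, a quick check that the pulled files mirror the repository layout can save a failed run. A minimal sketch, assuming you know the relative paths your tests need (the helper and the example file names are illustrative, not part of TARDIS):

```python
from pathlib import Path
import tempfile


def missing_refdata(refdata_root, required_relative_paths):
    """Return the required relative paths that are absent under refdata_root."""
    root = Path(refdata_root)
    return [p for p in required_relative_paths if not (root / p).is_file()]


# Example with a throwaway directory standing in for a local tardis-refdata
# checkout; "atom_data/example.h5" and "unit_test_data.h5" are placeholders.
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "atom_data").mkdir()
    (Path(tmp) / "atom_data" / "example.h5").touch()
    print(missing_refdata(tmp, ["atom_data/example.h5", "unit_test_data.h5"]))
    # -> ['unit_test_data.h5']
```

An empty result means every required file is in place at the expected relative path.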

Finally, the tests can be run using the following command:

> pytest tardis --tardis-refdata=/path/to/tardis-refdata/

Or, to run tests for a particular file or directory:

> pytest tardis/path/to/test_file_or_directory --tardis-refdata=/path/to/tardis-refdata/

Warning

The tests workflow runs on pull request and push events. To avoid exhausting the LFS quota, tests are disabled on forks. If you need to run tests on your fork, run the tests workflow on the master branch first: the LFS cache generated on master is then available in all child branches. You can check whether the cache was generated by looking at the Restore LFS Cache step of the workflow run. Caches are also listed under the “Management” section of the “Actions” tab.

Generating Plasma Reference

You can generate the Plasma Reference with the following command:

> pytest -rs tardis/plasma/tests/test_complete_plasmas.py \
    --tardis-refdata="/path/to/tardis-refdata/" --generate-reference

Running the Integration Tests

These tests require reference files against which the results of the various TARDIS runs are compared. You first need to either download the current reference files (here) or generate new ones.

Both of these require a configuration file for the integration tests:

atom_data_path: "~/projects/tardis/integration/"

# Path to the directory where reference HDF files will be generated and
# saved during the test run. Pass the "--generate-reference" flag on the
# command line for that purpose; otherwise this denotes the path to the
# directory containing the reference HDF files.
reference: "~/projects/tardis/integration/"


# Speeds up test execution by reducing the number of packets per iteration;
# useful for debugging problems in the testing infrastructure itself.
# Pass "--less-packets" on the command line; otherwise this section is
# simply ignored. This section is not mandatory.
less_packets:
  no_of_packets: 400
  last_no_of_packets: 1000
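Before launching a long integration run, the loaded configuration can be sanity-checked along these lines. This is a sketch under the assumption that the dictionary keys mirror the YAML above; the helper is not part of TARDIS:

```python
def validate_integration_config(config):
    """Raise ValueError if the integration-test config dict is malformed."""
    required = {"atom_data_path", "reference"}
    missing = required - config.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    less = config.get("less_packets")  # optional section
    if less is not None:
        for key in ("no_of_packets", "last_no_of_packets"):
            if not isinstance(less.get(key), int):
                raise ValueError(f"less_packets.{key} must be an integer")
    return True


config = {
    "atom_data_path": "~/projects/tardis/integration/",
    "reference": "~/projects/tardis/integration/",
    "less_packets": {"no_of_packets": 400, "last_no_of_packets": 1000},
}
print(validate_integration_config(config))  # -> True
```

Catching a missing key here is much cheaper than discovering it partway through a multi-setup integration run.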

Inside the atomic data directory there needs to be atomic data for each of the setups provided in the test_integration folder. If no references are given, the first step is to generate them. The --less-packets option is useful for debugging: it uses very few packets to generate the references, making the process much faster. THIS IS ONLY FOR DEBUGGING PURPOSES. The -s option ensures that TARDIS prints out its progress:

> pytest --integration=integration.yml -m integration --generate-reference --less-packets
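The effect of --less-packets can be pictured with a short sketch: when the flag is passed and the optional section is present, its counts replace the full-resolution defaults (the default numbers below are invented placeholders, not the real TARDIS values):

```python
def packet_counts(config, use_less_packets):
    """Choose packet counts for a run, falling back to full-resolution defaults."""
    # Placeholder defaults for illustration only; not TARDIS's actual values.
    defaults = {"no_of_packets": 200_000, "last_no_of_packets": 500_000}
    if use_less_packets and "less_packets" in config:
        return dict(config["less_packets"])
    return defaults


config = {"less_packets": {"no_of_packets": 400, "last_no_of_packets": 1000}}
print(packet_counts(config, use_less_packets=True))
# -> {'no_of_packets': 400, 'last_no_of_packets': 1000}
```

Without the flag (or without the section in the config file), the reduced counts are simply ignored, matching the behaviour described in the config comments above.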

To run the tests after the references have been generated with --generate-reference, all that is needed is:

> pytest --integration=integration.yml -m integration --less-packets --remote-data