.. _running-tests:

*************
Running tests
*************

There are two basic categories of tests in TARDIS: 1) the unit tests, and 2) the integration tests. Unit tests check the outputs of individual functions, while the integration tests check entire TARDIS runs for different setups. The unit tests run very quickly and are therefore executed after every suggested change to TARDIS. The integration tests are much more costly and are therefore only executed every few days on a dedicated server. All of them are based on the excellent ``astropy-setup-helpers`` package and pytest.

Running the Unit Tests
======================

This is very straightforward to run on your own machine. For the simple unit tests, you can run:

.. code-block:: shell

    > pytest tardis

Running the more advanced unit tests requires the TARDIS reference data, which can be downloaded from the tardis-refdata repository. Git LFS is used to store the large reference data files in that repository; however, it is not necessary to download the entire repository. First, identify the reference data files that are needed. Note that a fixture reused by the current tests may itself rely on reference data, so it is advisable to check for such cases beforehand. After identifying the reference data files needed by the unit tests, download those particular files with ``git lfs``:

.. code-block:: shell

    > git lfs pull --include=filename

It is important to maintain the same directory structure as the tardis-refdata repository, i.e. the LFS files should sit in exactly the same directory tree as in tardis-refdata. Finally, the tests can be run using the following command:

.. code-block:: shell

    > pytest tardis --tardis-refdata=/path/to/tardis-refdata/

Or, to run the tests for a particular file or directory:

.. code-block:: shell

    > pytest tardis/path/to/test_file_or_directory --tardis-refdata=/path/to/tardis-refdata/
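The comparison at the heart of these reference-data unit tests can be sketched as follows. This is an illustrative stand-in, not TARDIS's actual test code: the quantity name ``luminosity``, the ``compute_result`` helper, and the in-memory "reference" (which a real test would read from the tardis-refdata HDF store) are all hypothetical.

```python
import numpy as np
import pandas as pd
import pandas.testing as pdt


def compute_result():
    # Stand-in for a quantity produced by a TARDIS function under test.
    return pd.Series(np.linspace(0.0, 1.0, 5), name="luminosity")


# Stand-in for the stored reference; a real unit test would load this
# from the tardis-refdata directory passed via --tardis-refdata.
reference = pd.Series(np.linspace(0.0, 1.0, 5), name="luminosity")

result = compute_result()
# check_exact=False relaxes the comparison to a floating-point tolerance,
# which is the usual choice when results carry numerical noise.
pdt.assert_series_equal(result, reference, check_exact=False)
print("reference comparison passed")
```

A failing comparison raises an ``AssertionError`` describing the mismatching values, which is how a reference-data test reports a regression.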
.. warning::

    The tests workflow runs on pull request and push events. To prevent leaking LFS quota, tests have been disabled on forks. If, by any chance, you need to run tests on your fork, make sure to run the tests workflow on the master branch first. The LFS cache generated on the master branch should then be available in all child branches. You can check whether the cache was generated by looking at the ``Restore LFS Cache`` step of the workflow run. The cache can also be found under the "Management" section of the "Actions" tab.

Running Syrupy Tests
====================

The tests module is currently being restructured to use Syrupy. These tests generate an individual snapshot file for each test case: HDF files for Pandas objects and ``.npy`` or ``.npz`` files for NumPy arrays. Other objects are serialised by the plugin and saved as ``.ambr`` files. A custom extension in the main local ``conftest`` file makes this work. The plugin currently only supports ``assert_allclose`` for NumPy and ``assert_series_equal`` and ``assert_frame_equal`` for Pandas. Snapshots can be generated using the same ``--generate-reference`` flag and are compared automatically. You do, however, need to provide the location of your snapshot directory using the ``--tardis-snapshot-data`` flag. The snapshots are also saved in the ``tardis-sn/tardis-regressions`` repository.

Generating Plasma Reference
===========================

You can generate the plasma reference with the following command:

.. code-block:: shell

    > pytest -rs tardis/plasma/tests/test_complete_plasmas.py --tardis-refdata="/path/to/tardis-refdata/" --generate-reference

Running the Integration Tests
=============================

These tests require reference files against which the results of the various TARDIS runs are tested. So you first need to either download the current reference files or generate new ones. Both of these require a configuration file for the integration tests:
.. literalinclude:: integration.yml
    :language: yaml

Inside the atomic data directory there needs to be atomic data for each of the setups provided in the ``test_integration`` folder. If no references are given, the first step is to generate them. The ``--less-packets`` option is useful for debugging purposes and will use very few packets to generate the references, making the process much faster --- THIS IS ONLY FOR DEBUGGING PURPOSES. The ``-s`` option ensures that TARDIS prints out its progress:

.. code-block:: shell

    > python setup.py test --args="--integration=integration.yml -m integration --generate-reference --less-packets"

To run the tests after having generated the references with ``--generate-reference``, all that is needed is:

.. code-block:: shell

    > python setup.py test --args="--integration=integration.yml -m integration --less-packets" --remote-data

Setting up the DokuWiki report
==============================

A normal DokuWiki installation is performed on the required server. Before the connection works, the remote-access option must be enabled in the settings; otherwise the ``dokuwiki`` Python plugin will fail to connect with the error ``DokuWikiError: syntax error: line 1, column 0``. Remote access also has to be enabled for users (the ``remoteuser`` option), otherwise the error ``ProtocolError for xmlrpc.php?p=xxxxxx&u=tardistester: 403 Forbidden`` will appear. Another important configuration option is ``htmlok``, which enables embedded HTML; without it, the nice HTML page reports will not be shown. Finally, ``python setup.py test`` has to be called with the ``--remote-data`` option to allow posting to an external DokuWiki server.
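The generate-then-compare cycle that both the integration tests and the snapshot tests follow can be sketched as below. Everything here is a hypothetical stand-in for illustration: ``run_setup`` mimics a deterministic test run, the ``spectrum.npy`` file name and the tolerance are invented, and a real run would write its references into the directory given by ``--generate-reference``-style options rather than a temporary one.

```python
import os
import tempfile

import numpy as np


def run_setup():
    # Stand-in for one integration-test setup; seeded so reruns are identical.
    rng = np.random.default_rng(0)
    return rng.normal(size=100)


with tempfile.TemporaryDirectory() as refdir:
    ref_path = os.path.join(refdir, "spectrum.npy")

    # "Generate reference" pass: store the run's output as the new reference.
    np.save(ref_path, run_setup())

    # Ordinary pass: rerun the setup and compare against the stored reference
    # to a floating-point tolerance, as the NumPy-based comparisons do.
    np.testing.assert_allclose(run_setup(), np.load(ref_path), rtol=1e-7)
    print("integration reference matched")
```

If the rerun drifts outside the tolerance, ``assert_allclose`` raises with a summary of the largest mismatches, which is the failure mode these tests are designed to surface.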