.. _running-tests:
*************
Running tests
*************
In TARDIS, we focus primarily on unit tests. These tests check the outputs of individual functions, ensuring that each component behaves as expected.
Unit tests run quickly and are executed after every proposed change to TARDIS, allowing for immediate feedback and maintaining code quality.
All of them are based on the excellent ``astropy-setup-helpers`` package and
`pytest <https://docs.pytest.org/>`_.
Running the Unit Tests
======================
This is straightforward to run on your own machine. The simple unit tests (those that do
not require reference data) can be run with:

.. code-block:: shell

    > pytest tardis
Running the more advanced unit tests requires the TARDIS reference data, which can be
downloaded from the `tardis-refdata <https://github.com/tardis-sn/tardis-refdata>`_ repository.
`Git LFS <https://git-lfs.github.com/>`_ is used
to download the large reference data files in the tardis-refdata repository.
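
If you have not yet cloned the reference data, you can clone the repository while skipping
the LFS downloads so that only the lightweight pointer files are fetched at first. This is
a minimal sketch using Git LFS's standard ``GIT_LFS_SKIP_SMUDGE`` mechanism; the clone
location is up to you:

.. code-block:: shell

    > # Clone tardis-refdata without downloading the large LFS objects yet
    > GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/tardis-sn/tardis-refdata.git
    > cd tardis-refdata
    > git lfs install
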
However, you do not need to download the entire repository. First, identify which reference
data files are actually needed. Note that a fixture reused by the current tests may itself
depend on some reference data, so it is advisable to check for such indirect usage beforehand.
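
One quick, if approximate, way to spot such dependencies is to search the tests you plan to
run (and the fixtures they use) for references to the reference data. The paths and search
term below are only illustrative:

.. code-block:: shell

    > # Search the relevant test modules and fixtures for refdata usage (example paths)
    > grep -rn "refdata" tardis/plasma/tests/ tardis/conftest.py
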
After identifying the reference data files needed by the unit tests, those particular
files can be downloaded using ``git lfs``:

.. code-block:: shell

    > git lfs pull --include=filename

It is important to maintain the same directory structure as the tardis-refdata repository,
i.e. the LFS files should sit in exactly the same directory tree as they do in the
tardis-refdata repository.
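
To see which files are tracked by LFS, and to pull several of them at once, you can use the
standard ``git lfs`` commands shown below. The file names are illustrative; substitute the
ones your tests actually need:

.. code-block:: shell

    > # List all files tracked by Git LFS in the repository
    > git lfs ls-files
    > # Pull several files at once (comma-separated patterns; file names are examples)
    > git lfs pull --include="unit_test_data.h5,atom_data/kurucz_cd23_chianti_H_He.h5"
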
Finally, the tests can be run using the following command:

.. code-block:: shell

    > pytest tardis --tardis-refdata=/path/to/tardis-refdata/

Or, to run tests for a particular file or directory:

.. code-block:: shell

    > pytest tardis/path/to/test_file_or_directory --tardis-refdata=/path/to/tardis-refdata/
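
pytest's standard test selection works together with the ``--tardis-refdata`` option, so you
can also run a single test by its node ID or filter tests by name with ``-k``. The test and
file names below are placeholders:

.. code-block:: shell

    > # Run a single test function by its node ID (placeholder names)
    > pytest tardis/path/to/test_file.py::test_some_function --tardis-refdata=/path/to/tardis-refdata/
    > # Or select tests whose names match an expression
    > pytest tardis -k "some_pattern" --tardis-refdata=/path/to/tardis-refdata/
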
.. warning::

    The `tests workflow `_ runs on
    `pull requests `_ and on
    `push `_ events.
    To avoid exhausting the LFS quota, tests have been disabled on forks.
    If, by any chance, you need to run tests on your fork, make sure to run the tests
    workflow on the master branch first. The LFS cache generated on the master branch is
    then available to all child branches. You can check whether the cache was generated by
    looking at the ``Restore LFS Cache`` step of the workflow run. The cache can also be
    found under the "Management" section of the "Actions" tab.
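
    If you have a recent version of the `GitHub CLI <https://cli.github.com/>`_ installed,
    you can also list the repository's Actions caches from the command line. This is just a
    sketch; the repository slug is a placeholder:

    .. code-block:: shell

        > # List GitHub Actions caches for your fork (placeholder repository slug)
        > gh cache list --repo <your-username>/tardis
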
Generating Plasma Reference
===========================
You can generate the plasma reference with the following command:

.. code-block:: shell

    > pytest -rs tardis/plasma/tests/test_complete_plasmas.py \
        --tardis-refdata="/path/to/tardis-refdata/" --generate-reference

Running the Integration Tests
=============================
These tests require reference files against which the results of the various
TARDIS runs are compared. So you first need to either download the current
reference files (`here `_)
or generate new ones.

Both of these require a configuration file for the integration tests:

.. literalinclude:: integration.yml
    :language: yaml

The atomic data directory must contain atomic data for each of
the setups provided in the ``test_integration`` folder.
If no references are given, the first step is to generate them.
The ``--less-packets`` option uses very few packets to generate the references and thus
makes the process much faster; **this is only for debugging purposes**. The ``-s`` option
ensures that TARDIS prints out its progress:

.. code-block:: shell

    > pytest --integration=integration.yml -m integration --generate-reference --less-packets

To run the tests after having generated the references with ``--generate-reference``, all
that is needed is:

.. code-block:: shell

    > pytest --integration=integration.yml -m integration --less-packets --remote-data