Test-driven development is a common software development approach that facilitates test automation, code refactoring, and code validation.
This article goes over pytest, a popular framework for writing tests in Python. This GitHub repo contains the code snippets presented in this article.
Motivation for using pytest
unittest and the Arrange-Act-Assert model
The Arrange-Act-Assert model is a common pattern for organizing tests. We’ll demonstrate how to apply this model using both unittest and pytest.
Below is an example of a test case for a single function add_one, which adds 1 to a number, written using the unittest framework:
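The original snippet from the repo isn’t reproduced here, so below is a minimal sketch consistent with the description later in this article (add_one is defined inline for self-containment; in the repo it would likely live in a separate module):

import unittest
from unittest import TestCase


def add_one(number):
    """Return the number plus 1."""
    return number + 1


class testFunction(TestCase):
    def test_add_one(self):
        # add_one(4) should return 5
        self.assertEqual(add_one(4), 5)

    @unittest.expectedFailure
    def test_add_one_fail(self):
        # add_one(4) returns 5, not 6, so this assertion is expected to fail
        self.assertEqual(add_one(4), 6)


if __name__ == "__main__":
    unittest.main()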
To run the unit tests, you can call the script from the command line.
> python3 test_unittest.py
.x
-------------------------------------------------------------
Ran 2 tests in 0.008s
OK (expected failures=1)
For such a simple test, we had to write so much code!
After all, when first writing tests, you may simply want one test that always passes and one that always fails. We needed to do several things, including:
- Import the TestCase class
- Create a subclass (testFunction) to run your test cases
- Write a method for each test (test_add_one and test_add_one_fail)
- Use the self.assert* methods from unittest to create our assertions
- For the failure case, take the extra step of applying the @unittest.expectedFailure decorator, which is typically used when you don’t want an expected failure to count against the final result
pytest simplifies writing concise and expressive tests
pytest simplifies this workflow by allowing you to use the assert keyword directly and to write plain functions. All you need to do is give each test function a name with the test_ prefix.
Let’s write the same unit tests as above, but now in a pytest format:
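Again, the original snippet isn’t shown here; the sketch below follows the test names (test_answer and test_answer_fail) referenced later in the article, with add_one defined inline:

import pytest


def add_one(number):
    """Return the number plus 1."""
    return number + 1


def test_answer():
    # add_one(4) evaluates to 5
    assert add_one(4) == 5


@pytest.mark.xfail
def test_answer_fail():
    # add_one(4) evaluates to 5, so asserting it equals 6 is expected to fail
    assert add_one(4) == 6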
As you can see from the example code above, the same test we wrote in the unittest framework is much simpler and cleaner, using the assert statement that is built into Python. You don’t have to learn any new constructs to get started, unlike with unittest. Additionally, these tests are small and self-contained, which is essential to writing good tests.
You can run pytest on the command line to execute all the tests within the test_pytest.py file.
> pytest test_pytest.py
=================== test session starts ===================
platform linux -- Python 3.8.10, pytest-7.2.1, pluggy-1.0.0
rootdir: /pytest-example/tests/unit
collected 2 items
test_pytest.py .x [100%]
=============== 1 passed, 1 xfailed in 5.75s ===============
Let’s say that you want to run all tests within a directory. Similar to the command you would run in the parent directory with the unittest framework, python -m unittest discover -v, you can run several tests within a directory (let’s call it unit) with pytest using the following command:
# Runs tests using files within the directory `unit`, my current working directory
pytest .
=================== test session starts ===================
platform linux -- Python 3.8.10, pytest-7.2.1, pluggy-1.0.0
rootdir: /pytest-example/tests/unit
collected 4 items
test_pytest.py .x [ 50%]
test_unittest.py .x [100%]
=============== 2 passed, 2 xfailed in 5.07s ===============
Importantly, the files containing tests need to follow the form test_*.py or *_test.py for pytest to recognize them. I was also able to run the unittest code with pytest, since pytest can work with other testing frameworks as well.
Later in the article, we’ll go over how you can selectively run tests based on an expression.
Core features of pytest
In addition to letting you write simpler and more concise tests, the pytest framework has several other notable benefits:
- Fixtures and classes are supported so you can test objects selectively
- You can run tests written in other framework formats, including unittest (documented here)
- There are external plugins to assist with functional testing and Behavior-Driven Testing
- Test cases can be parallelized
We’ll discuss some of the essential components of the pytest framework.
Fixtures manage the steps and data during the arrange phase of a test
What are fixtures?
Your tests often depend on data called test doubles that mock actual data structures your code is likely to encounter.
With unittest, you extract these dependencies in the setUp() and tearDown() methods. As your test classes get larger, additional dependencies in your doubles can lead to complex interactions that make your tests difficult to reason about.
pytest lets you declare reusable elements in your tests using fixtures. Fixtures are functions that create data or test doubles, or initialize a system state for the test suite.
The code below creates a fixture example_return_1, which is a function that returns the value 1. The function is decorated with @pytest.fixture:
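A minimal sketch of that fixture (the accompanying test function name is illustrative):

import pytest


@pytest.fixture
def example_return_1():
    # This fixture simply returns the value 1
    return 1


def test_example_return_1(example_return_1):
    # pytest injects the fixture by matching the argument name
    assert example_return_1 == 1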
While the example above is really simple, you can imagine scenarios where you need to re-use a dataset as the input for multiple functions. Calling this fixture allows you to re-use that data structure. You can have multiple fixtures within your tests, and fixtures can use other fixtures as well, which helps your test suite scale.
Scaling fixtures
pytest fixtures are modular: they can be imported, can import other modules, and can depend on other fixtures. If you want to make a fixture available to your whole project without having to import it, you can create a special configuration module called conftest.py.
How does it work? pytest looks for a conftest.py module in each directory. If you add fixtures to the conftest.py module, you can use those fixtures throughout the parent directory without having to import them.
For instance, let’s create some data in conftest.py that we’ll import into another test script.
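Here is a sketch of such a conftest.py; the fixture name test_dataset comes from the article, while the data itself is an illustrative assumption:

import pytest


@pytest.fixture
def test_dataset():
    # A small sample dataset shared by tests in this directory
    return {"first_name": "Ada", "last_name": "Lovelace", "age": 36}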
To load the data, you simply use the fixture name test_dataset as an argument in the test function.
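A sketch of test_conftest.py (the test name and assertion are illustrative); note that there is no import of conftest.py:

def test_dataset_contents(test_dataset):
    # pytest finds the test_dataset fixture in conftest.py automatically
    assert test_dataset["first_name"] == "Ada"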
Now, if you’ve been following this tutorial from the beginning, you should have the following files in the test directory:
tests/unit/
|-- conftest.py
|-- test_conftest.py
|-- test_fixtures.py
|-- test_pytest.py
|-- test_unittest.py
If we want to run this test specifically, we can use the -k option. This is an example of name-based filtering, where pytest will look for tests whose names match a given expression. In this case, we’ll have pytest look for tests that match the expression conftest. Since conftest.py contains the data and is not technically a test, pytest will run test_conftest.py.
> pytest -k 'conftest.py'
=================== test session starts ===================
platform linux -- Python 3.8.10, pytest-7.2.1, pluggy-1.0.0
rootdir: /tests/unit
collected 6 items / 5 deselected / 1 selected
test_conftest.py . [100%]
============= 1 passed, 5 deselected in 0.23s =============
There are so many more things you can do with fixtures that are outside the scope of this pytest overview. But if you’re interested in learning more about pytest fixtures, you can look up the documentation here.
Filtering tests
As you write more tests, you may just want to run a subset of tests on a feature and save the full suite for later. There are a few ways of doing this in pytest, which are explained in more depth below.
Name-based filtering
You can use an expression as a filtering criterion to run tests with the -k parameter. We discussed an application of name-based filtering in the Scaling fixtures section.
Directory scoping
By default, pytest will only run tests in the current directory. However, you can run tests elsewhere by specifying a directory.
Let’s say we have a test directory that has two subdirectories for unit and integration tests:
/test/
|-- unit/
|   |-- test_unit1.py
|   |-- test_unit2.py
|-- integration/
|   |-- test_integration1.py
|   |-- test_integration2.py
We can have pytest run only the tests in a subdirectory if we do not want to run the entire test suite. To run just the unit tests, you can call the following command in the terminal:
> pytest /test/unit/
Finally, you can also run specific classes, functions, and parametrized cases individually.
Say that, in the directory structure above, test_unit1.py has the function test_function. You can run test_function specifically using the following command:
> pytest /test/unit/test_unit1.py::test_function
If you’re calling a method from a class, you can use the following command:
> pytest /test/unit/test_unit1.py::testClass::test_function
Test categorization
In addition to the other methods described above, we can set metadata on our test functions. These are called markers, and several come baked into the pytest.mark decorator.
Some commonly used marks include:
- skip: skip a test
- skipif: skip a test if the expression passed to it is True (skip and skipif are shown in the sketch after this list)
- xfail: indicates that a test is expected to fail
- parametrize: creates variants of a test with different values as arguments
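As a quick illustration (the conditions and test names below are illustrative, not from the article), skip and skipif are applied like this:

import sys

import pytest


@pytest.mark.skip(reason="not implemented yet")
def test_not_ready():
    # This test is skipped unconditionally, so the assertion never runs
    assert False


@pytest.mark.skipif(sys.version_info < (3, 10), reason="requires Python 3.10+")
def test_python_310_only():
    # Only runs on Python 3.10 or newer
    assert sys.version_info >= (3, 10)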
We actually used a mark in the first pytest test we wrote! Take a look at the code block below:
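(This repeats the relevant part of the test_pytest.py sketch from earlier.)

@pytest.mark.xfail
def test_answer_fail():
    # add_one(4) evaluates to 5, so asserting it equals 6 is expected to fail
    assert add_one(4) == 6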
The pytest.mark.xfail decorator tells pytest that a test is expected to fail, and the test_answer_fail function above will fail because we are trying to assert that 5 == 6.
Now say you want to run a specific set of tests. You can decorate specific functions with pytest.mark.<customMarker> to indicate that these functions are related to one another. This allows you to run tests with a higher level of granularity without having to alter the directory structure or change expressions within the file.
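As a sketch, suppose the custom marker is named smoke (a hypothetical name standing in for <customMarker>); you would decorate the related tests like this. Custom markers should also be registered under the markers setting in pytest.ini to avoid unknown-mark warnings.

import pytest


@pytest.mark.smoke  # hypothetical custom marker grouping related tests
def test_smoke_example():
    # Run just this group with: pytest -m smoke
    assert 1 + 1 == 2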
To run the tests containing a specific mark, you can pass the -m flag:
> pytest -m <customMarker>
Additionally, the -m flag can take expressions as well! If you want to run all tests except those marked with customMarker, you can use the following call:
> pytest -m "not <customMarker>"
Finally, you can run pytest --markers on the command line to see a list of all the marks that are available out of the box.
Test Parameterization
When you’re testing functions that manipulate data, you may find yourself testing different inputs with the same underlying structure.
We discussed pytest fixtures, which abstract away dependencies that are used in multiple tests (e.g. data). However, these are not useful when you have varying inputs and outputs. For these cases, pytest allows you to parametrize tests, so that you can test different conditions independently.
For instance, say we want to test a function that takes in text. It can take in empty strings, single letters, nouns, and sentences. You could write a test for each case. Or you can parametrize the test using the @pytest.mark.parametrize() decorator.
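A sketch of test_parameters.py (the function under test and the specific strings are illustrative assumptions):

import pytest


def format_text(text):
    # Hypothetical function under test: trim whitespace and capitalize
    return text.strip().capitalize()


@pytest.mark.parametrize(
    "text",
    [
        "",                     # empty string
        "a",                    # single letter
        "dog",                  # noun
        "the quick brown fox",  # sentence
    ],
)
def test_format_text(text):
    result = format_text(text)
    # The result should be a string with no surrounding whitespace
    assert isinstance(result, str)
    assert result == result.strip()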
When we run it, we should expect that it runs 4 separate tests, one for each string instance:
> pytest test_parameters.py
=================== test session starts ===================
platform linux -- Python 3.8.10, pytest-7.2.1, pluggy-1.0.0
rootdir: /tests/unit
collected 4 items
test_parameters.py .... [100%]
==================== 4 passed in 0.14s ====================
As you can see, there were 4 tests that passed.
Pytest Plugins
pytest is open to customizations and new features, with a rich ecosystem of plugins. You can find a comprehensive list of plugins here.
Summary
The pytest framework enables you, the developer, to write simple tests that scale well as the functionality of your applications becomes more complex. Hope this was helpful!