Filters, Markers and Expected Exceptions
Filter Tests
You can filter which tests to run using the -k option of the pytest command. The option accepts a keyword expression, which is matched against test names.
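For instance, suppose you have a hypothetical test module like this (names and bodies invented purely for illustration):

# test_items.py -- a hypothetical module to illustrate -k matching
import pytest

def test_create_raises():
    # Selected: the name contains "_raises" and not "delete"
    with pytest.raises(ValueError):
        int("not a number")

def test_delete_raises():
    # Deselected: the name contains "delete"
    with pytest.raises(KeyError):
        {}["missing"]

def test_update():
    # Deselected: the name does not contain "_raises"
    assert 1 + 1 == 2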
# This matches all tests that have `_raises` but not `delete` in their name
pytest -k "_raises and not delete"

Mark Tests
pytest lets you put markers on tests, i.e. attach labels to them so pytest can treat them differently. A test can have more than one marker, and a marker can be applied to multiple tests. Say you want to mark certain tests as "slow" so that you can skip them later; you can do so using the @pytest.mark decorator, as shown below:
@pytest.mark.slow
def test_big_computation():
    ...

These custom markers must be declared in pytest.ini:
# pytest.ini
[pytest]
addopts = --strict-markers
markers =
    slow: marks tests as slow (deselect with '-m "not slow"')

With --strict-markers, any unknown marks applied with the @pytest.mark.name_of_the_mark decorator will trigger an error.
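Once your markers are registered, you can select or deselect marked tests from the command line with the -m option:

# Run only the tests marked as slow
pytest -m slow

# Run everything except the slow tests
pytest -m "not slow"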
pytest also ships with a few helpful built-in markers, such as skip, skipif, and xfail.
Skip Tests
- The simplest way to skip a test function is to mark it with the skip decorator, which may be passed an optional reason:
@pytest.mark.skip(reason="no way of currently testing this")
def test_the_unknown(): ...

- If you want to skip a test imperatively during execution, you can also use the pytest.skip(reason) function:
def test_function():
    if not valid_config():
        pytest.skip("unsupported configuration")
- If you wish to skip something conditionally, you can use skipif instead:
@pytest.mark.skipif(sys.version_info < (3, 13), reason="requires python3.13 or higher")
def test_function(): ...
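As a side note, a skipif marker can be assigned to a variable and reused across several tests; a small sketch of that pattern, with an illustrative version bound:

import sys
import pytest

# Define the condition once, apply it wherever needed
requires_py313 = pytest.mark.skipif(
    sys.version_info < (3, 13), reason="requires python3.13 or higher"
)

@requires_py313
def test_feature_one(): ...

@requires_py313
def test_feature_two(): ...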
Expected Failures
- To mark a test function as expected to fail, use the xfail marker:
@pytest.mark.xfail
def test_function(): ...

- Similar to skipping tests, you can also mark a test as XFAIL imperatively during execution, using the pytest.xfail() function:
WARNING
Calling pytest.xfail() immediately marks the test as XFAIL and stops executing the rest of the test, because internally it works by raising a special exception. This is different from the xfail marker, which still runs the full test body.
def test_function():
    if not valid_config():
        pytest.xfail("failing configuration (but should work)")
- You can also make the marker conditional by passing a condition to xfail, along with a reason explaining why it's expected to fail:
@pytest.mark.xfail(sys.version_info < (3, 13), reason="not supported for Python versions < 3.13")
def test_function():
    ...
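The marker also accepts strict=True, which turns an unexpected pass (XPASS) into a hard failure; a small self-contained sketch:

import pytest

@pytest.mark.xfail(strict=True, reason="known rounding surprise")
def test_rounding():
    # Fails as expected: binary floats make round(2.675, 2) return 2.67.
    # With strict=True, an unexpected pass would fail the suite instead.
    assert round(2.675, 2) == 2.68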
Expected Exceptions
- To verify that a certain error is raised when the code runs, you can use pytest.raises() as a context manager:
# This test fails if no exception is raised or if the raised exception is not ZeroDivisionError
def test_zero_division():
    with pytest.raises(ZeroDivisionError):
        1 / 0  # any code that raises ZeroDivisionError passes the test

- You can also get access to the actual exception message using something like:
def test_recursion_depth():
    with pytest.raises(RuntimeError) as excinfo:
        def f():
            f()
        f()  # infinite recursion raises RecursionError, a RuntimeError subclass
    assert "maximum recursion" in str(excinfo.value)
- Alternatively, you can pass a match keyword parameter to match the exception message against a regular expression:
def test_recursion_depth():
    with pytest.raises(RuntimeError, match=r".* recursion"):
        def f():
            f()
        f()
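Finally, pytest.raises() also accepts a tuple of exception types when more than one error would be acceptable, mirroring how an except clause works:

import pytest

def test_bad_lookup():
    # Passes if either a KeyError or an IndexError is raised
    with pytest.raises((KeyError, IndexError)):
        {}["missing"]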