Fading Coder

One Final Commit for the Last Sprint

Complete Usage and Configuration Reference for pytest Fixtures

Fixtures are a core pytest mechanism for managing test environment setup and teardown. They let you define reusable preconditions, resources, or state that runs before (and optionally after) test execution, ensuring consistent test isolation, reducing code duplication, and improving test suite maintainability. There are two primary fixture variants: parameterized fixtures that enable data-driven testing by passing dynamic input sets to test cases, and standard fixtures that provide fixed resources or environment configurations for test runs.

To create a fixture, you define a function decorated with @pytest.fixture; the function can itself request other fixtures by declaring them as parameters. The logic inside the fixture function handles resource initialization, and you can optionally include teardown logic after a yield statement that runs once the dependent test completes.

Fixture Configuration Parameters

The @pytest.fixture decorator accepts the following configuration parameters:

pytest.fixture(scope="function", params=None, autouse=False, ids=None, name=None)
  • scope: Defines the lifecycle of the fixture. Valid values from narrowest to widest are function (default, runs once per test function/method), class (runs once per test class), module (runs once per .py test file), package (runs once per Python package), session (runs once for the entire test suite execution).
  • params: An optional list of input values. If provided, the fixture will be executed once for each value in the list, and all tests that depend on the fixture will run against every parameterized value, enabling bulk test execution across multiple input sets.
  • autouse: When set to True, the fixture is automatically applied to all tests within its scope without explicit referencing. If False (default), tests must explicitly declare the fixture as a parameter or use the @pytest.mark.usefixtures decorator to activate it.
  • ids: A list of custom identifiers corresponding to each entry in params, used to generate human-readable test case names in test output. If not provided, identifiers are auto-generated from the params values.
  • name: A custom alias for the fixture, which defaults to the name of the decorated function. This is useful to avoid naming conflicts if the original function name would clash with other fixtures or test parameters.

Scope Lifecycle Example

First, create a conftest.py file in your test root, which stores shared fixtures accessible across the entire test suite:

import pytest

@pytest.fixture(scope="session", autouse=True)
def session_lifecycle():
    print("Running pre-session setup steps")
    yield
    print("Running post-session teardown steps")

@pytest.fixture(scope="package", autouse=True)
def package_lifecycle():
    print("Running pre-package setup steps")
    yield
    print("Running post-package teardown steps")

@pytest.fixture(scope="module", autouse=True)
def module_lifecycle():
    print("Running pre-module setup steps")
    yield
    print("Running post-module teardown steps")

@pytest.fixture(scope="class", autouse=True)
def class_lifecycle():
    print("Running pre-class setup steps")
    yield
    print("Running post-class teardown steps")

@pytest.fixture(scope="function", autouse=True)
def function_lifecycle():
    print("Running pre-function setup steps")
    yield
    print("Running post-function teardown steps")

@pytest.fixture(scope="function", name="custom_test_data")
def non_autouse_fixture():
    print("Running custom pre-test setup for non-autouse fixture")
    yield
    print("Running custom post-test teardown for non-autouse fixture")

Next, create a test file test_shopping_flows.py with the following test cases:

import pytest

class TestCheckoutFlows:
    def test_one_click_purchase(self):
        print("Executing one-click purchase test")

    def test_scheduled_purchase(self):
        print("Executing scheduled purchase test")

    def test_add_item_to_cart(self):
        print("Executing add to cart test")

    # Explicitly request the fixture via decorator
    @pytest.mark.usefixtures("custom_test_data")
    def test_product_search(self):
        print("Test executed with explicitly called fixture via decorator")

    # Explicitly request the fixture via test parameter
    def test_coupon_application(self, custom_test_data):
        print("Test executed with explicitly called fixture via parameter")

When you run the test suite, the output will show the execution order of fixtures based on their scope:

test_shopping_flows.py::TestCheckoutFlows::test_one_click_purchase
Running pre-session setup steps
Running pre-package setup steps
Running pre-module setup steps
Running pre-class setup steps
Running pre-function setup steps
PASSED [20%] Executing one-click purchase test
Running post-function teardown steps

test_shopping_flows.py::TestCheckoutFlows::test_scheduled_purchase
Running pre-function setup steps
PASSED [40%] Executing scheduled purchase test
Running post-function teardown steps

test_shopping_flows.py::TestCheckoutFlows::test_add_item_to_cart
Running pre-function setup steps
PASSED [60%] Executing add to cart test
Running post-function teardown steps

test_shopping_flows.py::TestCheckoutFlows::test_product_search
Running pre-function setup steps
Running custom pre-test setup for non-autouse fixture
PASSED [80%] Test executed with explicitly called fixture via decorator
Running custom post-test teardown for non-autouse fixture
Running post-function teardown steps

test_shopping_flows.py::TestCheckoutFlows::test_coupon_application
Running pre-function setup steps
Running custom pre-test setup for non-autouse fixture
PASSED [100%] Test executed with explicitly called fixture via parameter
Running custom post-test teardown for non-autouse fixture
Running post-function teardown steps
Running post-class teardown steps
Running post-module teardown steps
Running post-package teardown steps
Running post-session teardown steps

Fixture Dependency and Data Return Example

Fixtures can return preconfigured data for test consumption, and can depend on other fixtures for nested resource creation:

import pytest

@pytest.fixture
def base_user_profile():
    return {"username": "alice123", "birth_year": 1992}

@pytest.fixture
def full_user_account(base_user_profile):
    extended_profile = base_user_profile.copy()
    extended_profile.update({
        "email": "alice@example.org",
        "is_verified": True,
        "storage_quota": 500
    })
    return extended_profile

def test_profile_data(base_user_profile):
    assert base_user_profile["username"] == "alice123"
    assert base_user_profile["birth_year"] == 1992

def test_full_account_data(full_user_account):
    assert full_user_account["username"] == "alice123"
    assert full_user_account["email"] == "alice@example.org"
    assert full_user_account["is_verified"] is True
