
Testing



File Structure

For testing purposes, it is recommended to replicate the project structure that we are going to test.

For the example we have, it would look like this:
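A sketch of such a layout, using an illustrative calculator package (the directory and file names are assumptions used throughout this page, not taken from the original example):

project/
├── calculator/
│   ├── __init__.py
│   └── arithmetic.py
├── test/
│   └── calculator/
│       ├── conftest.py
│       └── test_arithmetic.py
└── pytest.ini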

The directories in the root contain the classes or functions to be tested, and the test directory mirrors that structure and holds the corresponding tests.

Labeling the Different Tests

To properly label the different tests, we need to follow these steps:

Step 1:

Create a file in the root of the project called pytest.ini where we will put all the labels we will use in the project.

Step 2:

In the file created in the previous step, we will configure the different labels to be used in the project.

In this file, each label is declared together with a brief description of what the tests marked with it will do.
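A minimal sketch of what pytest.ini might contain, assuming two illustrative labels, arithmetic and slow:

[pytest]
markers =
    arithmetic: tests for the basic arithmetic operations
    slow: long-running tests (deselect with '-m "not slow"')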

NOTE: When we add (deselect with '-m "not slow"') to the description of a label, we are indicating that tests carrying that label (in this case, slow) can be skipped during the global execution of the tests, since those tests might take longer. The label does not have to be exactly as shown in the example; it can be any label.

To execute the tests while omitting those labeled as slow, it would be done as follows:

pytest -m "not slow"

Step 3:

To tell pytest which label each test belongs to, we have to specify it in the code with the following decorator:

@pytest.mark.arithmetic

After mark. we put the label previously configured in the pytest.ini file.
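For instance, a test carrying the arithmetic label might look like this (the test function and the Calculator import are illustrative assumptions):

import pytest

from calculator.arithmetic import Calculator


@pytest.mark.arithmetic
def test_add():
    # This test carries the arithmetic label and can be selected with: pytest -m arithmetic
    calculator = Calculator()
    assert calculator.add(2, 3) == 5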

Step 4:

To run the tests, we must consider the following:

  • Have the pytest package previously installed: pip install pytest

  • For best practices, use virtual environments. In the example, we used the virtualenv package; its installation is straightforward:

    1. pip install virtualenv

    2. Create the virtual environment: py -m virtualenv venv

    3. Activate the environment (venv\Scripts\activate on Windows, source venv/bin/activate on Linux/macOS) and install the necessary packages for the project

    4. Execute the test: pytest -m arithmetic

Creating Mocks for pytest

To create mocks for pytest, it is recommended to use a file called conftest.py. pytest discovers this file automatically, so the fixtures defined in it are available to the tests without explicit imports; by our convention we always use this name. In this file, we will create instances of the classes to be tested, so that the mocks and classes to be used are centralized. Below is a brief example.

Class:
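A minimal sketch of a class to be tested; the Calculator class, its methods, and the file path are illustrative assumptions used in the rest of this page:

# calculator/arithmetic.py
import pandas as pd


class Calculator:
    def add(self, num_1, num_2):
        # Basic arithmetic operation used in the examples below
        return num_1 + num_2

    def load_data_from_csv(self, path):
        # Loads data for the calculator from a CSV file
        return pd.read_csv(path)

    def load_data_from_json(self, path):
        # Loads data for the calculator from a JSON file
        return pd.read_json(path)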

conftest.py:

For this file, we will use the @pytest.fixture decorator, which provides data or configurations needed for the tests.
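A minimal sketch of conftest.py, assuming the Calculator class above (the fixture name is illustrative):

# test/calculator/conftest.py
import pytest

from calculator.arithmetic import Calculator


@pytest.fixture
def calculator():
    # Provides a fresh Calculator instance to any test that requests it
    return Calculator()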

Advantages of Using Fixtures

  1. Reusability: You can reuse the same setup in multiple tests.

  2. Modularity: Keeps setup code separate from tests.

  3. Maintenance: Easier to modify the test setup without changing all tests.

Creating Tests by Classes

For this example, we will create a test class with a setup method responsible for initializing the instances and attributes needed by the different tests. The setup method is decorated with @pytest.fixture(autouse=True); the autouse=True parameter makes it run automatically, so the fixtures and attributes configured in it are initialized for every test without being requested explicitly. This decorator works similarly to beforeEach/beforeAll in Jest, depending on the fixture scope. Below is a brief example:
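A minimal sketch of such a class, reusing the illustrative Calculator; the mock CSV string defined here is also used in the pandas examples further down:

# test/calculator/test_arithmetic.py
import pytest

from calculator.arithmetic import Calculator


class TestCalculator:
    @pytest.fixture(autouse=True)
    def setup(self):
        # Runs automatically before every test in this class
        self.calculator = Calculator()
        self.data_mock_csv = "num_1,num_2\n1,2\n3,4"

    @pytest.mark.arithmetic
    def test_add(self):
        assert self.calculator.add(2, 3) == 5, "2 + 3 should be 5"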

Note: To validate the tests, we use the assert keyword, which expects a boolean expression indicating whether the condition is met. If the expression is false, pytest raises an AssertionError with a message that we can customize:

x = 5
assert x == 10, "x should be 10"  # Raises AssertionError with the message "x should be 10"

Creating Mocks with pandas

To read the previously created mocks, we need two modules from Python's standard library: unittest.mock (for patch) and io (for StringIO). These allow us to create and read the data loaded previously in the setup fixture.

Below is a brief example of its use:
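A sketch of such a test, written as another method inside the TestCalculator class shown earlier; it assumes from io import StringIO, from unittest.mock import patch, and import pandas as pd at the top of the test file, and the illustrative load_data_from_csv method:

    def test_load_data_from_csv(self):
        # Temporarily replace pandas.read_csv with a mock version
        with patch("pandas.read_csv",
                   return_value=pd.read_csv(StringIO(self.data_mock_csv))):
            data = self.calculator.load_data_from_csv("any_path.csv")
        # The mock returns the two rows defined in data_mock_csv
        assert len(data) == 2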

With this function, we use patch from unittest.mock to temporarily replace the pandas.read_csv function with a mock version.

return_value=pd.read_csv(StringIO(self.data_mock_csv)) defines what the mock function will return. In this case, it returns the result of reading the CSV from self.data_mock_csv using pd.read_csv(StringIO(self.data_mock_csv)).

StringIO(self.data_mock_csv) converts the text string self.data_mock_csv into a file-like object that pd.read_csv can read.

Within the with context, it calls the load_data_from_csv method of the self.calculator instance. The function is designed to load data from a CSV file.

The same applies to JSON:
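A sketch of the JSON counterpart, assuming an analogous load_data_from_json method and an illustrative JSON string:

    def test_load_data_from_json(self):
        data_mock_json = '[{"num_1": 1, "num_2": 2}, {"num_1": 3, "num_2": 4}]'
        # Temporarily replace pandas.read_json with a mock version
        with patch("pandas.read_json",
                   return_value=pd.read_json(StringIO(data_mock_json))):
            data = self.calculator.load_data_from_json("any_path.json")
        assert len(data) == 2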

Creating Parameterized Tests

To create this type of test, we will use the @pytest.mark.parametrize decorator, which expects two arguments: a string such as "num_1, num_2, result", naming the variables used in the test, and a list such as [(1, 2, 3), (2, 1, 1), (2, 3, 6), (6, 3, 2)], containing the values those variables take in each run together with the expected result. The first tuple is equivalent to saying num_1 = 1, num_2 = 2, and result = 3, and so on for every entry in the list. Note that the data stored in the list should be primitive values, since the idea is only to check the functionality of the method. Below is a brief example:
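A sketch of such a test, assuming the illustrative add method; the sample tuples below are chosen so that each one satisfies num_1 + num_2 == result:

import pytest

from calculator.arithmetic import Calculator


@pytest.mark.parametrize("num_1, num_2, result", [(1, 2, 3), (2, 1, 3), (2, 3, 5), (6, 3, 9)])
def test_add(num_1, num_2, result):
    # The test runs once for every tuple in the list
    calculator = Calculator()
    assert calculator.add(num_1, num_2) == result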


