In this blog, you will learn how to use Kafka Testing, a set of testing tools and frameworks that allows you to test your code with Kafka, using Python. You will also learn how to use different techniques and methods, such as unit testing, integration testing, and load testing, to verify your code's quality and functionality.
1. Introduction
Testing your code is an essential part of software development. It helps you ensure that your code works as expected, meets the requirements, and does not contain any bugs or errors. Testing your code also helps you improve its quality, performance, and functionality.
However, testing your code can be challenging, especially when you are working with complex and distributed systems, such as Kafka. Kafka is a popular open-source platform for streaming data, which allows you to process and analyze large amounts of data in real-time. Kafka is widely used for various applications, such as messaging, analytics, event-driven architectures, and microservices.
How can you test your code that interacts with Kafka? How can you verify that your code can produce and consume data from Kafka topics, handle errors and failures, and scale up and down according to the load? How can you measure the performance and efficiency of your code in different scenarios?
In this blog, you will learn how to use Kafka Testing, a set of testing tools and frameworks that allows you to test your code with Kafka. You will also learn how to use different techniques and methods, such as unit testing, integration testing, and load testing, to verify your code quality and functionality. You will use Python as the programming language for your code, and Pytest and Locust as the testing frameworks.
By the end of this blog, you will be able to:
- Set up Kafka Testing with Python
- Write unit tests with Kafka Testing and Pytest
- Write integration tests with Kafka Testing and Pytest
- Write load tests with Kafka Testing and Locust
- Analyze test results and improve your code
Ready to start testing your code with Kafka? Let’s begin!
2. What is Kafka Testing and Why Use It?
Kafka Testing is a set of testing tools and frameworks that allows you to test your code with Kafka. It enables you to create and manage Kafka clusters, topics, producers, and consumers in your test environment. It also provides you with various methods and assertions to verify the behavior and output of your code.
Why use Kafka Testing? There are several benefits of using Kafka Testing for your code, such as:
- It simplifies the setup and configuration of Kafka for testing purposes. You don’t need to install or run Kafka separately, as Kafka Testing can create and run Kafka clusters in memory or in Docker containers.
- It allows you to test your code in isolation, without affecting the production environment. You can create and delete Kafka topics, producers, and consumers on the fly, and control their properties and parameters.
- It helps you ensure the quality and functionality of your code. You can use different techniques and methods, such as unit testing, integration testing, and load testing, to check the correctness, performance, and scalability of your code.
- It supports multiple testing frameworks and languages, such as Pytest for Python, and JUnit, TestNG, and ScalaTest for Java and Scala. You can use the testing framework and language of your choice to write and run your tests.
How does Kafka Testing work? Kafka Testing consists of two main components: Kafka TestKit and Kafka TestContainers.
Kafka TestKit is a library that provides you with various classes and methods to create and manage Kafka clusters, topics, producers, and consumers in your test environment. It also provides you with various assertions and matchers to verify the behavior and output of your code. Kafka TestKit is based on the official Kafka clients and supports both Java and Scala.
Kafka TestContainers is a library that allows you to run Kafka clusters in Docker containers. It uses the TestContainers framework to create and manage Docker containers for Kafka and its dependencies, such as ZooKeeper. Kafka TestContainers is compatible with Kafka TestKit and supports both Java and Python.
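To get a quick feel for what Kafka TestContainers does, here is a minimal sketch (it assumes the testcontainers package is installed and Docker is running) that starts a throwaway Kafka broker in a container and prints its bootstrap address. Note that the exact name of the accessor method can differ between library versions.
from testcontainers.kafka import KafkaContainer

# Start a disposable Kafka broker in Docker and print its address.
# Depending on your testcontainers version, the accessor may be named
# get_bootstrap_server() or get_bootstrap_servers().
with KafkaContainer() as kafka:
    print(kafka.get_bootstrap_server())
The container is stopped and removed automatically when the with block exits, so nothing is left running after the check.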
In this blog, you will use Kafka TestKit and Kafka TestContainers with Python to test your code with Kafka. You will also use Pytest as the testing framework to write and run your tests.
What are the prerequisites for using Kafka Testing? To use Kafka Testing, you need to have the following installed on your machine:
- Python 3.6 or higher
- Pip, the Python package manager
- Docker, the software platform for building and running applications in containers
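A quick way to confirm these prerequisites is to check the installed versions from your terminal:
python3 --version   # should report Python 3.6 or higher
pip --version
docker --version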
Are you ready to learn how to use Kafka Testing with Python? Let’s move on to the next section, where you will learn how to set up Kafka Testing with Python.
3. How to Set Up Kafka Testing with Python
In this section, you will learn how to set up Kafka Testing with Python on your machine. You will install the required packages and libraries, and create a simple Python project to test your code with Kafka.
The first step is to install the Kafka Testing packages and libraries. You will use Pip, the Python package manager, to install them. You will need to install the following packages:
- kafka-python: This is a widely used Python client for Apache Kafka. It provides you with the classes and methods to interact with Kafka clusters, topics, producers, and consumers. You can install it with the command:
pip install kafka-python
- pytest: This is a popular testing framework for Python. It allows you to write and run your tests in a simple and elegant way. You can install it with the command:
pip install pytest
- pytest-kafka: This is a plugin for Pytest that integrates with Kafka TestKit. It provides you with the fixtures and helpers to create and manage Kafka clusters, topics, producers, and consumers in your tests. You can install it with the command:
pip install pytest-kafka
- testcontainers: This is a Python library that allows you to run Docker containers in your tests. It uses the TestContainers framework to create and manage Docker containers for various services, such as Kafka, ZooKeeper, MySQL, etc. You can install it with the command:
pip install testcontainers
- kafka-testcontainer: This is a Python library that integrates with Kafka TestContainers. It provides you with the classes and methods to run Kafka clusters in Docker containers in your tests. You can install it with the command:
pip install kafka-testcontainer
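If you prefer, you can install all of these packages with a single command:
pip install kafka-python pytest pytest-kafka testcontainers kafka-testcontainer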
After installing the packages and libraries, you can create a simple Python project to test your code with Kafka. You can use any IDE or editor of your choice, such as PyCharm, VS Code, Sublime Text, etc. You can also use the command line or terminal to create and run your project.
To create your project, you need to create a folder and a file. You can name them anything you want, but for this tutorial, we will use the following names:
- my_kafka_project: This is the name of the folder that will contain your project files.
- test_kafka.py: This is the name of the file that will contain your test code.
To create the folder and the file, you can use the following commands:
mkdir my_kafka_project
cd my_kafka_project
touch test_kafka.py
Now, you have created your project folder and file. You can open them in your IDE or editor, and start writing your test code.
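Before writing any tests, you can optionally run a collection-only pass from inside the project folder to confirm that Pytest discovers your new file:
pytest --collect-only
At this point Pytest will report that it collected no tests, which is expected until you add your first test function.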
In the next section, you will learn how to write unit tests with Kafka Testing and Pytest.
4. How to Write Unit Tests with Kafka Testing and Pytest
Unit testing is a type of testing that verifies the functionality of a single unit of code, such as a function, a class, or a module. Unit testing helps you ensure that your code works as expected, meets the specifications, and does not contain any errors or bugs.
In this section, you will learn how to write unit tests with Kafka Testing and Pytest. You will use Kafka TestKit and Kafka TestContainers to create and manage Kafka clusters, topics, producers, and consumers in your tests. You will also use Pytest to write and run your tests in a simple and elegant way.
To write unit tests with Kafka Testing and Pytest, you need to follow these steps:
- Create a test function that contains your test logic and assertions.
- Use the @pytest.mark.kafka decorator to mark your test function as a Kafka test.
- Use the kafka_cluster fixture to create and access a Kafka cluster in your test function.
- Use the kafka_cluster.create_topic method to create a Kafka topic in your test function.
- Use the kafka_cluster.producer and kafka_cluster.consumer methods to create a Kafka producer and consumer in your test function.
- Use the producer.send and consumer.poll methods to send and receive messages from the Kafka topic in your test function.
- Use the assert statement to verify the behavior and output of your code in your test function.
- Run your test function with the pytest command.
Let’s see an example of how to write a unit test with Kafka Testing and Pytest. Suppose you have a Python function called process_message that takes a message as an input and returns a modified message as an output. For example, if the input message is “Hello”, the output message is “Hello, Kafka!”. You want to test this function with Kafka Testing and Pytest.
Here is how you can write a unit test for this function:
import pytest

# This is the function that you want to test
def process_message(message):
    return message + ", Kafka!"

# This is the test function that contains your test logic and assertions
@pytest.mark.kafka  # This decorator marks your test function as a Kafka test
def test_process_message(kafka_cluster):  # This parameter gives you access to the kafka_cluster fixture
    # This line creates a Kafka topic called "test-topic" with one partition and one replica
    kafka_cluster.create_topic("test-topic", num_partitions=1, replication_factor=1)
    # This line creates a Kafka producer that can send messages to the "test-topic"
    producer = kafka_cluster.producer()
    # This line creates a Kafka consumer that can receive messages from the "test-topic"
    consumer = kafka_cluster.consumer(topics=["test-topic"], auto_offset_reset="earliest")
    # These lines send the message "Hello" to the "test-topic" and flush the producer
    # so the message is actually delivered before the consumer polls
    producer.send("test-topic", value=b"Hello")
    producer.flush()
    # This line polls the consumer for messages from the "test-topic"
    messages = consumer.poll(timeout_ms=1000)
    # This line extracts the value of the first message from the messages dictionary
    message = list(messages.values())[0][0].value
    # This line calls the process_message function on the message and assigns the result to a variable
    result = process_message(message.decode())
    # This line asserts that the result is equal to "Hello, Kafka!"
    assert result == "Hello, Kafka!"
To run this test function, you can use the following command:
pytest test_kafka.py
If the test passes, you will see something like this:
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /home/user/my_kafka_project
plugins: kafka-0.3.0
collected 1 item
test_kafka.py . [100%]
============================== 1 passed in 5.67s ==============================
If the test fails, you will see something like this:
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /home/user/my_kafka_project
plugins: kafka-0.3.0
collected 1 item
test_kafka.py F [100%]
=================================== FAILURES ===================================
______________________________ test_process_message ______________________________
kafka_cluster =
    @pytest.mark.kafka  # This decorator marks your test function as a Kafka test
    def test_process_message(kafka_cluster):  # This parameter gives you access to the kafka_cluster fixture
        # This line creates a Kafka topic called "test-topic" with one partition and one replica
        kafka_cluster.create_topic("test-topic", num_partitions=1, replication_factor=1)
        # This line creates a Kafka producer that can send messages to the "test-topic"
        producer = kafka_cluster.producer()
        # This line creates a Kafka consumer that can receive messages from the "test-topic"
        consumer = kafka_cluster.consumer(topics=["test-topic"], auto_offset_reset="earliest")
        # These lines send the message "Hello" to the "test-topic" and flush the producer
        # so the message is actually delivered before the consumer polls
        producer.send("test-topic", value=b"Hello")
        producer.flush()
        # This line polls the consumer for messages from the "test-topic"
        messages = consumer.poll(timeout_ms=1000)
        # This line extracts the value of the first message from the messages dictionary
        message = list(messages.values())[0][0].value
        # This line calls the process_message function on the message and assigns the result to a variable
        result = process_message(message.decode())
        # This line asserts that the result is equal to "Hello, Kafka!"
>       assert result == "Hello, Kafka!"
E       AssertionError: assert 'Hello, Kafka!!' == 'Hello, Kafka!'
E         - Hello, Kafka!!
E         ?              -
E         + Hello, Kafka!

test_kafka.py:25: AssertionError
=========================== short test summary info ============================
FAILED test_kafka.py::test_process_message - AssertionError: assert 'Hello, Ka...
============================== 1 failed in 5.68s ===============================
Congratulations! You have learned how to write a unit test with Kafka Testing and Pytest. You can use this technique to test any function or module that interacts with Kafka.
In the next section, you will learn how to write integration tests with Kafka Testing and Pytest.
5. How to Write Integration Tests with Kafka Testing and Pytest
Integration testing is a type of testing that verifies how different components of your code work together. It helps you ensure that your code can interact with other systems, such as Kafka, and produce the expected results. Integration testing also helps you detect any errors or bugs that may occur when your code is integrated with other components.
In this section, you will learn how to write integration tests with Kafka Testing and Pytest. Building on the setup from the previous sections, you will test a Python script that produces and consumes data from a Kafka topic. You will write integration tests that check the following aspects of your code:
- Whether your code can connect to a Kafka cluster and create a topic
- Whether your code can produce data to the topic and consume data from the topic
- Whether your code can handle different types of data, such as strings, numbers, and JSON objects
- Whether your code can handle errors and exceptions, such as invalid data, missing topic, or connection failure
To write integration tests with Kafka Testing and Pytest, you need to follow these steps:
- Import the required modules and libraries, such as Kafka TestKit, Kafka TestContainers, Pytest, and your code
- Create a fixture that sets up and tears down a Kafka cluster using Kafka TestContainers
- Create a fixture that creates and deletes a Kafka topic using Kafka TestKit
- Write test functions that use the fixtures and assert the expected behavior and output of your code
- Run the test functions using Pytest and check the test results
Let’s see how to implement these steps in detail.
Step 1: Import the required modules and libraries
The first step is to import the required modules and libraries that you will use for writing and running your integration tests. You need to import the following:
- Kafka TestKit: This is the library that provides you with various classes and methods to create and manage Kafka clusters, topics, producers, and consumers in your test environment. You need to import the KafkaProducer and KafkaConsumer classes, which are the wrappers for the official Kafka clients. You also need to import the assert_produced and assert_consumed methods, which are the assertions that verify the behavior and output of your code.
- Kafka TestContainers: This is the library that allows you to run Kafka clusters in Docker containers. You need to import the KafkaContainer class, which is the wrapper for the TestContainers framework. You also need to import the Network and NetworkAlias classes, which are the helpers for creating and managing Docker networks.
- Pytest: This is the testing framework that you will use to write and run your test functions. You need to import the pytest module, which provides you with various features and functionalities, such as fixtures, assertions, and test runners.
- Your code: This is the Python script that you wrote in the previous section, which produces and consumes data from a Kafka topic. You need to import the produce_data and consume_data functions, which are the main functions of your code.
The following code snippet shows how to import the required modules and libraries:
# Import Kafka TestKit
from kafka_testkit.producer import KafkaProducer
from kafka_testkit.consumer import KafkaConsumer
from kafka_testkit.assertions import assert_produced, assert_consumed
# Import Kafka TestContainers
from testcontainers.kafka import KafkaContainer
from testcontainers.core.container import DockerContainer
from testcontainers.core.network import Network, NetworkAlias
# Import Pytest
import pytest
# Import your code
from my_code import produce_data, consume_data
Step 2: Create a fixture that sets up and tears down a Kafka cluster using Kafka TestContainers
The next step is to create a fixture that sets up and tears down a Kafka cluster using Kafka TestContainers. A fixture is a function that prepares some common resource or setting that your tests need, and cleans it up afterwards. Pytest allows you to create and use fixtures with the @pytest.fixture decorator.
In this case, you need to create a fixture that does the following:
- Creates a Docker network and assigns a network alias to it
- Creates a Kafka container and attaches it to the network
- Starts the Kafka container and waits for it to be ready
- Returns the Kafka container object and the network alias
- Stops and removes the Kafka container and the network after the tests are done
The following code snippet shows how to create a fixture that sets up and tears down a Kafka cluster using Kafka TestContainers:
# Create a fixture that sets up and tears down a Kafka cluster using Kafka TestContainers
@pytest.fixture(scope="module")
def kafka_cluster():
    # Create a Docker network and assign a network alias to it
    network = Network()
    network_alias = NetworkAlias("kafka-network")
    # Create a Kafka container and attach it to the network
    kafka_container = KafkaContainer().with_network(network).with_network_alias(network_alias)
    # Start the Kafka container and wait for it to be ready
    kafka_container.start()
    kafka_container.wait_for_log_message("started")
    # Return the Kafka container object and the network alias
    yield kafka_container, network_alias
    # Stop and remove the Kafka container and the network after the tests are done
    kafka_container.stop()
    network.close()
Step 3: Create a fixture that creates and deletes a Kafka topic using Kafka TestKit
The next step is to create a fixture that creates and deletes a Kafka topic using Kafka TestKit. You need to create a fixture that does the following:
- Takes the Kafka container object and the network alias from the previous fixture as parameters
- Creates a Kafka topic with a given name and number of partitions using the KafkaProducer class
- Returns the topic name and the Kafka bootstrap servers
- Deletes the Kafka topic using the KafkaProducer class after the tests are done
The following code snippet shows how to create a fixture that creates and deletes a Kafka topic using Kafka TestKit:
# Create a fixture that creates and deletes a Kafka topic using Kafka TestKit
@pytest.fixture(scope="module")
def kafka_topic(kafka_cluster):
    # Take the Kafka container object and the network alias from the previous fixture
    kafka_container, network_alias = kafka_cluster
    # Create a Kafka topic with a given name and number of partitions using the KafkaProducer class
    topic_name = "test-topic"
    num_partitions = 3
    producer = KafkaProducer(bootstrap_servers=kafka_container.get_bootstrap_servers())
    producer.create_topic(topic_name, num_partitions)
    # Return the topic name and the Kafka bootstrap servers
    yield topic_name, kafka_container.get_bootstrap_servers()
    # Delete the Kafka topic using the KafkaProducer class after the tests are done
    producer.delete_topic(topic_name)
Step 4: Write test functions that use the fixtures and assert the expected behavior and output of your code
The next step is to write test functions that use the fixtures and assert the expected behavior and output of your code. A test function is a function that runs a specific test case and verifies the result. Pytest also lets you run the same test function with several sets of inputs using the @pytest.mark.parametrize decorator, which passes different parameters and values to the test function.
In this case, you need to write test functions that do the following:
- Take the topic name and the Kafka bootstrap servers from the previous fixture as parameters
- Take the data type, the data value, and the expected output as parameters
- Call the produce_data and consume_data functions from your code with the given parameters
- Use the assert_produced and assert_consumed methods from Kafka TestKit to verify the behavior and output of your code
- Use the pytest.raises context manager to check for any errors or exceptions that may occur (a sketch of such a test appears after the code below)
The following code snippet shows how to write test functions that use the fixtures and assert the expected behavior and output of your code:
# Write test functions that use the fixtures and assert the expected behavior and output of your code
# Test the case when the data type is string
@pytest.mark.parametrize("data_type, data_value, expected_output", [
    ("string", "Hello, world!", "Hello, world!"),
    ("string", "", "Empty string"),
    ("string", None, "Invalid data")
])
def test_produce_and_consume_string(kafka_topic, data_type, data_value, expected_output):
    # Unpack the topic name and bootstrap servers from the kafka_topic fixture
    topic_name, bootstrap_servers = kafka_topic
    # Produce the value to the topic and consume it back (this assumes produce_data and
    # consume_data take the topic name, the bootstrap servers, and the value;
    # adjust the arguments to match your own function signatures)
    produce_data(topic_name, bootstrap_servers, data_value)
    result = consume_data(topic_name, bootstrap_servers)
    # Verify that the consumed result matches the expected output
    assert result == expected_output
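The steps above also mention using the pytest.raises context manager to check for errors. As a rough sketch, the test below assumes that produce_data raises kafka.errors.NoBrokersAvailable (from the kafka-python package) when it cannot reach the cluster; substitute whatever exception your implementation actually raises:
from kafka.errors import NoBrokersAvailable

# Test that a connection failure surfaces as an exception.
# NoBrokersAvailable is an assumption about how produce_data reports the failure;
# replace it with the exception type your code actually raises.
def test_produce_with_unreachable_cluster(kafka_topic):
    topic_name, _ = kafka_topic
    with pytest.raises(NoBrokersAvailable):
        # "localhost:1" is an address where no broker is listening
        produce_data(topic_name, "localhost:1", "Hello, world!")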
6. How to Write Load Tests with Kafka Testing and Locust
Load testing is a type of testing that measures how your code performs under high load and stress conditions. It helps you ensure that your code can handle large volumes of data and requests, and maintain its functionality and performance. Load testing also helps you identify any bottlenecks or issues that may affect your code’s scalability and reliability.
In this section, you will learn how to write load tests with Kafka Testing and Locust. You will use the same code example from the previous sections, where you wrote a Python script that produces and consumes data from a Kafka topic. You will write load tests that check the following aspects of your code:
- Whether your code can produce and consume data from the topic at a high rate and frequency
- Whether your code can handle concurrent users and requests without losing or duplicating data
- Whether your code can maintain its response time and throughput under high load and stress
- Whether your code can recover from failures and errors, such as network issues, broker crashes, or partition rebalances
To write load tests with Kafka Testing and Locust, you need to follow these steps:
- Import the required modules and libraries, such as Kafka TestKit, Kafka TestContainers, Locust, and your code
- Create a fixture that sets up and tears down a Kafka cluster using Kafka TestContainers
- Create a fixture that creates and deletes a Kafka topic using Kafka TestKit
- Define a Locust task that calls the produce_data and consume_data functions from your code with random data
- Define a Locust user class that inherits from the KafkaUser class and executes the task
- Run the Locust user class using the Locust command-line interface and check the load test results (a sketch of these final steps appears after the import snippet below)
Let’s see how to implement these steps in detail.
Step 1: Import the required modules and libraries
The first step is to import the required modules and libraries that you will use for writing and running your load tests. You need to import the following:
- Kafka TestKit: This is the library that provides you with various classes and methods to create and manage Kafka clusters, topics, producers, and consumers in your test environment. You need to import the KafkaProducer and KafkaConsumer classes, which are the wrappers for the official Kafka clients.
- Kafka TestContainers: This is the library that allows you to run Kafka clusters in Docker containers. You need to import the KafkaContainer class, which is the wrapper for the TestContainers framework. You also need to import the Network and NetworkAlias classes, which are the helpers for creating and managing Docker networks.
- Locust: This is the load testing framework that you will use to simulate high load and stress conditions on your code. You need to import the locust module, which provides you with various features and functionalities, such as tasks, users, events, and runners. You also need to import the KafkaUser class, which is a custom user class that inherits from the locust.User class and integrates with Kafka TestKit.
- Your code: This is the Python script that you wrote in the previous sections, which produces and consumes data from a Kafka topic. You need to import the produce_data and consume_data functions, which are the main functions of your code.
The following code snippet shows how to import the required modules and libraries:
# Import Kafka TestKit
from kafka_testkit.producer import KafkaProducer
from kafka_testkit.consumer import KafkaConsumer
# Import Kafka TestContainers
from testcontainers.kafka import KafkaContainer
from testcontainers.core.container import DockerContainer
from testcontainers.core.network import Network, NetworkAlias
# Import Locust
import locust
from locust_kafka import KafkaUser
# Import your code
from my_code import produce_data, consume_data
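The remaining steps define the Locust task and user class and run them from the command line. The snippet below is only a sketch: the KafkaUser base class comes from the locust_kafka helper imported above, and the bootstrap_servers attribute as well as the produce_data and consume_data call signatures are assumptions that you should adapt to your own code.
import random
from locust import task, between

# Define a Locust user class that inherits from KafkaUser and executes one task
class KafkaLoadUser(KafkaUser):
    # Wait between 100 and 500 milliseconds between task executions
    wait_time = between(0.1, 0.5)

    @task
    def produce_and_consume(self):
        # Send a random payload to the test topic and read it back
        payload = f"message-{random.randint(0, 1_000_000)}"
        produce_data("test-topic", self.bootstrap_servers, payload)
        consume_data("test-topic", self.bootstrap_servers)
You can then run the load test from the Locust command-line interface, for example with 50 simulated users spawned at 10 per second for one minute (this assumes the code lives in a file called test_load.py and that you have Locust 1.0 or newer, which supports the --headless flag):
locust -f test_load.py --headless -u 50 -r 10 --run-time 1m
When the run finishes, Locust prints a summary of request counts, failures, response times, and throughput that you can use to evaluate how your code behaves under load.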
7. How to Analyze Test Results and Improve Your Code
After you have written and run your tests with Kafka Testing, you need to analyze the test results and improve your code accordingly. In this section, you will learn how to do that.
First, you need to check the test output and see if your tests passed or failed. You can use the Pytest command-line options to customize the output format and verbosity. For example, you can use the -v option to show the test names and statuses, the -rA option to show a summary of all tests with the reasons for failures or skips, and the --cov option (provided by the pytest-cov plugin) to show a code coverage report.
Here is an example of a test output with these options:
$ pytest -v -rA --cov
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /home/user/kafka-testing
plugins: cov-2.12.1
collected 9 items
test_unit.py::test_produce_message PASSED [ 11%]
test_unit.py::test_consume_message PASSED [ 22%]
test_unit.py::test_produce_and_consume_message PASSED [ 33%]
test_integration.py::test_produce_message_to_topic PASSED [ 44%]
test_integration.py::test_consume_message_from_topic PASSED [ 55%]
test_integration.py::test_produce_and_consume_message_from_topic PASSED [ 66%]
test_load.py::test_load_producer PASSED [ 77%]
test_load.py::test_load_consumer PASSED [ 88%]
test_load.py::test_load_producer_and_consumer PASSED [100%]
=============================== short test summary info ===============================
PASSED test_unit.py::test_produce_message
PASSED test_unit.py::test_consume_message
PASSED test_unit.py::test_produce_and_consume_message
PASSED test_integration.py::test_produce_message_to_topic
PASSED test_integration.py::test_consume_message_from_topic
PASSED test_integration.py::test_produce_and_consume_message_from_topic
PASSED test_load.py::test_load_producer
PASSED test_load.py::test_load_consumer
PASSED test_load.py::test_load_producer_and_consumer
========================= 9 passed in 35.67s ==========================
---------- coverage: platform linux, python 3.8.5-final-0 -----------
Name                  Stmts   Miss  Cover
-----------------------------------------
my_code.py               20      0   100%
test_integration.py      30      0   100%
test_load.py             36      0   100%
test_unit.py             24      0   100%
-----------------------------------------
TOTAL                   110      0   100%
As you can see, all the tests passed and the code coverage was 100%, meaning every line of your code was executed by at least one test. That is a good sign, but it does not mean that your code is perfect or cannot be improved. You should always review your code and look for ways to make it more readable, maintainable, and efficient.
Some of the things you can do to improve your code are:
- Follow the Python style guide (PEP 8) and use a code formatter, such as Black, to make your code consistent and easy to read.
- Use docstrings and comments to document your code and explain its purpose and logic.
- Use descriptive and meaningful names for your variables, functions, and classes.
- Refactor your code and avoid duplication, complexity, and unnecessary code.
- Use logging and a debugger, such as Python's built-in logging module and the pdb or PyCharm debugger, to monitor and troubleshoot your code.
- Use code analysis tools, such as PyLint, to check your code for errors, warnings, and best practices.
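For example, you could format and lint your test file from inside your project folder after installing the tools with Pip:
pip install black pylint
black test_kafka.py
pylint test_kafka.py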
By following these steps, you can improve your code quality and functionality, and make it easier to test and maintain.
Congratulations! You have learned how to use Kafka Testing with Python to test your code. You have also learned how to analyze the test results and improve your code. You have gained valuable skills and knowledge that will help you develop better applications with Kafka and Python.
In the next and final section, you will review what you have learned in this blog and get some resources for further learning.
8. Conclusion
You have reached the end of this blog on how to use Kafka Testing with Python to test your code. In this blog, you have learned:
- What is Kafka Testing and why use it
- How to set up Kafka Testing with Python
- How to write unit tests with Kafka Testing and Pytest
- How to write integration tests with Kafka Testing and Pytest
- How to write load tests with Kafka Testing and Locust
- How to analyze test results and improve your code
By following this blog, you have gained valuable skills and knowledge that will help you develop better applications with Kafka and Python. You have also learned how to use different techniques and methods, such as unit testing, integration testing, and load testing, to verify your code quality and functionality.
Testing your code is an essential part of software development. It helps you ensure that your code works as expected, meets the requirements, and does not contain any bugs or errors. Testing your code also helps you improve its quality, performance, and functionality.
Kafka Testing is a set of testing tools and frameworks that allows you to test your code with Kafka. It enables you to create and manage Kafka clusters, topics, producers, and consumers in your test environment. It also provides you with various methods and assertions to verify the behavior and output of your code.
Kafka Testing is compatible with multiple testing frameworks and languages, such as Pytest for Python, and JUnit, TestNG, and ScalaTest for Java and Scala. You can use the testing framework and language of your choice to write and run your tests.
We hope you enjoyed this blog and found it useful. If you have any questions or feedback, please feel free to leave a comment below. We would love to hear from you.
Thank you for reading this blog and happy testing!