Performance Testing using Docker/Container

It's a common misconception that performance testing with Docker is a lengthy and tiring process. If I asked you to set up JMeter for performance testing on one virtual machine (VM), how long would it take you? And if you needed to set up the same on multiple virtual machines, how much longer would it take? It's widely agreed that this task is tedious and time-consuming.

Once you've decided to use JMeter as your performance testing tool, the next step is to set up the JMeter testing environment, which isn't very difficult but can be cumbersome. These are the steps typically followed to set up a JMeter performance testing environment:

  1. Procuring the Machine (Master/Slave)
  2. Downloading JDK
  3. Installing JDK
  4. Setting up the environment variable of JDK
  5. Downloading & extracting the JMeter package
  6. Downloading & setting up the required plugins
  7. Setting up the environment variable of JMeter

If you follow this sequence, it will probably take almost an hour per machine. And if you encounter any issues while setting it up, it will take even longer. But here's a hack that can make your life easier and save you plenty of time. It might sound unbelievable, but you can complete this task in just five minutes for any geolocation with just a few clicks.

Yes, you read that right! There are many ways to accomplish this, and one of them is a Docker/container-based JMeter solution.

What do you need?

(The following list is based on Azure cloud)

  • Docker Desktop software: to build and push the Docker image.
  • Azure CLI software: for connecting to the Azure cloud from the local machine where we build the image.
  • AzCopy software: to download and upload files from Azure storage, such as .jmx, .jtl, and .log files.
  • Azure Container Registry: to store the image in a ready-to-use condition.
  • Azure Container Instance: to run the image or the test.
  • InfluxDB: for live monitoring of the load test execution (additional step).
  • Grafana Dashboard: for live monitoring of the load test execution (additional step).

High-Level Architecture

Let's get started

  1. Starting with the easiest one, we are going to create the folder structure in cloud storage.

    We will create three folders in a file share under Azure cloud storage.

    1. TestScript – To keep all .jmx files
    2. ConfigFiles – To keep the shell scripts, configuration files, and TestData.csv files
    3. TestResult – Here, we will keep all .jtl and jmeter.log files
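The same folder structure can be created from the command line instead of the portal. This is a sketch using the Azure CLI; the storage account and share names are placeholders, not the article's actual resources:

```shell
# Sketch: create the file share and the three folders with the Azure CLI.
# 'mystorageacct' and 'jmeter-share' are placeholder names.
az storage share create --name jmeter-share --account-name mystorageacct

for dir in TestScript ConfigFiles TestResult; do
  az storage directory create --share-name jmeter-share \
    --name "$dir" --account-name mystorageacct
done
```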
  2. Now, we will create an Azure Container Registry, which will act as the repository to keep Docker images.
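If you prefer the CLI for this step too, a minimal sketch (resource group and registry names are placeholders):

```shell
# Sketch: create the container registry and log Docker in to it.
az acr create --resource-group my-rg --name myjmeterregistry --sku Basic
az acr login --name myjmeterregistry
```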
  3. We are almost done with the Azure portal setup except for setting up the "Azure Container Instance," which we will do once we complete the Docker file.

    Let's create the Dockerfile, which will contain instructions to download and install the following apps/software:

    • Download and install the Ubuntu OS
    • Download and unzip JMeter
    • Download and install the JMeter Plugin Manager
    • Download and install JMeter plugins
    • Download and unzip AZCopy
    • Download the launcher shell script from Azure storage (we will talk about it in a few minutes)
    • Execute that launcher shell script
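Putting those instructions together, the Dockerfile might look like the sketch below. The base image tag, JMeter version, download URLs, the environment variable, and the launcher script name (`entrypoint.sh`) are all assumptions for illustration, not the article's actual file:

```dockerfile
# Sketch Dockerfile -- versions, URLs, and the launcher name are illustrative.
FROM ubuntu:22.04

ARG JMETER_VERSION=5.6.3
ENV JMETER_HOME=/opt/apache-jmeter-${JMETER_VERSION}
ENV PATH=${JMETER_HOME}/bin:${PATH}

# JRE and basic tooling
RUN apt-get update && apt-get install -y --no-install-recommends \
        openjdk-17-jre-headless curl ca-certificates && \
    rm -rf /var/lib/apt/lists/*

# JMeter plus the Plugins Manager (individual plugins go through it)
RUN curl -fsSL "https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-${JMETER_VERSION}.tgz" \
      | tar -xz -C /opt && \
    curl -fsSL -o ${JMETER_HOME}/lib/ext/jmeter-plugins-manager.jar "https://jmeter-plugins.org/get/"

# AzCopy (the tarball contains a versioned folder with the binary inside)
RUN curl -fsSL "https://aka.ms/downloadazcopy-v10-linux" | tar -xz -C /tmp && \
    mv /tmp/azcopy_linux_amd64_*/azcopy /usr/local/bin/

# At start-up, fetch the launcher script from Azure storage and run it.
# LAUNCHER_SAS_URL is a hypothetical variable -- supply your own SAS URL.
CMD ["/bin/bash", "-c", "azcopy copy \"$LAUNCHER_SAS_URL\" /opt/entrypoint.sh && chmod +x /opt/entrypoint.sh && /opt/entrypoint.sh"]
```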
  4. Once the Dockerfile is ready, we can build it using the Docker build command and push it to the Azure Container Registry using the Docker push command. Now, our Docker image is ready to use. However, so far, we have only created the JMeter infrastructure, and it does not have any code or instructions to download the script, run the test, and most importantly, save the test results.
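The build-and-push step looks roughly like this; the registry, image name, and tag are placeholders:

```shell
# Sketch: build the image locally, then push it to the Azure Container Registry.
docker build -t myjmeterregistry.azurecr.io/jmeter-runner:1.0 .
az acr login --name myjmeterregistry
docker push myjmeterregistry.azurecr.io/jmeter-runner:1.0
```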

    The reason the Dockerfile contains no instructions to fetch the test script, run the test, or save the results is that there is no benefit to hardcoding them into the Docker image. The setup should work dynamically: one particular folder where we simply upload the test script, and another particular folder where it keeps saving all the result files. To achieve all of this, I am going to create one more .sh file.


  5. The Docker image will call the first shell script, an executable file downloaded from Azure storage. That script will then download and call the second shell script, which contains all the instructions to download the testing artifacts, execute the test, and upload the result files back to cloud storage.

    The reason for creating two .sh files is that the first one is directly mapped into and baked onto the image, which means any change to it requires rebuilding the image. The second one, which the first script downloads and calls, can be changed at any time and simply uploaded to cloud storage, making the setup 100% dynamic.

    (Screenshot in the original article: content of the master script)
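The screenshot is not reproduced here, but a runner script of this kind typically looks like the sketch below. The storage URL, SAS token, and file names are placeholder assumptions, not the article's actual values:

```shell
#!/usr/bin/env bash
# Sketch of the dynamic runner script kept in ConfigFiles.
# Storage URL, SAS token, and file names are placeholders.
set -euo pipefail

SHARE="https://mystorageacct.file.core.windows.net/jmeter-share"
SAS="?<sas-token>"   # placeholder -- substitute a real SAS token

# 1. Pull the test artifacts from cloud storage
azcopy copy "${SHARE}/TestScript/test.jmx${SAS}" /opt/test.jmx
azcopy copy "${SHARE}/ConfigFiles/TestData.csv${SAS}" /opt/TestData.csv

# 2. Run the test in non-GUI mode
jmeter -n -t /opt/test.jmx -l /opt/result.jtl -j /opt/jmeter.log

# 3. Push the results back to the TestResult folder
STAMP=$(date +%Y%m%d%H%M)
azcopy copy /opt/result.jtl "${SHARE}/TestResult/result-${STAMP}.jtl${SAS}"
azcopy copy /opt/jmeter.log "${SHARE}/TestResult/jmeter-${STAMP}.log${SAS}"
```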

  6. As most of our setup is ready, let's create an Azure Container instance and run our test. Here are the steps to follow:
    • Go to the Azure portal
    • Create a new resource
    • Search for "Container instance"
    • Select your Container instance
    • Click on "Create"
    • Provide your subscription and resource group name
    • Give a name to your Container
    • Select the geolocation where you want to deploy your image or from where you want to run the test
    • Choose "Azure Container registry" as your image source
    • Select the image you pushed in step #4
    • Select the size, which will allow you to choose the CPU and memory configuration
    • Go to the networking tab and allow port 8086 to communicate with InfluxDB if you have already set it up with the default port
    • Validate and create the Container instance
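The portal steps above can also be condensed into a single Azure CLI call. A sketch, with the resource group, region, image, and credentials as placeholders (outbound traffic to InfluxDB on port 8086 normally does not require opening an inbound port unless you deploy into a restricted virtual network):

```shell
# Sketch: create and run the container instance from the CLI.
# All names and credential variables are placeholders.
az container create \
  --resource-group my-rg \
  --name jmeter-runner-eastus \
  --location eastus \
  --image myjmeterregistry.azurecr.io/jmeter-runner:1.0 \
  --registry-login-server myjmeterregistry.azurecr.io \
  --registry-username "$ACR_USER" \
  --registry-password "$ACR_PASS" \
  --cpu 2 --memory 4 \
  --restart-policy Never
```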
  7. After creation, the Container instance will run automatically for the first time and terminate on its own once all activities are done as per the instructions. For future runs, simply start the container again and it will take care of everything; you only need to upload a fresh script to run another test from the same geolocation. If you want to run the test from a different geolocation, you will need to create a new Container instance in that region.

    You can visit the Container page of your Azure Container instance and view the current status, events, properties, and logs.
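Once a .jtl lands in the TestResult folder, you can sanity-check it locally before digging into dashboards. A minimal sketch, assuming JMeter's default CSV output where the `success` flag is the eighth column (the sample rows below are made up for illustration):

```shell
# Minimal sanity check of a downloaded .jtl (CSV) result file.
# Assumes JMeter's default CSV layout with a header row and
# 'success' in column 8 -- adjust the index to your jmeter.properties.
cat > sample.jtl <<'EOF'
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,failureMessage,bytes,sentBytes,grpThreads,allThreads,URL,Latency,IdleTime,Connect
1700000000000,120,Home,200,OK,tg 1-1,text,true,,5120,310,1,1,http://example.com/,100,0,20
1700000000200,340,Login,500,Error,tg 1-1,text,false,,1024,290,1,1,http://example.com/login,300,0,15
1700000000500,95,Home,200,OK,tg 1-2,text,true,,5120,310,2,2,http://example.com/,80,0,10
EOF

awk -F',' 'NR>1 { total++; if ($8=="false") errors++ }
           END { printf "samples=%d errors=%d\n", total, errors+0 }' sample.jtl
# prints: samples=3 errors=1
```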

    (Screenshots in the original article: my Azure setup for this article, as seen in the Azure portal)



Assuming that you are already aware of live monitoring using InfluxDB and Grafana and have the setup ready, you just need to add the backend listener in the test script with the correct hostname and port.
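If the Backend Listener's connection settings are parameterized with `__P()` functions inside the .jmx, the hostname and port can even be supplied at run time without editing the script. A sketch, where the InfluxDB host and database name are assumptions:

```shell
# Sketch: override a parameterized InfluxDB Backend Listener at run time.
# Assumes influxdbUrl is set to ${__P(influxdbUrl)} inside the .jmx file;
# the host 10.0.0.5 and database 'jmeter' are placeholder values.
jmeter -n -t /opt/test.jmx -l /opt/result.jtl \
  -JinfluxdbUrl="http://10.0.0.5:8086/write?db=jmeter"
```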


Using the Docker/container approach removes nearly all of the manual effort required to set up the infrastructure, letting you complete the task within five minutes. It also avoids the issues that can arise during infrastructure setup, and the test can be run from any geolocation offered by the cloud provider. Once you create your Dockerfile, it can be reused with any other cloud provider with minimal changes, as all major cloud providers offer similar features.