

Conduct load tests with JMeter on Kubernetes

Apache JMeter and other load-testing tools can be used with Kubernetes to conduct stress tests to see how well an app performs in specific scenarios.

Developers should always be thinking about how an application will perform when launched. There's a lot to test and to consider, from whether the app uses the correct amount of CPU and memory to how much load it can handle.

Your team should ensure an application that's being deployed in Kubernetes stays up. Pods should not constantly be exiting and needing to self-heal; this creates downtime.

Once an application is running, the primary concerns become whether bugs cause the app to go down and whether it can handle the load. For example, would an e-commerce site stay up during the rush of Black Friday and Cyber Monday? Has the ops team tested this scenario? How would they fix the app if it does go down?

These questions remain unanswered until the team benchmarks its application, which typically involves load testing. To determine whether an application can withstand a large number of simultaneous users, load-testing tools create virtual users that exercise the application and measure its durability.

Load testing is usually done by QA teams or software engineers who want to test the quality of the application. Let's look at how Apache JMeter can assist with load testing.

Where JMeter comes into play

JMeter can be used to measure and analyze the performance of an application or a container running inside of Kubernetes.

When using JMeter, load tests can be conducted from within the UI to, say, test the performance of a webpage. You can then retrieve results in the form of graphs to see how the application performs.

Let's discuss how to use JMeter in a real Kubernetes environment to see how load testing looks in production.

If you would like to follow along with this tutorial in a hands-on fashion, you need the following:

  • knowledge of Kubernetes.
  • a Kubernetes cluster that's running anywhere -- locally, on premises or in the cloud; a quick connectivity check follows this list.
  • JMeter, which can be downloaded from the Apache JMeter website.
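
Before deploying anything, it's worth confirming that kubectl can reach the cluster. A quick check -- assuming kubectl is already configured with your cluster's context -- looks like this:

kubectl cluster-info
kubectl get nodes

Both commands should return without errors, and each node should report a Ready status.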

To load-test an application on Kubernetes, you need a containerized application deployed. To keep things simple, you can deploy an Nginx web app that is running in a Kubernetes deployment resource along with a Kubernetes service that enables you to expose the web app without needing a load balancer.

Run the code below to deploy the Nginx web app, which consists of the following:

  • a Kubernetes deployment.
  • a Kubernetes service.
  • two replicas.
  • the ability to be reached over port 80.
kubectl apply -f - <<EOF
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx-deployment
spec:
  selector:
    matchLabels:
      app: nginxdeployment
  replicas: 2
  template:
    metadata:
      labels:
        app: nginxdeployment
    spec:
      containers:
      - name: nginxdeployment
        image: nginx:latest
        ports:
        - containerPort: 80
---
apiVersion: v1
kind: Service
metadata:
  name: nginxservice
spec:
  selector:
    app: nginxdeployment
  ports:
  - protocol: TCP
    port: 80
  type: NodePort
EOF

Once deployed, confirm the Kubernetes deployment by running the following:

kubectl get deployment

An output like the one below should appear:

NAME                 READY      UP-TO-DATE      AVAILABLE       AGE
nginx-deployment     2/2        2               2               35s
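
Because the deployment's selector matches the label app: nginxdeployment, you can also list the pods behind it to confirm that both replicas are running:

kubectl get pods -l app=nginxdeployment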

Then, confirm the Kubernetes service by running the following:

kubectl get svc

This shows an output like the one below:

NAME           TYPE       CLUSTER-IP    EXTERNAL-IP   PORT(S)
nginxservice   NodePort   10.0.54.249   <none>        80:30836/TCP

Expose the application so it can be tested like any other web app on the localhost by running the following to conduct port forwarding:

kubectl port-forward svc/nginxservice :80

An output like the one below should appear:

Forwarding from 127.0.0.1:random_port -> 80
Forwarding from [::1]:random_port -> 80
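
Leaving the local port empty, as above, lets kubectl choose a random local port. If you'd rather have a predictable address to plug into JMeter, you can pin the local port instead -- for example, forwarding local port 8080 to the service's port 80:

kubectl port-forward svc/nginxservice 8080:80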

Testing an app with JMeter

Depending on where JMeter will run, the installation varies; the prerequisites section notes where to download it. For example, macOS users can install JMeter with Homebrew, while Windows users need the Java SE Development Kit installed before running the JMeter binaries.

brew install jmeter

Run JMeter by typing jmeter in the terminal.
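
For example, assuming the JMeter bin directory is on your PATH, you can check the installed version and then launch the tool:

jmeter --version
jmeter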

A GUI should open in the background, typically behind the terminal window. That is the JMeter GUI.

Create a new test plan called nginxtest.

Right-click the nginxtest plan, and follow the path: Add > Threads (Users) > Thread group.

The thread group executes the load test. It lets you set the number of virtual users to simulate, so you can see whether the application withstands a specific quantity of users. In this case, you can enter 100 for both values, which results in 100 requests.

Next, right-click the thread group, and follow the path: Add > Config Element > HTTP Request Defaults.

This is where the IP address and port are typed in. Since the port is forwarding from a Kubernetes cluster to the localhost, the IP address is 127.0.0.1, and the port should match the port number from the kubectl port-forward command.
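
Before building out the rest of the plan, a quick sanity check from another terminal confirms the forwarded endpoint responds; substitute the local port that kubectl printed:

curl -I http://127.0.0.1:<local_port>/

A 200 OK response from Nginx means JMeter will be able to reach the app.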

Once complete, add a new HTTP request by right-clicking the thread group and following the path: Add > Sampler > HTTP Request.

The HTTP request is where you specify the protocol and path. In this case, the protocol is HTTP, and the path is root (/).

The last step for the configuration is to add a new graph by right-clicking the test plan and following the path: Add > Listener > Graph Results.

Once complete, click the save button -- the floppy disk icon -- and then the green start button, which resembles a play button.

After this, requests should appear in the terminal where the port forward is running.

After about one minute, JMeter should display a graph filled with data.
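
The GUI is handy for building and debugging a test plan, but JMeter also supports a non-GUI mode that's better suited to running larger load tests. A minimal sketch, assuming the plan was saved as nginxtest.jmx in the current directory:

jmeter -n -t nginxtest.jmx -l results.jtl

The -n flag runs JMeter without the GUI, -t points to the test plan, and -l writes the raw results to a file that can later be loaded into a listener such as Graph Results.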

Michael Levan is a cloud enthusiast, DevOps pro and HashiCorp Ambassador. He speaks internationally, blogs, publishes books, creates online courses on various IT topics and makes real-world, project-focused content to coach engineers on how to create quality work.
