[Dev Tip] Stress test with Apache JMeter

Performance is a critical aspect of most applications. Everything may be pretty, well developed, and intuitive, but if performance is not up to par the user experience is going to suck.

Unfortunately, most of these performance problems only show themselves after some time in production, making them much more cumbersome to diagnose and solve.

In this post I’m going to talk about something that should, IMHO, be integrated in the development lifecycle before the application goes live: load tests.

I typically talk separately about performance testing and load testing. For me, performance testing is something you would do with a profiler like dotTrace or ANTS: you run your application in the context of a user, trigger some functionality, take some snapshots, and then analyze the results. Typically a profiler helps you understand where in the code most time is being spent.

Load-testing is a different beast. You’re testing the application’s ability to keep handling requests at an acceptable rate as the load increases.

With a good load testing strategy, and taking into account the target hardware, you’ll be able to estimate the capacity of your solution in terms of simultaneous users/requests. Also, and very importantly, you may understand whether your solution scales well or not. This is of extreme importance because nowadays it’s often cheaper (and less risky) to buy a new server than to pay your developers to refactor/optimize the solution.

With that in mind, in this series of posts I’m going to show some tools that I’ve used for load testing in the context of ASP.NET MVC, starting with a really basic one called ApacheBench and then evolving into a much more powerful tool called JMeter.

ApacheBench

For simpler scenarios nothing beats ApacheBench. It’s a single executable that can fire concurrent HTTP requests at a specific URL, displaying some really simple metrics.

It’s included with Apache, but it’s not very practical to install Apache just to get this small executable. You can follow the tips here to get the file.

Now, let’s create our sample website. Nothing here will be ASP.NET MVC specific, so use whatever floats your boat. I’m going to use the empty ASP.NET MVC template and create a really simple Home controller with an Index action that returns a view.

After extracting “ab.exe”, let’s run it from a command prompt. Its usage is:

Usage: ab [options] [http://]hostname[:port]/path
Options are:
    -n requests     Number of requests to perform
    -c concurrency  Number of multiple requests to make
    -t timelimit    Seconds to max. wait for responses
    -b windowsize   Size of TCP send/receive buffer, in bytes
    -p postfile     File containing data to POST. Remember also to set -T
    -u putfile      File containing data to PUT. Remember also to set -T
    -T content-type Content-type header for POSTing, eg.
                    Default is 'text/plain'
    -v verbosity    How much troubleshooting info to print
    -w              Print out results in HTML tables
    -i              Use HEAD instead of GET
    -x attributes   String to insert as table attributes
    -y attributes   String to insert as tr attributes
    -z attributes   String to insert as td or th attributes
    -C attribute    Add cookie, eg. 'Apache=1234. (repeatable)
    -H attribute    Add Arbitrary header line, eg. 'Accept-Encoding: gzip'
                    Inserted after all normal header lines. (repeatable)
    -A attribute    Add Basic WWW Authentication, the attributes
                    are a colon separated username and password.
    -P attribute    Add Basic Proxy Authentication, the attributes
                    are a colon separated username and password.
    -X proxy:port   Proxyserver and port number to use
    -V              Print version number and exit
    -k              Use HTTP KeepAlive feature
    -d              Do not show percentiles served table.
    -S              Do not show confidence estimators and warnings.
    -g filename     Output collected data to gnuplot format file.
    -e filename     Output CSV file with percentages served
    -r              Don't exit on socket receive errors.
    -h              Display usage information (this message)

Let’s start by running it as:

ab -n10 -c1 http://localhost:5454/Home/Index

We’re basically issuing 10 sequential requests. The results:

Concurrency Level:      1
Time taken for tests:   0.041 seconds
Complete requests:      10
Failed requests:        0
Write errors:           0
Total transferred:      4390 bytes
HTML transferred:       260 bytes
Requests per second:    243.89 [#/sec] (mean)
Time per request:       4.100 [ms] (mean)
Time per request:       4.100 [ms] (mean, across all concurrent requests)
Transfer rate:          104.56 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.5      0       1
Processing:     1    4   2.4      3       9
Waiting:        1    3   2.4      3       9
Total:          1    4   2.7      3      10

Percentage of the requests served within a certain time (ms)
  50%      3
  66%      4
  75%      5
  80%      6
  90%     10
  95%     10
  98%     10
  99%     10
 100%     10 (longest request)

Almost every metric here is useful and should be easy enough to understand. Typically the main metrics that I look for are:

  • Requests per second
  • The mean time per request
  • The 95th percentile -> meaning in this case that 95% of the requests took 10 ms or less (this metric is commonly used in SLA specifications).
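To make that last metric concrete, here’s a minimal Python sketch of the nearest-rank percentile method, using made-up latencies that mirror the run above (ab’s exact rounding may differ):

```python
# A rough sketch (not ab's exact algorithm) of how the percentile table is
# derived from raw per-request times. The latencies below are invented to
# resemble the run above.
import math

latencies = sorted([3, 3, 3, 4, 4, 5, 6, 10, 10, 10])  # ms

def percentile(sorted_values, pct):
    """Nearest-rank percentile: smallest value with pct% of samples at or below it."""
    rank = math.ceil(pct / 100 * len(sorted_values))
    return sorted_values[rank - 1]

print(percentile(latencies, 95))  # -> 10, i.e. 95% of requests took 10 ms or less
```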

Let’s make things more interesting. I’m going to change my action to be:

public ActionResult Index()
{
    // Simulate half a second of blocking work
    Thread.Sleep(500);
    return View();
}

Now, running the same command again:

ab -n10 -c1 http://localhost:5454/Home/Index


Requests per second: 1.99
Time per request: 503.29 (mean)
95%: 504 ms

Makes perfect sense. We’re issuing sequential requests and every one takes half a second to execute.

Now, if we add parallelism to the equation:

ab -n10 -c10 http://localhost:5454/Home/Index


Requests per second: 9.89
Time per request: 1011.058 (mean)
95%: 1010 ms

The performance starts to deteriorate. Although it’s processing almost 10 requests per second, the mean time per request has surpassed 1 second.

And we can even make it worse:

ab -n100 -c100 http://localhost:5454/Home/Index


Requests per second: 16.40
Time per request: 6099.349 (mean)
95%: 6069 ms

Almost 6 seconds per request!

Now, suppose that our SLA stated that we needed to handle 95% of the requests in less than 1 second. Our capacity would be around 10 simultaneous requests.
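As a sanity check on these numbers, Little’s Law ties together the metrics ab reports: throughput ≈ concurrency / mean latency. A small Python sketch using the measurements above:

```python
# Little's Law sanity check: throughput = concurrency / mean latency.
# The inputs are the ab measurements from the runs above.
def throughput(concurrency, mean_latency_s):
    return concurrency / mean_latency_s

# 10 concurrent clients at ~1.011 s mean per request -> ~9.89 req/s (ab: 9.89)
print(round(throughput(10, 1.011), 2))
# 100 concurrent clients at ~6.099 s mean per request -> ~16.4 req/s (ab: 16.40)
print(round(throughput(100, 6.099), 2))
```

The interesting part is that throughput grows far more slowly than concurrency: piling on clients mostly inflates latency.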

It’s not the focus of this post to try and optimize this code, but just for the fun of it, let’s try to understand what’s happening here.

While executing the Thread.Sleep, the thread is literally blocked for half a second, waiting to resume execution. Obviously your own code would not have a Thread.Sleep (I hope), but the same principle applies while waiting for a synchronous query to the database, a synchronous filesystem operation, etc.

Let’s make our operation asynchronous to see if it does improve the capacity of our application. Fun, fun, fun, let’s use the new async/await goodness to create an asynchronous action. The code becomes just:

public async Task<ViewResult> Index()
{
    await Task.Delay(500);
    return View();
}

Freaking awesome. Let’s test it with the same parameters as before:

ab -n10 -c1 http://localhost:5454/Home/Index


Requests per second: 1.96
Time per request: 510.129 (mean)
95%: 517 ms

ab -n10 -c10 http://localhost:5454/Home/Index


Requests per second: 19.16
Time per request: 522.030 (mean)
95%: 521 ms

ab -n100 -c100 http://localhost:5454/Home/Index


Requests per second: 162.07
Time per request: 617.035 (mean)
95%: 602 ms

[Chart: Mean Time per Request (ms)]
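The same effect can be reproduced outside ASP.NET. Here’s a hedged Python/asyncio sketch where the “request handler” awaits instead of blocking, so 100 concurrent requests complete in roughly the time of one:

```python
# Sketch: an awaited delay (analogous to Task.Delay) doesn't hold a worker,
# so concurrent "requests" overlap. 100 handlers x 0.05 s finish in ~0.05 s,
# not the 5 s a blocking sleep on a single worker would take.
import asyncio
import time

async def handler():
    await asyncio.sleep(0.05)  # non-blocking wait, like Task.Delay

async def run(n):
    start = time.perf_counter()
    await asyncio.gather(*(handler() for _ in range(n)))
    return time.perf_counter() - start

elapsed = asyncio.run(run(100))
print(f"100 concurrent requests took ~{elapsed:.2f}s")
```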

So, as you can see, load-testing is very important. What I’ve shown here is just the tip of the iceberg. In my next post I’m going to show a much more powerful tool called JMeter.


Although ApacheBench provides some simple and useful metrics, it’s a little bit limited. In this post I’m going to talk about a much more powerful tool for load-testing: JMeter (also from the Apache Software Foundation).

I’ll start by replicating the previous experiment that I did with ApacheBench, but using JMeter, and then gradually add more complexity to it.

Let’s get started:

  • Extract the archive somewhere on your disk and run the <folder>/bin/jmeter.bat file (Java required)

JMeter is not a particularly intuitive application. When starting you get this home screen.

You have a Test Plan node and a Workbench. The Workbench works like a sandbox: nothing you place there is saved implicitly with your tests. According to the JMeter manual, the Workbench is:

“(..) a place to temporarily store test elements while not in use, for copy/paste purposes, or any other purpose you desire”

Yeah, a little bit weird. Anyway, the Test Plan is where you place the test itself. If you right-click it you might be a little overwhelmed with all the available options, but the JMeter site does a really nice job of explaining each of them. You can check the manual at http://jmeter.apache.org/usermanual/.
Step 1: Simple HTTP GET Requests
Now, the typical basic JMeter test has a Thread Group, a Sampler and a Listener. In the context of our first test this means:
  • Thread Group: configuration of the number of users and number of executions. Similar to the parameters we tweaked in ApacheBench
  • Sampler: the HTTP Request configuration
  • Listener: handles the results and produces result metrics
So, without further ado, let’s create all these items:


  • Test Plan > Add > Threads (Users) > Thread Group


  • Let’s create 10 users, where each launches 10 simultaneous requests.

So, we’ll have a total of 100 requests. The ramp-up period is very useful in more realistic testing scenarios and represents the interval between each thread (user) starting its requests. In this case, as we’re trying to mimic the behavior of our ApacheBench test from my previous post, I’ve set it to 0 (zero).


  • Thread Group > Add > Sampler > HTTP Request
  • Fill the request information. Obviously the server name, port number and path may vary.


  • Thread Group > Add > Listener > Summary Report
  • Launch the Test
  • The Summary Report shows basic information regarding our test.

As you can see, 100 samples were produced with an average time of 516 ms. There are LOTS of other result listeners, allowing you to show information about each request, export to CSV, draw graphs, etc.

Step 2: HTTP GET Requests with a dynamic parameter and measuring success

We’ll pick up the previous example and send a dynamic sequential parameter, created by a JMeter counter. Then, the server will return a success or error message depending on whether the number is odd or even. So, we’ll add the following to the existing example:

  • Counter
  • Tree Listener
  • Response Assertion

First of all, we’ll have to change the ASP.NET MVC project.

The action implementation will be:

public async Task<ViewResult> Index(int id)
{
    await Task.Delay(500);

    this.ViewBag.Id = id;
    this.ViewBag.Success = id % 2 == 0;

    return View();
}

And the view:

@{
    ViewBag.Title = "Index";
}

<p>@(ViewBag.Success ? "Success" : "Error")</p>

<p>Item ID: @ViewBag.Id</p>

So, opening the page should show a different result depending on whether the parameter is odd or even:

Regarding JMeter:


  • Thread Group > Add > Config Element > Counter
  • Fill in the counter information. Basically, it’ll increment its value by one on each iteration and store the result in a variable called COUNTER.
  • Let’s change our HTTP request to use this counter value. To use variables in JMeter, the ${VARIABLE} notation is used. Thus, in our case, our HTTP request URL should become: /Home/Index/${COUNTER}
Tree Listener
This is my favorite result listener during development as it shows each request, including the full response.
  • Thread Group > Add > Listener > View Results Tree
  • Rearrange the items using Drag&Drop so that they become like this:
  • Launch the Test
Now, besides having the Summary Report as before, we can also see each request in the “View Results Tree” item. There we can confirm that each request is being sent with a different ID (from the counter) and that the response has a different message when using odd or even IDs.
Response Assertion
Now, sometimes we need to measure the success of a request according to the response we got. In this case, we’ll consider the request a success if the response contains the text “Success”.
  • Thread Group > Add > Assertions > Response Assertion
  • Configure the assertion
  • Now run the test. Half of the requests will be considered errors (the ones with odd IDs)

This response assertion thing is more important than you might think at first glance. Follow my advice: always have a Response Assertion in place to make sure your results aren’t tainted by responses that only look successful.
Now, this post is getting a little long, so I’ll wrap things up in a third post where I’m going to talk about some additional topics:
  • Using Variables and other misc stuff
  • Submitting HTTP POST (also, using the HTTP Proxy)
  • Handling authentication
  • Handling the ASP.NET MVC anti-forgery token
  • Loading data from a CSV file

As promised, in this post I’m going to explain some more JMeter stuff. So, I’ll show how to:

  • Submit information (HTTP POST)
  • Handle authentication
    • Particularly in the ASP.NET MVC context
  • Read information from a page and store it in a variable
    • Particularly to read the ASP.NET MVC anti-forgery token
  • Vary the posted data by reading it from a CSV file

Also, I’ll include some extra elements to make the test more maintainable and professional.

So, again, there’s lots of ground to cover, so let’s get started.

First, we need to create the application that will be tested. It’ll be pretty simple: a login screen and a form with a bunch of fields. After submitting it should show a success or error message according to the validation.

So, this is the login screen:

After supplying the correct credentials (meaning, password = “password”) a page is opened with 2 fields ready to submit feedback.

The page has validation set up

And after filling the two fields correctly a message appears and the form fields are cleared so that new feedback may be supplied.

That’s it. Now, there are some small details to take into consideration.

  • The web page has Forms Authentication. Pretty standard stuff: if the credentials are valid then create the authentication cookie.
  • I’m using an ASP.NET MVC mechanism to defend against cross-site request forgery by supplying a token in the HTML form that must be sent on the POST. This is done by using the [ValidateAntiForgeryToken] action filter on the action and an @Html.AntiForgeryToken() call in the view. So, if a POST request does not include the anti-forgery token, the server will reject the request and throw an exception.

Let’s create a pseudo-representation for our test-plan:

For each User
 +--- Login
 +--- Repeat n times
         +--- Submit Feedback
         +--- Wait a couple of seconds before submitting again

This will roughly translate to:

Now, as this is the second post on JMeter I’m going to jump some hoops and show the final test plan and then explain each of the elements.

This might seem a little scary but most of the elements should be really easy to grasp. I’m going to show the configuration for each one:

Test Configuration
It’s really useful to have all the configuration in one spot, typically at the start of the test. This way we can tweak stuff more easily.

HTTP Request Defaults
Instead of repeating the server and port on each request we can define defaults for those values. This is really a time-saver, particularly when switching environments.
HTTP Cookie Manager
The cookie manager is the only thing required to handle forms authentication. It makes sure that the authentication cookie is there, making your requests authorized after the login.
For Each User
Normal thread, but using variables instead of hardcoded numbers. The Loop Count is 1 because we have a Loop Controller inside.
Login
The login is a simple POST to a specific action on our website. The username is always supplied as “DummyUser” with “password” as the password.
Notice that the server name and port number are empty, as we’re using the HTTP request defaults.
Open Feedback Page
Just opening the feedback page before submitting. This is required because of the next action.
Regular Expression Extractor
The Request verification token is a hidden value in the form. Something like:
<input name="__RequestVerificationToken" type="hidden" value="........"/>
We have to load that field into a JMeter variable. Thus we use a Regular Expression Extractor, passing the following regular expression:

name="__RequestVerificationToken" type="hidden" value="([A-Za-z0-9+=/\-\_]+?)"
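For reference, here’s what that extraction does, sketched with Python’s re module (the token value below is invented for illustration):

```python
# What JMeter's Regular Expression Extractor does here: capture the hidden
# anti-forgery token from the HTML so it can be re-sent on the POST.
# The token value is made up.
import re

html = '<input name="__RequestVerificationToken" type="hidden" value="aB3+/=-_xYz"/>'
pattern = r'name="__RequestVerificationToken" type="hidden" value="([A-Za-z0-9+=/\-\_]+?)"'
token = re.search(pattern, html).group(1)
print(token)  # -> aB3+/=-_xYz
```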

Pretty self-explanatory.

Repeat n Times
This is the Loop Controller mentioned earlier: it repeats the inner requests n times for each user (thread).
Load data from file
It’s a good idea to have diversity on your tests. Thus, instead of having hardcoded values, we can read values from a CSV file and assign them to variables. These may then be used normally in JMeter.
For example, I’ve created a csv file, named “Feedback.csv” with this content:
Nice Site;I really enjoyed that movie
Pretty good;I liked it, but there's some stuff that I didn't really get
Awful;This is the worst piece of **** I've ever seen
Interesting;Nice stuff

…and many more lines. It should be enough to fulfill the number of users * number of requests.
One thing to note is the Sharing mode. With “All threads”, each request, for each user, will pick a new value from the CSV file.
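The parsing itself is straightforward; this Python sketch mirrors what the CSV element does with a “;” delimiter (file content shortened to two of the lines above):

```python
# Sketch of how the CSV data is split on the delimiter into the
# variables used by the Submit Feedback request.
import csv
import io

feedback = "Nice Site;I really enjoyed that movie\nInteresting;Nice stuff\n"
rows = list(csv.reader(io.StringIO(feedback), delimiter=";"))
for subject, body in rows:
    print(subject, "->", body)
```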

Submit Feedback
This is where the feedback itself is submitted. Notice that all the values use variables and that the Request Verification Token is also sent.
Check Result
Just checking if the response shows a success message or not.

Wait Before Next Request
If we’re simulating real users, we want to wait a little bit before issuing another request. I like the Gaussian Random Timer a lot because the delay is “controlled randomness”: across the test plan as a whole, the values follow a normal (Gaussian) distribution.
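The idea, sketched in Python (JMeter computes the delay as roughly a constant offset plus a Gaussian deviation; the 1000/300 values here are illustrative, not JMeter defaults):

```python
# Sketch of a Gaussian Random Timer: delay = constant offset + a normally
# distributed deviation, so think times cluster around the offset instead
# of being uniformly random.
import random
import statistics

def think_time_ms(offset=1000.0, deviation=300.0):
    return offset + random.gauss(0.0, 1.0) * deviation

random.seed(7)  # deterministic for the demo
samples = [think_time_ms() for _ in range(10_000)]
print(round(statistics.mean(samples)))  # clusters around the 1000 ms offset
```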

The scary part is that JMeter provides much more than what I’ve shown here. Anyway, I’ll wrap things up for now. If you have any doubts or suggestions, leave a comment.