Evaluate tests automatically with acceptance criteria

In the latest release (1.3.0) we made it possible to define your own acceptance criteria. What’s so great about this is that they allow you to evaluate tests automatically! For example, you can tell whether the average response time exceeded a certain threshold or whether the CPU was overloaded, without having to check it manually.

How to start using acceptance criteria:

  1. Make sure you are using the current version of SmartMeter.io (1.3.0 or newer).

  2. Open your test script in SmartMeter.io Editor.

  3. Add a new Config Element / et@sm - Acceptance Criterion (its location in the test plan does not matter).

After that, there are quite a lot of configuration options to choose from. Let’s take a look at some examples.

(You can also see the official documentation for a full description.)

Example 1

  • The average response time of each sampler (usually HTTP request) will not exceed 1200 ms.

This is quite a simple configuration. Just choose the “Response time” metric and the AVG aggregation function, and set the expected value to less than or equal to (<=) 1200.
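To make the check concrete, here is a minimal sketch (in Python, with made-up sample data and field names) of what this criterion effectively evaluates; SmartMeter.io performs this evaluation for you from the recorded results:

    # Sketch of the "AVG Response time <= 1200 ms per sampler" check.
    # The sample data and field names are made up for illustration only.
    from collections import defaultdict

    samples = [
        {"label": "Home page", "elapsed_ms": 850},
        {"label": "Home page", "elapsed_ms": 1400},
        {"label": "Search",    "elapsed_ms": 300},
    ]

    THRESHOLD_MS = 1200  # expected value: AVG <= 1200

    by_sampler = defaultdict(list)
    for s in samples:
        by_sampler[s["label"]].append(s["elapsed_ms"])

    for label, times in by_sampler.items():
        avg = sum(times) / len(times)
        print(f"{label}: AVG = {avg:.0f} ms -> {'PASSED' if avg <= THRESHOLD_MS else 'FAILED'}")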


Example 2

  • The error rate of “Basket” transaction must not exceed 10 %.

In this case we set Sample origin to “Transactions”. This means that all samplers belonging to the same transaction are evaluated together. We also set the filter to “equals to Basket”, so this acceptance criterion operates on the Basket transaction only.
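Again, just to illustrate the underlying logic, the following hypothetical sketch groups samples by transaction, filters the “Basket” transaction and compares its error rate with the 10 % threshold (field names and data are invented):

    # Sketch of the "error rate of the Basket transaction <= 10 %" check.
    # Field names and data are invented; SmartMeter.io evaluates the real results.
    samples = [
        {"transaction": "Basket", "success": True},
        {"transaction": "Basket", "success": False},
        {"transaction": "Basket", "success": True},
        {"transaction": "Search", "success": True},
    ]

    basket = [s for s in samples if s["transaction"] == "Basket"]  # filter: equals to Basket
    errors = sum(1 for s in basket if not s["success"])
    error_rate = 100.0 * errors / len(basket)

    print(f"Basket error rate: {error_rate:.1f} % -> {'PASSED' if error_rate <= 10.0 else 'FAILED'}")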


Example 3

  • The CPU utilization of the system under test will not exceed 75 % for more than 10 seconds.

This one is a little bit tricky. First off, we need to set up the et@sm - PerfMon Metrics Collector. It will provide the necessary data.

Make sure to set the correct IP address and port.
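If you are not sure the address and port are right, a quick connectivity check can save some head-scratching. The snippet below is only a generic TCP reachability test; the host and the port are examples, so use the values configured in your collector:

    # Generic TCP reachability check for the machine providing the monitoring data.
    # The host and port are examples; use the values from your PerfMon Metrics Collector.
    import socket

    HOST = "192.168.1.50"  # example IP of the system under test
    PORT = 4444            # example port of the monitoring agent

    try:
        with socket.create_connection((HOST, PORT), timeout=5):
            print(f"{HOST}:{PORT} is reachable")
    except OSError as exc:
        print(f"Cannot connect to {HOST}:{PORT}: {exc}")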


Now, you can add a new acceptance criterion.

As the Listener, choose the PerfMon Metrics Collector from the previous step.

An interesting option here is “Group samples by time frames (in seconds)”. It means that samples will be evaluated in 10-second frames; by default, samples are evaluated over the whole test run. This is a very useful option because otherwise short-term peaks would be averaged out. We recommend using this option for Response time as well.
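To show why the time frames matter, here is a small sketch contrasting a whole-run average with 10-second frames (the timestamps and CPU values are invented for illustration):

    # Sketch: 10-second time frames catch short CPU peaks that a whole-run average hides.
    # The (seconds, cpu_percent) samples below are invented for illustration.
    samples = [(0, 40), (5, 45), (12, 95), (15, 98), (22, 50), (28, 42)]

    THRESHOLD = 75.0
    FRAME_SEC = 10

    # Whole-run evaluation: the short peak disappears in the average.
    overall = sum(cpu for _, cpu in samples) / len(samples)
    print(f"whole run: AVG = {overall:.1f} % -> {'PASSED' if overall <= THRESHOLD else 'FAILED'}")

    # Per-frame evaluation: every 10-second frame must stay within the threshold.
    frames = {}
    for t, cpu in samples:
        frames.setdefault(t // FRAME_SEC, []).append(cpu)

    for frame, values in sorted(frames.items()):
        avg = sum(values) / len(values)
        status = "PASSED" if avg <= THRESHOLD else "FAILED"
        print(f"frame {frame * FRAME_SEC}-{(frame + 1) * FRAME_SEC} s: AVG = {avg:.1f} % -> {status}")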


Run the test

Once you have set up at least one acceptance criterion (one is the limit for the Light license), run the test and open the report. Scroll to the evaluation of acceptance criteria section.


Evaluation of the CPU < 75 % criterion


The green color means that all criteria have passed.

TIP: You can try tweaking the configuration to see some of the criteria fail. There’s no need to run the test again; just change the configuration and click the Generate report button again.


You can combine as many acceptance criteria as you wish, provided your license supports it.

Acceptance criteria are super useful for running performance tests as part of your continuous integration. Follow this tutorial to learn how to include performance testing in any continuous integration tool.

Now, try it on your own!