Documentation

Updated 7.6.2017 15:18

Introduction

What is SmartMeter.io

SmartMeter.io is a multi-platform load and performance testing tool. It features fast and easy test creation and execution, test management and test report generation, with a focus on testing in a distributed mode.

SmartMeter.io is based on Apache JMeter but adds new features such as one-click test reports, an advanced scenario recorder, a user-friendly distributed mode, acceptance criteria and many others. If you are new to JMeter, we recommend starting with the JMeter documentation first.

How to start

Start by downloading SmartMeter.io here and then obtain your free licence on the same page. To create your first performance test, navigate to Getting Started.

Recommended hardware

              Controller    Generator (min/optimum)
RAM [GB]      6             4/6
HDD [GB]      120           40
CPU [cores]   4             2/4

Software requirements

  • 64-bit operating system
  • Windows, Linux or OS X
  • Java is already embedded in SmartMeter.io

More learning resources

Getting Started

This chapter will guide you through the basic workflow of using SmartMeter.io. We assume that SmartMeter.io is already installed. If not, please see How to start first and then return here.

TIP: You can also watch video tutorials which cover test creation, editing and execution.

Welcome screen

Navigate to the SmartMeter.io home folder and run SmartMeter.exe (.sh on Linux or .command on Mac). The Welcome screen will open.

Welcome screen

Run example test

Keep the default values (empty Monitor script and test.jmx for Test script), click Start test and wait until SmartMeter.io starts. The test will start automatically.

Runner

The test will run for the set period of time or until you stop it by clicking the Stop Test icon in the upper left corner. Once the test is finished, click the Report icon to generate the test report. The report will automatically open in your default browser.

Basic test components

The following figure shows the tree structure of a typical test script. We highly recommend reading Elements of a Test Plan first.

Test plan

Record test script

Navigate back to the Welcome screen, click the Start Recorder button and wait until the Recorder is started.

*If running on OS X for the first time, you will need to add a security exception.   

Recorder start

Keep the default values for now and click the Start recording button. Once the SmartMeter.io homepage is loaded, navigate to the Features page. The test script is being created as you browse the site. Make a few more transactions, then click the Save test button, confirm the end of recording and wait until the editor is opened. It is often necessary to further modify scripts created by the Recorder so they can be executed in many threads.

Continue reading this documentation, and do not forget to check out our blog and YouTube channel. If in trouble, contact us at customerservice@smartmeter.io, or even better on StackOverflow.

Electron Script Recorder

VIDEO: New Load Test Scenario Recorder in SmartMeter.io

Electron Recorder is a utility program which greatly helps with creating test scripts for websites and web applications. It automatically captures all the HTTP(S) traffic between a client (web browser) and a server and converts it into a test script. Here is a list of its key features:

  • Simple to use (no proxy setup, no browser plugin, just start recording and get your test in just a few minutes),
  • Always included Recording log (full details of HTTP request/responses),
  • Automatic transactions and think times,
  • Mobile devices simulation,
  • Undo/Redo.

Create new test

Click Start Recorder on the Welcome screen.

Start window

  1. Test name - Name of your newly created test scenario

  2. URL of the tested site - The test scenario will start at the selected URL

  3. You can choose whether new transactions will be created automatically or manually

  4. Request filter

    1. The first option will record all requests

    2. When the second option is selected, the Recorder will collect resources only from the URL domain you selected above

    3. When the Advanced include is selected, new options will pop up.

  5. Resolution - Your desired screen resolution. This is useful for testing the responsive version of a site, which may use different resources (images etc.).

  6. User Agent Override - This allows you to change the user agent setting, e.g. to a different browser or operating system.

Request filter

Include - use to include requests to specific URLs, using the specified protocol. Place each expression on a separate line. You can use both include fields or just one, depending on your needs. When you click the Start Recording button, the content of this field is automatically transformed into a regular expression. It follows these rules:

  • If you write only a domain (e.g. google.com), every request and response with a URL containing this domain will be recorded, including URLs like apis.google.com. Both protocols (HTTP and HTTPS) will be captured.

  • If you also specify a subdomain (e.g. www.google.com), only communication under this exact URL will be recorded, still using both protocols (e.g. apis.google.com would be ignored).

  • If you write the whole URL (e.g. https://www.google.com), only communication on the given protocol (here HTTPS) and under the given URL (www.google.com) will be recorded. Everything else will be ignored. This happens when you copy & paste URLs, so keep in mind that this restricts the Recorder and the recorded test may not correspond to real user behavior.
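The three rules above can be sketched as regular expressions. The patterns below are illustrative only; the exact expressions the Recorder generates from the Include field may differ:

```java
import java.util.regex.Pattern;

public class IncludeRuleDemo {

	// Illustrative patterns only - the exact expressions SmartMeter.io
	// generates from the Include field may differ.
	static final Pattern DOMAIN_ONLY = Pattern.compile("^https?://([^/]*\\.)?google\\.com.*");  // "google.com"
	static final Pattern SUBDOMAIN = Pattern.compile("^https?://www\\.google\\.com.*");         // "www.google.com"
	static final Pattern FULL_URL = Pattern.compile("^https://www\\.google\\.com.*");           // "https://www.google.com"

	public static void main(String[] args) {
		String apis = "https://apis.google.com/js/api.js";
		String www = "http://www.google.com/search";

		System.out.println(DOMAIN_ONLY.matcher(apis).matches());  // true  - subdomains are included
		System.out.println(SUBDOMAIN.matcher(apis).matches());    // false - exact host only
		System.out.println(FULL_URL.matcher(www).matches());      // false - HTTPS only
	}
}
```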

Exclude - use to ignore requests to specific URLs. Suitable for excluding requests to a subdomain or excluding e.g. pictures or icons (^.*\.ico). They will not be part of the test. Use regular expressions and place each expression on a separate line.

Start Recording button - pressing this button starts recording. From this point, all requests made in the built-in browser will be recorded as part of the test script, subject to the Exclude and Include field values.

Test script Recorder

Once you have adjusted the settings, you are ready to begin recording. Let's have a look at the GUI.

Recorder

Recorder is used for creating test script scenarios while browsing a website in a built-in browser. Recorded transactions are shown in the left sidebar. A transaction usually encapsulates a bulk of requests initiated by a single action on the website (for example clicking a link or submitting a form), or it may encapsulate more actions (for example logging in to a website).

Transaction card

Transaction card

  1. “Plus” button - This button allows you to add a new transaction manually.

  2. Transaction Title - It is recommended to name the transaction, so you have a better idea of what you are working with when editing the scenario.

  3. Sleep time - It is measured automatically and can also be set manually. It represents the pause before the next transaction, simulating a human clicking through the site.

  4. Info bar - Shows information such as the number of requests, size and response time.

  5. Pause / Unpause sleep - A toggle button used for stopping the Sleep timer.

  6. Remove button - Used for deleting the transaction.

Main window

The main window in the GUI consists of three parts: Browser, Transaction detail and Search resources.

  1. Browser - If you wish to record a script, use this built-in browser. Each step you perform will be recorded as a transaction.

  2. Transaction detail - This gives you full detail about each recorded HTTP request/response.

  3. Search resources - Allows you to search for information across the test scenario.

Transaction detail

Transaction detail

The Transaction detail table shows all the resources recorded in a particular transaction. Resources omitted by the exclude filter are tinged gray. You can show/hide excluded resources with the checkbox in the top right corner. To show additional details (request / response headers, response body), click on the resource row.

Save script

When you are finished with the test script, move on to the editor. In the top right corner there is a main menu which contains elements for working with the scenario.

  1. Save the test - The test scenario is saved in JSON format and then converted to the SmartMeter.io (.jmx) format. Once saved, a new window will appear with the possibility of opening the JMeter based editor for editing the script and running the test. Scripts are saved to the tests folder by default.

  2. Test settings - The same window as in the beginning will pop up; however, you can only rename the test, change the resolution or manage the resources filter.

  3. Flush cache - This allows you to clear the cache, which will simulate a first visit to the website.

  4. Undo / Redo buttons - Serve as expected for undoing/redoing the last action performed.

Fallback to Chrome Script Recorder

Electron Script Recorder is the default option since SmartMeter.io 1.2.0. To switch back to the original Chrome Script Recorder, uncheck the New Script Recorder property in Configurator.

Fallback to Chrome Recorder

Chrome Script Recorder

This is the original test script recorder, which was replaced by the Electron Script Recorder in SmartMeter.io 1.2.0. To bring it back, see Fallback to Chrome Script Recorder.

An installed Chromium/Chrome is required on Linux and OS X.

Recorder is used to easily create test scripts while browsing a website. It captures user requests and server responses from the Chrome / Chromium web browser (this is done with the Recorder plugin). We recommend working with transactions. A transaction usually encapsulates a bulk of requests initiated by a single action on the website (for example clicking a link or submitting a form), or it may encapsulate more actions (for example logging in to a website).

Consider cleaning the browser cache before recording the script (the Google Chrome shortcut is CTRL + SHIFT + DEL). If the cache is clean, it simulates a first-time visit. On the other hand, if you first browse the tested web and do not clean the cache afterwards, you may simulate load caused by a regular visitor. Recorder runs on port 8090; if you want to quickly verify that Recorder is running, we recommend opening "localhost:8090/Recorder/recorder".

Click Start Recorder on the Welcome screen.

The figure below shows the Recorder control panel after its start.

smartmeter-io-recorder-numbered

  1. Exclude field - use to ignore requests to specific URLs. Suitable for excluding requests to a subdomain or excluding e.g. pictures or icons (^.*\.ico). They will not be part of the test. Use regular expressions and place each expression on a separate line.
  2. Include (URL address) field - use to include requests to specific URLs, using the specified protocol. Place each expression on a separate line. You can use both include fields or just one, depending on your needs. When you click the Start Recording button, the content of this field is automatically transformed into a regular expression. It follows these rules:
    • If you write only a domain (e.g. google.com), every request and response with a URL containing this domain will be recorded, including URLs like apis.google.com. Both protocols (HTTP and HTTPS) will be captured.
    • If you also specify a subdomain (e.g. www.google.com), only communication under this exact URL will be recorded, still using both protocols (e.g. apis.google.com would be ignored).
    • If you write the whole URL (e.g. https://www.google.com), only communication on the given protocol (here HTTPS) and under the given URL (www.google.com) will be recorded. Everything else will be ignored. This happens when you copy & paste URLs, so keep in mind that this restricts the Recorder and the recorded test may not correspond to real user behavior.
  3. Include (regex) field - use to include requests to specific URLs. See the example below the field. The example regular expression defines an exclusive inclusion of links from the example.com domain. All requests to other domains will therefore be ignored (for example requests related to Google Analytics). Place each expression on a separate line. You can use both include fields or just one, depending on your needs.
  4. Start Recording button - pressing this button starts recording and transforms the content of the URL address include field into a regular expression. From this point on, all requests made in the Chrome / Chromium browser will be recorded as part of the test script, subject to the Exclude and Include field values. After pressing the Start recording button, the message "For full recording log press F12 in Chrome Browser!" is displayed. Pressing F12 opens the Developer toolbar. This is not necessary but strongly recommended in order to capture complete request and response data. Without the Developer toolbar open, no Recording log will be available later in the test scenario.

After pressing the Start recording button, the control panel will look like the figure below:

MY_RECORDER

  1. Stop recording button - finishes the recording process. Press when you are done with the test script. No more requests will be recorded after that.
  2. Transaction name field - enter the name of a transaction here. Transactions are explained in more detail at the beginning of this sub-chapter.
  3. Start transaction button - starts a new transaction. The transaction is named according to the Transaction name field (2). After starting the transaction it is the right time to perform an action in the browser.
  4. Stop transaction button - ends the current transaction (press after your action in the browser is completed).
  5. Sleep field - enter the time (in seconds) a user is supposed to wait after performing the transaction.
  6. Insert sleep button - inserts the sleep time from the Sleep field (5) into the test script.

Basic workflow

Chrome Script Recorder has two parts - the Recorder control panel and a web browser. The control panel controls the recording process (settings, start, stop and so on) while the browser captures your activity and converts it into a test script. First, fill one of the include fields (either the URL or the regular expression) with the SmartMeter.io homepage. The content of the include fields specifies the protocol and domain which will be recorded.

recorder-pre-start

Leave the rest of the fields blank for now and just click the Start Recording button. You will be prompted to open the Developer toolbar. This is recommended because the Developer toolbar allows Recorder to capture the bodies of requests and responses. Click OK to dismiss the dialog and hit the F12 key in the Recorder web browser.

smartmeter-io-recorder-recording

You will be notified about transactions. Name your first transaction, for example "SmartMeter.io Homepage", and click the Start Transaction button. Now all recorded communication will also be logically wrapped under this transaction name until you click Stop Transaction. Recorded transactions are visible as lines in most graphs during a test run. A transaction logically represents one user action, so by using transactions you can organize your test according to your needs. Now it is finally time to record our first test!

Focus on the browser now (Chromium on Windows / Linux, Chrome on Mac). Make sure that you are using the browser launched by Recorder. Do not use your standard browser; it will not work since it does not have the necessary plugins loaded! Visit https://www.smartmeter.io/. When the site is fully loaded, click Stop Transaction. You should see a list of recorded HTTPS requests wrapped under the transaction name in the Recorded scenario box.

smartmeter-io-recorder-transaction-stop

Now let's record one more transaction so there can be some comparison during the analysis of the test. The procedure is the same as before. Name your transaction, for example "SmartMeter.io Blog", click Start Transaction, in the browser click the Blog link in the menu, and when the page loads click Stop Transaction. You should end up with two lists of recorded requests.

smartmeter-io-recorder-transaction2-stop

Now you have recorded all you need; click the Stop Recording button and confirm your choice. Two new input fields will appear. Set the number of concurrent users to 5 and the test duration to 60 (secs). Create the test by clicking Create & Edit test.

smartmeter-io-recorder-after-rec

After a short while, the JMeter based editor will be launched. You may close it right away since we do not want to make any changes in this simple scenario. You may close Recorder and the web browser as well and then switch back to the Welcome screen.

Tip: More advanced tutorial on how to record a test is available on our blog and also on YouTube.

Recorder extension API

The following features are related to Recorder and can greatly reduce the amount of manual work while preparing a test. The idea is to set some rules before recording the test. Those rules are then evaluated for each request. A typical example is a rule for extracting a value from a response and using it in the following requests.

In case you are writing a custom extension in Java, put the JAR with the compiled classes into the custom folder and register the component (replacer, correlator or assertion) in custom/smartmeter.properties using the fully qualified name of the component (for example foo.bar.MyReplacer).
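For illustration, the registration in custom/smartmeter.properties could look like this (foo.bar.MyReplacer and foo.bar.MyAssertion are placeholder class names, as in the example above):

```properties
# custom/smartmeter.properties
# register a custom replacer (parameter etn_recorder_replacers)
etn_recorder_replacers = foo.bar.MyReplacer
# register a custom assertion (parameter etn_recorder_assertions)
etn_recorder_assertions = foo.bar.MyAssertion
```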

Replacing dynamic values in requests

Some parts of a request might be dynamic and must be changed with each test run or even with each new request (for example a timestamp or some one-time tokens). We can split such dynamic values into two groups. The first one contains dynamic values which can easily be generated (a timestamp). The second one is a little more complicated because the dynamic values must first be extracted from a response (typically some one-time verification tokens). If you are interested in the second group, see Automatic correlations after reading this chapter.

Values can be replaced in the URL, request headers or body. The original value is removed and replaced with a placeholder (${PLACEHOLDER}).

Create your own replacer by implementing the cz.etnetera.jmeter.recorder.replacement.Replacer interface which is part of the Recorder API (located in programs/SmartMeter/extras/RecorderInterfaces.jar). Put the custom replacer (packaged as JAR) into the custom folder in the root of the SmartMeter directory. Do not forget to register the custom replacer in Configurator (tab Recorder, parameter etn_recorder_replacers) using its fully qualified name.

Example - substitution of timestamp parameter in URL

etn_recorder_replacers = cz.etnetera.smartmeter.replacer.timestamp.TimeStampReplacer

import java.util.regex.Pattern;

import cz.etnetera.smartmeter.recorder.replacement.ReplaceType;
import cz.etnetera.smartmeter.recorder.replacement.Replacer;

public class TimeStampReplacer implements Replacer {

	private static final String TS_PARAM = "_ts=";
	private static final String PARAM_DELIMITER = "&";
	private static final String VAR_NAME = "${timestamp}";

	@Override
	public String replace(final ReplaceType type, String text) {
		// only URL paths containing the _ts parameter are of interest
		if (ReplaceType.PATH.equals(type) && text != null && text.contains(TS_PARAM)) {
			// take everything after "_ts="
			String replace = text.split(Pattern.quote(TS_PARAM))[1];
			if (replace.contains(PARAM_DELIMITER)) {
				// _ts is followed by another parameter - cut the value at the delimiter
				replace = replace.split(Pattern.quote(PARAM_DELIMITER))[0];
			}
			// substitute the recorded timestamp with the ${timestamp} placeholder
			text = text.replace(TS_PARAM + replace, TS_PARAM + VAR_NAME);
		}
		return text;
	}
}
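To see what the replacer does to a URL, here is a self-contained sketch of the same substitution logic without the SmartMeter.io API (it also handles the case where _ts is the last parameter); the URL is made up for illustration:

```java
import java.util.regex.Pattern;

public class TimeStampReplacerDemo {

	// standalone sketch of the timestamp substitution, without the SmartMeter.io API
	static String replaceTimestamp(String text) {
		final String TS_PARAM = "_ts=";
		final String DELIM = "&";
		if (text != null && text.contains(TS_PARAM)) {
			// value of _ts: everything after "_ts=", cut at the next parameter if present
			String value = text.split(Pattern.quote(TS_PARAM))[1];
			if (value.contains(DELIM)) {
				value = value.split(Pattern.quote(DELIM))[0];
			}
			text = text.replace(TS_PARAM + value, TS_PARAM + "${timestamp}");
		}
		return text;
	}

	public static void main(String[] args) {
		System.out.println(replaceTimestamp("/api/data?_ts=1496815080&view=full"));
		// prints /api/data?_ts=${timestamp}&view=full
	}
}
```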

Automatic correlations

Automatic correlations extend the capability of Replacing dynamic values in requests by automatically extracting the dynamic parts from a response. All future occurrences of such values in requests are replaced with a placeholder (variable).

Note: The test scenario must contain the Recording log (activated by pressing F12 before its recording)!

Built-in correlation mechanisms
  • cz.etnetera.jmeter.recorder.correlation.BoundaryExtractorCollerator - extracts values using left and right boundaries. Configure it either in Configurator or directly by editing custom/smartmeter.properties.

Example - automatic correlation of Vaadin security key

Example of response containing Vaadin security key:

{
  "v-uiId":2,
  "uidl":"{\"Vaadin-Security-Key\":\"bda6f482-785b-4c67-a7d9-cbc5c3b679ee\"}"
}

We want to ensure automatic correlation of the value bda6f482-785b-4c67-a7d9-cbc5c3b679ee by replacing it in future requests with the variable ${Vaadin-Security-Key} and inserting a Boundary Body Extractor at the point where the value is to be retrieved. The Boundary Body Extractor uses the left and right boundaries to extract the key.

Add the following configuration to custom/smartmeter.properties (or use Configurator):

# register correlation
etn_recorder_correlations = vaadin_seckey
# definition of the used mechanism; may also be the path to your own
# Correlator implementation, separated by a semicolon
etn_recorder_correlation_vaadin_seckey_mechanism = BoundaryExtractor
# variable name that will be used in the test.
# If multiple instances of correlation were found, the variable will be automatically numbered
etn_recorder_correlation_vaadin_seckey_varName = Vaadin-Security-Key
# left border of correlated variable in the response
etn_recorder_correlation_vaadin_seckey_lb = Vaadin-Security-Key\":\"
# right border of correlated variable in the response
etn_recorder_correlation_vaadin_seckey_rb = \"
# order of occurrence
etn_recorder_correlation_vaadin_seckey_match = 1
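The extraction that the configuration above describes can be sketched in plain Java: take the text between the left boundary and the right boundary (the boundary strings below are the ones from the properties above):

```java
public class BoundaryExtractDemo {

	// Extracts the first occurrence of text between the left and right boundary,
	// mimicking what the Boundary Body Extractor configuration describes.
	static String extract(String body, String lb, String rb) {
		int start = body.indexOf(lb);
		if (start < 0) return null;
		start += lb.length();
		int end = body.indexOf(rb, start);
		return end < 0 ? null : body.substring(start, end);
	}

	public static void main(String[] args) {
		// the example response, as a runtime string
		String response = "{\"v-uiId\":2,\"uidl\":\"{\\\"Vaadin-Security-Key\\\":\\\"bda6f482-785b-4c67-a7d9-cbc5c3b679ee\\\"}\"}";
		// boundaries as configured: left = Vaadin-Security-Key\":\"  right = \"
		System.out.println(extract(response, "Vaadin-Security-Key\\\":\\\"", "\\\""));
		// prints bda6f482-785b-4c67-a7d9-cbc5c3b679ee
	}
}
```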

Example - custom correlator

A custom correlator is suitable for more complex cases, for example when the left / right boundary changes between responses. Imagine a button whose description or identifier changes according to the page on which it appears.

The process of creating a custom correlator is analogous to creating a custom replacer. Your custom correlator must implement the cz.etnetera.smartmeter.recorder.correlation.Correlator interface. Do not forget to register the custom correlator in the Configurator (tab Recorder, parameter etn_recorder_correlations).

Example

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

import cz.etnetera.smartmeter.recorder.correlation.BoundaryExtractorSettings;
import cz.etnetera.smartmeter.recorder.correlation.CorrelationResult;
import cz.etnetera.smartmeter.recorder.correlation.Correlator;

public class ExampleCorrelator implements Correlator {

	private static final String LB = "value=\"";
	private static final String VAR_NAME = "action";
	private static final List<String> actions = new ArrayList<>();

	static {
		actions.add("\" >Login");
		actions.add("\" >Registrovat");
	}

	@Override
	public CorrelationResult find(final String correlation, final String content, final Map<String, String> headers) {
		for (final String action : actions) {
			// find out which of the listed actions is present on the page
			if (content != null && content.contains(action) && content.contains(LB)) {
				try {
					// extract the value to be replaced in the recorded scenario
					final String value = content.split(Pattern.quote(LB))[1].split(Pattern.quote(action))[0];
					// return the result containing the variable name, the value to replace
					// and the extractor settings
					return new CorrelationResult(VAR_NAME, value, new BoundaryExtractorSettings(VAR_NAME, LB, action, 1));
				} catch (final ArrayIndexOutOfBoundsException t) {
					return null;
				}
			}
		}
		return null;
	}
}

Response assertions

The process of creating a custom assertion is analogous to creating a custom replacer. Your custom assertion must implement the cz.etnetera.smartmeter.recorder.correlation.Assertion interface. Do not forget to register the custom assertion in the Configurator (tab Recorder, parameter etn_recorder_assertions).

Built-in assertions
  • cz.etnetera.jmeter.recorder.assertion.TitleAssertion - adds automatic asserts on <title> tag.

Test Report

SmartMeter.io offers detailed HTML test reports in one click. Once the test is finished, click the Report icon on the main toolbar. The report will open automatically in your default web browser. Reports are saved to the reports folder by default.

Result Analysis

See example

  • Test script - backup of test script, for easy retesting
  • Results backup - archive with raw test results (JTL)
  • Test duration
  • Test plan comment - custom test plan comment, great for saving additional information about this test run (tuning trials, state of the database, ...); can be edited even during the test
  • Active Thread Groups - virtual users configuration
  • Statistics Summary - maximum number of virtual users, total hits, average hits per second, total errors
    • Requests Summary - statistical overview of all requests/transactions 
    • Successful Requests Summary
    • Failed Requests Summary
    • Responses Summary - overview of HTTP response codes and/or exceptions
  • Errors - overview of errors, with links to full detail about the error
  • Acceptance criteria - evaluation of acceptance criteria
  • Graphs

Trend Analysis

Since 1.4.0

Long-term comparison of repeated test runs.

Trend Analysis works out of the box. SmartMeter.io automatically detects previous test runs in the reports folder and uses them for comparison. Test runs are grouped by test script name (the name of the .jmx file). In case of major changes in the test script, we recommend versioning its name to start fresh.

If you want to exclude a test run from comparison, either remove its report folder or delete the data/test-summary.json file from it.

Backups & Data

SmartMeter.io automatically backs up the test script (.jmx) file and an archive of result files (.jtl) to the REPORT/backups folder. You can easily run the same test again or analyze the test logs in more depth.

More useful data (for example summary CSV files, graph images or the test run summary) can be found in the REPORT/data folder.

Additional report generation

Usually, you generate the report after the test, but this is not the only option. You can also use et@sm - Report Generator component or run generateReport from command line to generate the report at any time (even in a batch).

Configuration

Extra configuration options are available in Configurator - tabs report and report-graphs. 

TIP: Generating a report after a long-running distributed test can take a lot of time. You can speed things up by omitting some of the graphs (property etn_report_create_graphs).

Graphs Overview

This chapter describes essential performance testing metrics to give you insight while analyzing the test results.

Hits Per Second

The sum of all processed requests (more precisely, every called sampler) sent to a server per second. A hit is not counted until the server responds.

The shape of this graph should be approximately the same as the Threads State Over Time graph, which captures the loading of virtual users.

Hits Per Second SmartMeter.io

What to look for:

  • There is a substantial decrease in hits at a specific moment. That means the tested system is getting overloaded or even fails to respond.

Relationship:

  • When the response time increases (Response Times Over Time graph), this graph decreases.

Response Codes Per Second

Types of response codes sent from a server at a specific time.

Response codes of 500 mean that there are errors in the tested system.

What to look for:

  • There are both 200 and 500 response codes in the graph. That indicates that only a part of the system is no longer able to respond while the other part is still functional. The system might be using a load balancer or two backends and one of them stopped working.

  • Compared to a previous test run, there is a substantial increase in the number of responses with redirect code 301 or 302. That indicates that some changes were made in the system since the last test run.

Relationship:

  • When the number of hits decreases (the server wasn’t able to respond), the response time (Response Times Over Time graph) increases.

Response Codes Per Second SmartMeter.io

Response Times Over Time

The graph shows the response time for requests within transactions (test steps in a test scenario).

Each transaction usually includes several requests to a server. In a perfect state this graph would be horizontal and flat.

What to look for:

  • One of the transactions suddenly takes longer to respond while the others don’t. That indicates the cause is specific to that one transaction and not to the system as a whole.

  • All transactions have slow responses (graph peaks) at regular intervals. That might indicate a regular task on the server rather than an error.

Response Times Over Time SmartMeter.io

Response Times Over Time Aggregate

The graph shows aggregate response time of all requests.

In the perfect state the graph closely copies the X axis.

A substantial response time increase (peak) is easier to spot in this aggregate view than in the separate view (Response Times Over Time graph).

It is also easier to spot periodic deviations in response time, which might be caused by regular system tasks rather than by an error.

Response Times Over Time Aggregate SmartMeter.io

Threads State Over Time

The number of all active virtual users. Shown separately for each thread group and generator.

The graph confirms that virtual users were really generated as defined.

Users are divided according to load generators and the activities they perform (the first group only visits the index page, the second group signs in, etc.).

What to look for:

  • The graph has a different shape than the one defined during test editing.

Threads State Over Time SmartMeter.io

Threads State Over Time Aggregate

The sum of all active virtual users.

The graph confirms that virtual users were really generated as defined.

What to look for:

  • The graph has a different shape than the one defined during test editing.

Threads State Over Time Aggregate SmartMeter.io

Transactions Per Second

How many times each transaction was called.

A transaction can be interpreted as a test step in a test scenario, such as visiting the index page, inserting a product into the cart, a successful login, a failed login, ...

In a perfect state the graph increases the same way as the number of virtual users.

What to look for:

  • Virtual users keep increasing and the system is no longer able to process every transaction. The graph stops increasing.

Transactions Per Second SmartMeter.io

Transactions Per Second Aggregate

The sum of all completed transactions.

Transactions are divided into successful and failed.

A transaction can be interpreted as a test step in a test scenario, such as visiting the index page, inserting a product into the cart, a successful login, a failed login, ...

In a perfect state the graph increases the same way as the number of virtual users.

What to look for:

  • The graph stops increasing because virtual users keep increasing and the system is no longer able to process every transaction.

  • There are many failed transactions in the graph.

Transactions Per Second Aggregate SmartMeter.io

Concurrency Over Time

The number of concurrent threads hitting the server.

The requests that the system is unable to process start forming a queue, which is represented in this graph. The size of the queue is limited by system resources and after some point errors start to occur.

NOTE: This graph is turned off by default because its creation is very time-consuming.

What to look for:

  • The number of concurrent threads increases considerably faster than the number of virtual users (Threads State Over Time Aggregate graph). That indicates the system is unable to respond and the requests pile up.

  • If these peaks occur at regular intervals, they may not indicate an error but, for instance, an unrelated planned system task.

Concurrency Over Time SmartMeter.io

Concurrency Over Time Aggregate

An aggregate view on the number of concurrent threads hitting the server.

Threads are divided into successful and failed. The graph can also be used when testing two different systems within one test.

Concurrency Over Time Aggregate SmartMeter.io

Bytes Throughput Over Time

Received and Sent Bytes per second.

The shape of this graph should be approximately the same as the Threads State Over Time graph, which captures loading of virtual users.

What to look for:

  • The graph does not follow the load of virtual users: the system ran out of resources or the maximum network throughput was reached.

Bytes Throughput Over Time SmartMeter.io

Response Times Percentiles

The percentage of requests processed within a given time (e.g., 60 % of requests were processed within 10 seconds).

Every transaction from the scenario is displayed separately.

NOTE: There is an error on the X axis. Instead of time, it should show percentages (10 %, 20 %, 30 %, …, 100 %).

Response Times Percentiles SmartMeter.io

Response Times Percentiles Aggregate

The percentage of requests processed within a given time (e.g., 60 % of requests were processed within 10 seconds).

Transactions are shown in an aggregate view.

NOTE: There is an error on the X axis. Instead of time, it should show percentages (10 %, 20 %, 30 %, …, 100 %).

Response Times Percentiles Aggregate SmartMeter.io

PerfMon

System resources utilization on Controller, generators or any other monitored environment.

  • CPU in %

  • Disk space used in %

  • Memory used in %

  • Network I/O in MB

What to look for:

  • When the CPU is overloaded, test results will not be accurate. To get reliable results with the same test settings, it is necessary to add more hardware resources.

  • It is also possible to run a test without real-time graphs to save hardware resources (the Concurrency Over Time graph is particularly resource-demanding).

Controller SmartMeter.io

Distributed Mode

A single computer simply cannot provide enough power when simulating a large number of virtual users. The solution is to connect more computers together and create a distributed network. This, of course, brings many obstacles such as correct environment setup, secure communication between nodes, concurrency, combining test results together and managing the whole process in general. The good news is that SmartMeter.io was designed to run in a distributed mode and is capable of solving all of the mentioned issues with ease. All you have to do is pick one of the predefined monitor scripts and configure your environment-specific variables.

The following picture shows the overall architecture of distributed tests.

Distributed mode architecture

  • Monitor - A dedicated instance of SmartMeter.io (precisely, a single-user test) which sets up the distributed environment (starts load generators, DataServer, Controller, ..). It also receives and visualizes real-time data from Agents.
  • Controller - Second instance of SmartMeter.io started by Monitor. It starts/stops the distributed test and receives and visualizes real-time data from load generators.
  • Generators - Servers generating the load by executing the test scenario. Use more generators to increase performance and reliability. One generator server may host multiple generator instances.
  • Tested system - System under test.
  • Data server - The data server provides data for generators. It is usually hosted on the Controller and solves the issue of splitting and copying CSV data files to load generators.
  • Agent - A small application that monitors the load on the environment (CPU, RAM, IO, networking) and reports back to the Monitor. Agents usually run on each server. More information can be found in the PerfMon Server Agent's documentation http://jmeter-plugins.org/wiki/PerfMonAgent/.

The Controller automatically transfers the test script to all load generators and starts the test. Once the test is started, the Controller receives live test results from load generators. Complete log files are downloaded once the test is finished in order to restrict communication during the test, so the performance isn't artificially impaired. Generators make requests to the tested system, measure the speed and process responses.

NOTE: Always use the same version of SmartMeter.io for Controller and load generators!

Read more about remote testing in the JMeter Documentation.

VIDEO: Web performance and load (stress) testing process with SmartMeter.io

Monitor script

Each distributed test consists of a monitor script and a test script.

Monitor and test script

They are two separate JMX files. The monitor script is responsible for setting up the distributed environment (starting performance monitoring agents, generators, DataServer and the Controller with the test script). The Monitor is a running instance of SmartMeter.io executing the monitor script, usually with a single thread. It runs on the same machine as the Controller.

Optionally, the Monitor can start an actual Internet browser and measure the speed of loading and rendering the page as viewed by a real user. The scope of started applications is not limited and can be adjusted to the given situation. The monitor script is a standard script (.jmx) and is created / edited in the SmartMeter.io editor.

The test script is a standard script and does not differ from a non-distributed test in any way.

The following picture shows an example of monitor script.

monitor

  1. Generator setup
    1. connect to server hosting generators over SSH
    2. kill all java processes (just to ensure a fresh start, not necessary)
    3. do port forwarding over SSH (to secure communication between Controller and generators)
    4. start generators (2 instances here)
    5. start performance monitoring agent
  2. Controller setup
    1. start performance monitoring agent
    2. start DataServer
    3. import test data to DataServer
    4. run distributed test
  3. View Results Tree - overview of all executed steps (helps to check everything went as expected)
  4. Monitor - runs in an infinite loop and collects results from local performance agent (also keeps the Monitor running until the end of the test)
  5. Listeners - listens for real-time test results
  6. User Defined Variables - environment specific variables (path to SSH key, test script to run, generators location, ..)

Controller and generators

Unlike JMeter, SmartMeter.io always runs in at least 2 instances. One is the controller and the rest are generators. The controller communicates with all generators and controls the test process from one place. Generators generate the test load (virtual users). One computer can host all instances, or they can be spread across many computers. It is recommended to use multiple computers in order to avoid sharing system resources (especially between the controller and generators). Shared resources can negatively impact the measured results.

Controller

  • starts / stops the test

  • collects and combines results from all generators

  • generates the final report

Generator

  • generates load

  • sends results back to controller

Communication

The communication is bidirectional. The controller tells generators to start and stop the test; the generators send back measured data. More precisely, only aggregated data are sent during the test; full data are transferred once the test is finished. This helps to increase performance but may result in long report generation afterwards. See the picture below for clarification.

komunikace controller-generator

 

SSH communication

The preferred way of communication is over SSH. The main advantage is that you do not need to allow any extra ports on your firewall because ports are opened using SSH tunnels. You only need to locally forward the RMI registry and server ports and remotely forward the RMI client port. SmartMeter.io contains a component which will help you set this up in your configuration script.

No-SSH communication

If SSH is not available (for example when running on Windows servers), you need to make sure the ports are open and configure RMI manually. Do not worry, it is not overly complicated and SmartMeter.io components will help you out. See the SSHRunGenerator and RunDistributedTest requests.

DataServer

DataServer is a web application that provides and collects data during distributed tests. Its main purpose is to make sure that data are evenly distributed to generators and that each generator has its own unique data. Typical use case is virtual users’ login credentials.

Communication with DataServer is realised over standard HTTP requests. DataServer is usually started and populated from the Monitor. The standard way of populating the server is from CSV files located in the data folder. DataServer holds data in memory and is capable of holding thousands or even tens of thousands of records (depending on the average size of one record). Proven throughput is 500 requests per second.

Starting DataServer

DataServer is preferably started from the Monitor using the cz.etnetera.jmeter.request.RunDataServer command (part of the Java request component). After starting up, the server is available at http://localhost:8080/DataServer/data.

DataServer

Another option is to start the server manually (executed from programs/DataServer, Java 8 is required).

java -server -Xms512M -Xmx2G -jar start.jar

Data population

DataServer is populated via an HTTP GET request with the following parameters:

  • importFile - path to data file relative to test script or to SmartMeter home
  • dataName - custom name of data bank
  • random - turns on/off data randomization (default is false)
  • allowMultipleImport - turns on/off multiple import; multiple import allows populating of data bank from more than one source (default is false)
  • endless - turns on/off data reuse after reaching the end of the source (default is true)
  • isHeader - if true, the first line is considered to be a header

Preferably, use DataServerImportRequest.

Example

dataserver_import
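The same import can be performed from any HTTP client by sending the parameters above to the DataServer endpoint. A minimal sketch in Python; the endpoint is the default address from Starting DataServer, while the file path and data bank name are made-up examples:

```python
from urllib.parse import urlencode

# Default local DataServer endpoint (see "Starting DataServer").
BASE = "http://localhost:8080/DataServer/data"

# Parameter names come from the "Data population" list above;
# the file path and data bank name are example values only.
params = {
    "importFile": "data/users.csv",  # path relative to test script or SmartMeter home
    "dataName": "users",             # custom name of the data bank
    "random": "false",               # keep the original record order
    "endless": "true",               # reuse data after reaching the end of the source
    "isHeader": "true",              # first line of the CSV is a header
}

import_url = BASE + "?" + urlencode(params)
print(import_url)
```

Sending this URL with a GET request populates the data bank; from a test script, the DataServerImportRequest component does the same thing.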

Data retrieval

Data are retrieved by sending an HTTP GET request with the getData parameter.

  • getData - name of data bank (matches dataName parameter from Data population)

One record (usually one line from a CSV file) is returned for each request. It is recommended to use DataServerGetRequest or the HTTP Request component with et@sm - CSV Extractor.

http://localhost:8080/DataServer/data?getData=[DATA_NAME]

Note: In order to exclude the request to DataServer from test results, append [exclude] to the request name.
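Each request to the data endpoint returns one record (one CSV line), which the et@sm - CSV Extractor then splits into variables. A sketch of the equivalent client-side logic; the data bank name, variable names and the ";" delimiter are example values:

```python
# One DataServer record is one CSV line; split it into named variables,
# mirroring what the CSV Extractor does with the response.
def split_record(line, variables, delimiter=";"):
    values = line.rstrip("\r\n").split(delimiter)
    return dict(zip(variables, values))

# e.g. a record returned by
# http://localhost:8080/DataServer/data?getData=users
record = "john;secret123"
credentials = split_record(record, ["username", "password"])
print(credentials)  # {'username': 'john', 'password': 'secret123'}
```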

Header retrieval

The header is retrieved by sending an HTTP GET request with the getHeader parameter.

  • getHeader - name of data bank (matches dataName parameter from Data population)
http://localhost:8080/DataServer/data?getHeader=[DATA_NAME]

Adding data

Data are added by sending an HTTP GET/POST request with the following parameters:

  • addData - name of data bank
  • value - new record to be added to data bank

Data can be added even during the test, or added to a new data bank and exported later as CSV.

http://localhost:8080/DataServer/data?addData=[DATA_NAME]&value=[VALUE]

Removing data

Data are removed by sending an HTTP GET request with the clearData parameter.

  • clearData - name of data bank to be removed
http://localhost:8080/DataServer/data?clearData=[DATA_NAME]

Exporting data

A data bank is exported to a text file by sending an HTTP GET request with the exportData parameter.

  • exportData - name of data bank to be exported
http://localhost:8080/DataServer/data?exportData=[DATA_NAME]

DataServer web interface

Simple web interface with limited operations is available at http://localhost:8080/DataServer/data.

Firewall configuration

SmartMeter.io communicates on the following ports.

Open ports on Controller

  • RMI client - used by load generators to communicate with the Controller, for example for sending test results. Default: random. How to change: in the monitor script, in the Start test component, as RMI client localport (0 means random).
  • DataServer. Default: 8080. How to change: in the monitor script, in the Start DataServer component, as tcp port.

Open ports on generators

  • RMI registry. Default: 1099. How to change: in the bin/runGenerator script add -Dserver_port=PORT; when starting the load generator via SSH, in the monitor script, in the Start remote generator component, as server port (RMI registry).
  • RMI server. Default: random. How to change: in the bin/runGenerator script add server.rmi.localport=PORT; when starting the load generator via SSH, in the monitor script, in the Start remote generator component, as RMI server localport (0 means random).
  • PerfMon Server Agent - measures system resources utilization (CPU, RAM, ..). Default: 4443. Cannot be changed at this moment.

Distributed test

The following series of actions describes the standard life cycle of a distributed test after clicking the Run test button:

  • Monitor is started. It immediately executes the monitor script (default is monitor.jmx).
    • SSH connection to the remote server hosting generator(s) is established.
    • Performance monitor agent is started on the remote server.
    • Generators are started.
    • Performance monitor agent is started locally.
    • DataServer is started and populated with test data.
    • The controller with test script is started.
    • Monitor keeps on running and displays data from agents (CPU, memory, disk and I/O load).
  • Controller is started. It immediately executes the test script (default is test.jmx).
    • Controller sends the test script to generators.
    • Generators simulate virtual users and send aggregated data back to the Controller.
    • Controller displays real-time test results.
    • When the test is finished, complete data are received from generators (this can take a while, even hours).
    • Controller combines test results from generators and makes a final report.

Geographical distribution

See et@sm - Distributed Concurrency Thread Group.

Installation patterns

This section describes common installation patterns of SmartMeter.io. The SmartMeter.io installation package contains template scripts for quick deployment in distributed mode. It is recommended to create a copy of the template script and then set up the configuration (the User Defined Variables component) as needed.

Monitor script config

TIP: Using localhost as Controller.

If you prefer to run tests from your localhost, use it as the Controller.

One local generator

The simplest setup, where the controller and one generator share the same machine. This is how SmartMeter.io operates when no monitor script is given. Performance is limited by the host machine.

one-local-generator

Multiple local generators

A simple setup where the controller and generators run on the same machine. This is not optimal because the controller and generators share the same resources, thus possibly influencing each other and tainting test results. At least the Java Virtual Machines are not shared.

multiple-local-generators

One remote generator

The simplest remote setup. The controller and one generator each run on their own machine. It does not provide much power but at least separates the controller from the generator. A typical use case is a controller running on a personal computer (notebook) while the generator runs on a powerful server.

one-remote-generator

Multiple remote generators (one server)

A suitable setup for medium performance. The controller runs on its own machine while load is generated from multiple instances of SmartMeter.io running on the same server.

remote-generators-one-server

Multiple remote generators (one per server)

The optimal SmartMeter.io setup. The controller runs on its own machine while load is generated from multiple servers. Servers can be located in various geographic locations. It is possible to assign specific virtual users to particular generators.

remote-generators-multi-servers

Multiple remote generators

The optimal SmartMeter.io setup. The controller runs on its own machine while load is generated from multiple servers. One server may host multiple generator instances; how many depends on its hardware configuration.

remote-generators-multi-servers-2

Features

A list of SmartMeter.io's unique features.

Acceptance criteria (SLAs)

Since 1.3.0

For documentation see et@sm - Acceptance Criterion.

For examples see Evaluate tests automatically with acceptance criteria blog post.

System resources monitoring

It is a good practice to monitor system resources (CPU, memory, disk I/O, ..) on all affected machines (system under test, load generators, Controller) while running a performance test. Setup is a 2-step procedure:

1) Deploy the PerfMon Server Agent (the Controller and load generators already start the Server Agent by default)

2) Set up the et@sm - PerfMon Metrics Collector according to the official PerfMon documentation

System resources monitoring graphs will be automatically included in the test report.

Recording log

The recording log helps with finding sources of dynamic parameters. It shows the full detail of HTTP communication (request/response headers and bodies) from the recording time. The standard use case is to find a response which contains the dynamic parameter.

Recording log

Recording log is based on View Results Tree listener but it has a few modifications:

  • Automatically loads the recording file (recording file .jtl is expected in the same folder as the test script file .jmx)
  • Special character prefixes - ! denotes a response with body, + denotes a response with body for a static resource

Static resources

Both SmartMeter.io recorders detect static resources (images, style sheets, scripts, ..) and save them as an iterable list of URLs, not fully-fledged HTTP requests.

Static resources

This special treatment offers the following advantages:

  • clarity - only a few meaningful requests remain,
  • performance boost - reusing a single HTTP request for a bunch of static resources saves runtime memory,
  • quickly enable/disable all static resources - open the SmartMeter menu, select Test > Enable/Disable static resources

menu_test

Undo/Redo

SmartMeter.io supports undo/redo.

  • Undo - undo the last change (CTRL + z)
  • Redo - restore the last undone change (CTRL + y)

Error Results Tree

The error results tree shows failed sampler results in full detail: the assertion error plus request and response data. It is available even in distributed mode. The trick is to send only the failed samplers, so there is very little performance overhead. The error results tree opens automatically when the first error occurs.

Error Results Tree

Detached chart windows

Metrics can also be viewed in detached windows. This way you can fully customize your workspace, even on multiple monitors. Window positions are saved between test runs.

menu_windows

Smart Proxy Recorder

The Smart proxy recorder extends the capabilities of the standard HTTP test script recorder. It starts another daemon which accepts controlling commands (HTTP requests), thus allowing you to manage the whole recording process remotely. It is capable of adding new thread groups, inserting pauses between transactions, exporting the test to a JMX file and much more. The idea behind it is to allow fully automatic transformation of functional tests (for example Selenium tests) into load tests. All you need to do is enrich your functional tests with a few controlling requests. Another use case is to automatically re-record your test before running it, to always have up-to-date static resources.

A typical scenario goes like this:

  • Enrich your functional tests with controlling commands
  • Start Smart proxy recorder
  • Set up the proxy through Smart proxy recorder
  • Run the enriched functional tests and let them automatically transform into load tests. The output is a bunch of JMX test files.
  • Run the load tests with your CI tool.

smartmeter-io-SPR-numbered

  • et@sm - Smart Proxy Recorder - the SmartMeter.io component for transforming functional tests into load tests
  • Thread Groups
    1. Default - if the user doesn't create their own Thread Group, all transactions go under this component
    2. UserDefined - created by the user with the start-subtest command (see below)
  • Start/Stop Smart Proxy Recorder - start or stop the recording daemons
  • Capture realtime between transactions - automatically puts a ThinkTime component between transactions, with sleep length matching the real recorded waiting time

Bootstrap

Starting the Smart proxy recorder is very easy - just open the smart-proxy-recorder.jmx file in the "tests" folder, expand the Workbench node (1) and click the Start button (3) in the et@sm - Smart Proxy Recorder component. The recording daemon runs on port 8080 (default) and the controlling daemon on port 8085 (default). Do not confuse them! The Smart Proxy Recorder component is based on the standard JMeter HTTP Test script recorder component. See the official documentation for more details.

The next step is proxy configuration in your browser; check out this tutorial on page 4 and also this guide for setting up the certificates.

Now you can start recording! First, we recommend creating a subtest so your transactions will not end up in "Default" (2). If you have to activate some CSV file, now is the right time. Maybe add one or two Replacers too? Your transactions can be recorded directly through a browser that has the right proxy configuration. Below you can find more commands to control the Smart proxy recorder.

Since 1.1.0, Smart proxy recorder supports automatic correlations.
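Because the controlling commands documented in the next section are plain HTTP GET requests, an enriched functional test only has to build and send the right URLs. A sketch of a tiny URL-building helper; the default controlling port 8085 comes from above, the subtest and test names are invented, and sending the requests is left to any HTTP client:

```python
from urllib.parse import urlencode

# The controlling daemon listens on port 8085 by default
# (not the 8080 recording port - do not confuse them).
DRIVER = "http://localhost:8085/proxy-driver"

def command_url(command, **params):
    """Build a controlling URL for the Smart proxy recorder REST API."""
    return DRIVER + "?" + urlencode({"command": command, **params})

# Example flow inside a functional test:
start = command_url("start-subtest", name="Login", users=5, duration=20)
finish = command_url("finish-subtest")
export = command_url("export-test", testName="login-load-test")
print(start)
```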

Commands API

The Smart proxy recorder exposes a simple REST API on its own configurable port (default is 8085). Do not forget to encode illegal characters if you use any.

Status check

A simple command which always returns "OK" and does nothing else. It serves to check that the Smart Proxy Recorder is up and running.

../proxy-driver?command=status
Start subtest

Inserts a new thread group which immediately becomes a target for newly captured requests. This command can be called repeatedly.

../proxy-driver?command=start-subtest&name=[NAME]&users=[USERS]&duration=[DURATION]&rampup=[RAMPUP]
  • NAME – name of the subtest
  • USERS – number of users (integer)
  • DURATION – test duration in seconds (integer)
  • RAMPUP - ramp up time of users in seconds (integer)
Example
../proxy-driver?command=start-subtest&name=MySubset&users=5&duration=20
Finish subtest

Finishes the subtest. The target for newly captured requests is changed to the Thread Group "Default" (under the Recording Controller) and the internal state of the Recorder is reset. Inserts a pause if the capture realtime between transactions option is on.

../proxy-driver?command=finish-subtest
Change recording target

Changes the target of recording to a requested element (thread group).

../proxy-driver?command=set-target&name=[NAME]

  • NAME – name of the new target element

Datasource activation

Inserts an HTTP request (including a CSV extractor) pointing to DataServer into the active subtest. This request retrieves dynamic data from DataServer and exposes them as variables. Use in combination with replacers.

../proxy-driver?command=activate-datasource&datasource=[DATASOURCE]&delimiter=[DELIMITER]&variables=[VAR1,VAR2,..]&domain=[DOMAIN]&port=[PORT]
  • DATASOURCE – name of the data source
  • DELIMITER - delimiter used within the data source (default is ;)
  • VAR1,VAR2 – List of variable names (order matches the order of values on the one line in CSV data source). If the list is empty, variable names are retrieved from the first line (header) of data source but only if the data source was imported with parameter isHeader=true.
  • DOMAIN – domain name where the DataServer runs (localhost is the default)
  • PORT - port where the DataServer runs (8080 is the default)
CSV file activation

Inserts a CSV Data Set Config into the active subtest. This allows obtaining dynamic data from external CSV files.

../proxy-driver?command=activate-csv&filename=[FILENAME]&delimiter=[DELIMITER]&variables=[VAR1,VAR2,..]
  • FILENAME – name of the CSV file. Put the file into the "tests" folder. Read more in the official JMeter documentation.
  • DELIMITER – delimiter used within the file (optional; if not specified, ";" is used)
  • VAR1,VAR2 – variable list, similar to the first line of a CSV file which gives names to variables (optional; if not specified, the first line of the CSV file is taken)
Example
../proxy-driver?command=activate-csv&filename=MyCSV_file.csv&delimiter=,&variables=variable1,variable2
Adding replacers

Adds a replacer into memory. From now on, replacers are applied to recorded requests until they are explicitly removed or the subtest is finished (see the Finish subtest command). Replacers applied to a request stick with it forever.

../proxy-driver?command=add-replacers&replacer=[KEY1]~[PLACEHOLDER1]&replacer=[KEY2]~[PLACEHOLDER2]
  • KEY - a key whose values are dynamic and must be replaced with placeholders
  • PLACEHOLDER - name of the placeholder
Example
../proxy-driver?command=add-replacers&replacer=login~username

After adding this replacer, requests containing a key-value pair such as login=john will become login=${username}. Placeholders are expected to be filled from DataServer or a CSV file.

Removing replacers

Removes the replacer(s) from memory. If no replacer is given, all replacers are removed. From now on, the replacer(s) will not affect future transactions. Already recorded transactions will still contain their replacers, though.

../proxy-driver?command=remove-replacers&replacer=[KEY1]&replacer=[KEY2]&..
Example
../proxy-driver?command=remove-replacers&replacer=login~username

After removing this replacer, requests containing a key-value pair such as login=john will stay the same.

Add variable

Inserts a new variable (for example ${myToken}) and sets its value. The variable is defined at the start of the current thread group. It can be used to generate dynamic values (for example a random name). The value of the variable is specific to each thread and is reassigned at the beginning of each walkthrough. Multiple variables can be defined at once.

../proxy-driver?command=add-variable&variable=[NAME]~[VALUE]&variable=[NAME2]~[VALUE2]
  • NAME – variable name (do not wrap inside ${})
  • VALUE – variable value, usually a function

Example

Generate a random name (6 alphabet characters):

variable=firstname~${__RandomString(6,abcdefghijklmnoprstuvxyz)}
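The function value contains characters that are illegal in a URL and must be percent-encoded before sending. Any standard URL-encoding utility produces the expected form; in Python, for example:

```python
from urllib.parse import quote

# The raw function call used as the variable value above.
value = "${__RandomString(6,abcdefghijklmnoprstuvxyz)}"

# Percent-encode every reserved character (safe="" encodes '/' as well).
encoded = quote(value, safe="")
print("variable=firstname~" + encoded)
```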

Watch out, you have to encode the value!

variable=firstname~%24%7B__RandomString%286%2Cabcdefghijklmnoprstuvxyz%29%7D
Insert pause

Inserts a pause (ThinkTime component) at the last position in the active subtest.

../proxy-driver?command=insert-pause&duration=[DURATION]
  • DURATION – duration in ms (optional; if not specified, the component will wait 5,000 ms)
Export test

Exports the test as a fully runnable JMX file. Tests are exported to the tests folder.

../proxy-driver?command=export-test&testName=[TEST_NAME]
  • TEST_NAME – name of the exported test (optional, if not specified test will be named recorder-test-[TIMESTAMP].jmx)
Clear recording

Clears all captured requests and resets the internal state of the recorder.

../proxy-driver?command=clear-recording
Run test

Starts the recorded test or an already existing one. If the name of the test is not provided, the actual test plan is exported to a temporary file and run.

../proxy-driver?command=run-test&monitorName=[MONITOR_NAME]&testName=[TEST_NAME]&gui=[GUI]
  • MONITOR_NAME - name of the monitor script to be run in distributed mode; the script is expected to be located in the tests/monitors folder (Pro version only)
  • TEST_NAME - name of the test to run, test is expected to be located in the tests folder
  • GUI - if false, the test will run in non-GUI mode (optional; if not specified, the test runs in GUI mode)

Integrations

Integrations with 3rd party software.

Continuous Integration

SmartMeter.io fits very well into a continuous integration process. It can be operated from the command line interface. A huge benefit is support for defining acceptance criteria (or SLAs), which allows the tool to automatically decide whether a test passed or failed. There is no dependency on a particular CI tool (no plugin is required).

General tips

  • Add test script(s) to your project's VCS.
  • Use Acceptance criteria
  • Create a new job which will run the performance test.
./SmartMeter.sh runTestNonGui PATH/test.jmx
./SmartMeter.sh runDistTestNonGui PATH/monitors/monitor.jmx PATH/test.jmx

Useful properties:

-Jetn_reports_path // path where to save the report
-Jetn_report_name  // name of the report
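In a CI job these commands are typically assembled by a small wrapper. A sketch in Python; the script location, test path and report name are placeholders, and passing the -J properties after the test path is an assumption about argument order:

```python
import subprocess  # used to launch SmartMeter.io from the CI job

# Placeholder paths and report name - adjust to your project layout.
cmd = [
    "./SmartMeter.sh",
    "runTestNonGui", "myproject/test.jmx",
    "-Jetn_reports_path=reports",     # where to save the report
    "-Jetn_report_name=nightly-run",  # name of the report
]

# In the actual pipeline the job would run the command, e.g.:
# subprocess.run(cmd, check=True)
print(" ".join(cmd))
```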

Jenkins

Example of integration with Jenkins.

Run performance test

Jenkins Start Test

Archive the report with the HTML Publisher Plugin

Jenkins Publish report

View test report

Jenkins View report

Bamboo

Example of integration with Bamboo.

Run performance test

Bamboo Run Test

Add SmartMeter executable

Bamboo Add Executable

Archive report

Bamboo Add Artifact

View test report

Bamboo View Report

Maven plugin

See https://github.com/etnetera/smartmeter-maven-plugin

Vaadin

Please read this article on our blog.

Regular expression explanation

\\"(\d+)\\":\{[^}]*\\"id\\":\\"LABEL.domestic.initial.priorityPayment\\"

The regex matches in the context of escaped JSON, hence all the backslashes. Let's hide them for a while.

"(\d+)":\{[^}]*"id":"LABEL.domestic.initial.priorityPayment"

It looks simpler now. The regex matches a number in double quotes, followed by :{ and then a sequence of any characters except }, followed by the "id" attribute with the given value.
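The behaviour can be checked against a small sample of escaped JSON; the payload below is invented for illustration, only the regex comes from above:

```python
import re

# The original regex: the doubled backslashes match literal backslashes
# in the escaped-JSON context.
pattern = r'\\"(\d+)\\":\{[^}]*\\"id\\":\\"LABEL.domestic.initial.priorityPayment\\"'

# Invented sample of escaped JSON as it might appear in a response body.
sample = r'{\"42\":{\"type\":\"button\",\"id\":\"LABEL.domestic.initial.priorityPayment\"}}'

match = re.search(pattern, sample)
print(match.group(1))  # 42
```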

Correlations overview

Each correlation is listed with its name, description and where it occurs:

  • Security key (CSRF token) - Unique for each user session. The value is stable. Each portlet has its own security key. Occurs in: request body.

  • uiId - Starts from 1 and is incremented every time the whole UI is freshly rendered (page refresh, visiting the same page again or opening the page in a new tab). Occurs in: query string.

  • Sync ID - Starts from 1 and is incremented with each new response from the server. Resets after refresh. Occurs in: request body.

  • Connector ID - Each UI component has its own unstable connector id. This connector id must be parsed from a previous response, otherwise the test script might break at any time. Unfortunately, there is no easy way to parse it without a unique identifier for each component. Usually this is not the case, and it is necessary to rely on secondary identifiers such as captions or CSS classes in combination with the order of occurrence. Occurs in: request body (multiple occurrences).

  • Client ID - Since Vaadin 8. Starts from 1 and is incremented with each new response from the server. Resets after refresh. Occurs in: request body.

APM Dynatrace

To quickly find the source of an error in a request, make a detailed analysis of test results or compare the results of each run, an integration with the Compuware dynaTrace system can be used.

See et@sm - DynaTrace Header to learn how to integrate.

Finding request details

Every request sent to a server where a dynaTrace agent is deployed returns the X-dynaTrace header among its response headers. Using the value of this header, a PurePath can be searched, which displays a detailed tree structure showing the path of the request throughout the system.

A detailed procedure is shown in the following figure.

smartmeter-request-details

Viewing test details

Integration with the dynaTrace system lets you search and merge requests according to test name, user group, user id or request type. For this function, the "Tagged Web Requests" dashlet of the dynaTrace system can be used.

Detailed information about the propagation of the test information is shown in the following figure.

smartmeter-test-details

JMeter plugins

SmartMeter.io includes some JMeter plugins by default.

  • Plugins Manager
  • Custom Thread Groups
  • Custom JMeter Functions
  • Command-Line Graph Plotting Tool
  • 3 Basic Graphs - Average Response Time, Active Threads, Successful/Failed Transactions
  • 5 Additional Graphs - Response Codes, Bytes Throughput, Connect Times, Latency, Hits/s
  • Distribution/Percentile Graphs
  • Selenium/WebDriver Support
  • PerfMon (Servers Performance Monitoring)
  • Synthesis Report
  • Filter Results Tool
  • JSON Plugins

Miscellaneous

This chapter reviews SmartMeter.io in more detail. Some more advanced functionalities are covered here.

CLI

SmartMeter.io can be started from the command line. This is particularly useful for including performance tests in your continuous integration process, or for timed autorun tests.

NOTE: SmartMeter.sh (.bat and .command) and the other scripts in the bin folder contain the cd directive. This can cause trouble if path-sensitive commands are called after those scripts (for example as a part of your CI/CD pipeline).

To execute a command on Linux/OS X

./SmartMeter.sh COMMAND [ARGUMENTS]

To execute a command on Windows

SmartMeter.bat COMMAND [ARGUMENTS]
  • runTest [test path] - Starts SmartMeter.io and runs a test in GUI.
    • [test path] - path to test script with base in tests path (by default the tests folder)
  • runDistTest [monitor path] [test path] - Starts SmartMeter.io and runs a distributed test in GUI.
    • [monitor path] - path to monitor script with base in monitors path (by default the tests/monitors folder)
    • [test path] - path to test script with base in tests path (by default the tests folder)
  • runTestNonGui [test path] - Runs a test in Non-Gui mode. Test progress is printed on standard output. Make sure that the et@sm - Controller Summary Report is included in your test plan! Otherwise you will not be able to get the report.
    • [test path] - path to test script with base in tests path (by default the tests folder)
  • runDistTestNonGui [monitor path] [test path] - Runs a distributed test in Non-GUI mode. Test progress is printed on standard output. Make sure that the et@sm - Controller Summary Report is included in your test plan! Otherwise you will not be able to get the report.​ 
    • [monitor path] - path to monitor script with base in monitors path (by default the tests/monitors folder)
    • [test path] - path to test script with base in tests path (by default the tests folder)
  • runGenerator [gid] - Starts a SmartMeter.io generator. The RMI server hostname is auto-detected unless specified explicitly by adding the -Djava.rmi.server.hostname parameter. The auto-detection mechanism is not 100 % reliable, depending on the underlying OS.
    • [gid] - generator ID, useful for more instances on the same server to distinguish log files
  • runLocalGenerator [gid] - Starts SmartMeter.io local generator. RMI server hostname is set to 127.0.0.1.
    • [gid] - generator ID, useful for more instances on the same server to distinguish log files
  • runProxyRecorder [template file] - Starts Smart Proxy Recorder with a custom recording template.
    • [template file] - Either a relative path from the tests folder or an absolute path to a recording template file (.jmx).
  • runEditor [test path] - Starts SmartMeter.io editor and opens the test.
    • [test path] - path to test script with base in tests path (by default the tests folder)

Test and monitor path resolution

  • Absolute paths are passed on unchanged.
  • Relative paths are resolved from the given base.
  • If resolution from the base fails (the file does not exist), the path is resolved from the SmartMeter root folder. This is convenient for running tests from the CLI because it allows you to use path auto-completion.
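The resolution order above can be sketched as a small shell function. This is an illustration only, not SmartMeter's actual code, and the folder names used in it are hypothetical:

```shell
#!/bin/sh
# Sketch of the resolution order: absolute paths pass unchanged,
# relative paths are tried against the base folder first,
# then fall back to the SmartMeter root folder.
resolve() {
  path="$1"; base="$2"; root="$3"
  case "$path" in
    /*) echo "$path" ;;                 # absolute path: passed on unchanged
    *)
      if [ -f "$base/$path" ]; then
        echo "$base/$path"              # resolved from the given base
      else
        echo "$root/$path"              # fallback: SmartMeter root folder
      fi
      ;;
  esac
}
```

So `runTest test.jmx` finds tests/test.jmx via the base, while `runTest tests/test.jmx` typed with auto-completion from the root folder still works via the fallback.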

Reports

Since 1.4.0

Use the generateReport.sh (.bat) script in the bin folder to generate a report from a log file.

  • [log file/folder] - relative path from SmartMeter root folder or an absolute path to a log file (.jtl) or a folder with log files (all log files will be processed in one batch)
  • [test script file] - optional, relative path from SmartMeter root folder or an absolute path to a test script file (.jmx); if present SmartMeter.io will evaluate acceptance criteria and add results to trend analysis

Example

bin/generateReport.sh results/20170411-102018-report.jtl tests/test.jmx

Extra Parameters

Since 1.3.0

Besides the mandatory parameters, all commands also accept JMeter properties and SmartMeter properties. Use the standard -J prefix. For example, to run a test and save the report to a custom folder:

./SmartMeter.sh runTestNonGui my-test.jmx -Jetn_reports_path=/opt/perf/reports
Parameters overview

  • etn_create_report=true (alias --report) - create a report after the test; default: TRUE in Non-GUI, FALSE in GUI
  • etn_create_report=false (alias --no-report) - do not create a report after the test; default: TRUE in Non-GUI, FALSE in GUI
  • etn_result_file_name=STRING - name of the result (log) file, pattern [etn_result_file_name]-report; default: [TIMESTAMP]-report
  • etn_report_name=STRING - name of the report, pattern report-[etn_report_name]-[TEST-SCRIPT-NAME]; default: report-[TIMESTAMP]-[TEST-SCRIPT-NAME]

Update to newer version

SmartMeter automatically checks for updates when the Welcome screen starts.

Auto update

Click OK and wait until the download completes. SmartMeter will restart itself and start in the update mode.

Update mode

Click Update and wait until the update completes. Start SmartMeter again.

NOTE: Updating SmartMeter to newer version will not affect your data (test scripts, results, reports, ..).

Tips & Tricks

  • In case you want to keep the older version, just download and unpack the new version and copy your licence file.
  • In case you want to update offline, download the update, place it into SmartMeter root folder and start SmartMeter.
  • In case you want to update in Non-GUI mode, run:

java -jar [UPDATE-FILE-NAME].jar

Licence installation without GUI

To install the licence file in a headless environment, just copy it to the SmartMeter custom folder. The licence file name must be licence.bin. The licence is required on the Controller only; there is no need to install it on load generators.
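A minimal sketch of a headless install, run from the SmartMeter home folder. The source path $HOME/licence.bin is an assumption standing in for wherever you saved the downloaded file; adjust it to your environment:

```shell
#!/bin/sh
# Headless licence install: the target name must be exactly licence.bin.
LICENCE_SRC="$HOME/licence.bin"   # assumed download location - adjust as needed
mkdir -p custom
if [ -f "$LICENCE_SRC" ]; then
  cp "$LICENCE_SRC" custom/licence.bin
  echo "licence installed"
else
  echo "licence file not found at $LICENCE_SRC"
fi
```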

Licence upgrade

To upgrade your licence, click the Upgrade button on the Welcome screen and select a new licence file.

Licence upgrade

Licence info

To find out details about your licence open SmartMeter.io Editor and click Licence info under SmartMeter menu.

Licence info

Configurator

SmartMeter.io configuration is saved to custom/smartmeter.properties file. Either edit the file directly or use Configurator, a graphical interface. To open Configurator, click Config icon on Welcome screen.

Configurator

Global configuration

Global configuration enables sharing a single configuration among many installations of SmartMeter. SmartMeter can be configured to use shared tests, data, libraries and custom properties. The advantage is easy maintenance and version updating. Local configuration (custom/smartmeter.properties) has precedence over global configuration.

SmartMeter searches for global configuration in the following places:

../smartmeter.properties

USER.HOME/.smartmeter

/etc/smartmeter
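For example, a shared smartmeter.properties placed in one of the locations above might look like this (the paths are assumptions for illustration; the keys are described in the table below):

```properties
# shared global configuration for all SmartMeter installations
etn_tests_path=/srv/smartmeter/tests
etn_reports_path=/srv/smartmeter/reports
search_paths=/srv/smartmeter/libs
```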

Folder layout configuration

  • etn_licence_file - path to the licence file; default: custom/licence.bin
  • etn_tests_path - path to the tests folder; default: tests
  • etn_results_path - path to the results folder; default: results
  • etn_reports_path - path to the reports folder; default: reports
  • search_paths - paths to libraries (added to the classpath); default: custom;custom/libs

Files organization

The SmartMeter folder contains the following structure of folders and files.

  • backup - backups of previous versions in case of update
  • bin - auxiliary executable scripts
  • custom
    • libs - default place for extra libraries (.jar)
    • licence.bin - licence file
    • smartmeter.properties - configuration file
  • doc - online and offline documentation 
  • logs - application logs
  • programs - 3rd party programs
  • reports - default location for test reports
  • results - default location for test results (log files)
  • tests - default location for test scripts including some examples
    • monitors - monitor scripts 
  • utils - temporary auxiliary files

Recovery from crashed test

It might happen that a running test crashes, due to running out of memory, an accidental restart, an application error and so on. In such a scenario, there is still a chance to recover the test results. Follow these steps:

  1. We recommend backing up the temporary result files. By default they are located in the system's temp folder and their names start with SmartMeterTemporaryResults. We also recommend backing up the log files located in the logs folder for later analysis.
  2. Kill all running instances of SmartMeter.io on Controller and load generators.
  3. Select the result-collector.jmx test script, do not change the monitor script.
  4. Start the test and wait until the log collection completes. If something goes wrong, recover the temporary results and try again.

Shutdown hook

The shutdown hook executes an arbitrary script after SmartMeter.io exits. Typical use cases are moving logs to another disk, starting another test and so on.

Shutdown hook is configured in Configurator (tab test), namely by the following properties:

  • Shutdown after test (etn_shutdown_after_test) - Turns on/off automatic shutdown of SmartMeter.io after test (including reports).
  • Shutdown hook script (etn_shutdown_hook) - This command (script) will be executed after completion of the test when SmartMeter.io exits.
  • Shutdown hook workdir (etn_shutdown_hook_dir) - Working directory for etn_shutdown_hook.

Example:

etn_shutdown_after_test = true
etn_shutdown_hook = sh my-hook.sh
etn_shutdown_hook_dir = 
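A hypothetical my-hook.sh for the configuration above could, for instance, archive the results folder to another disk. The SM_ARCHIVE_DIR variable and the ./archive fallback are assumptions for illustration, not SmartMeter conventions:

```shell
#!/bin/sh
# my-hook.sh - executed after SmartMeter.io exits (etn_shutdown_hook).
# SM_ARCHIVE_DIR is a hypothetical variable; point it at another disk,
# e.g. /mnt/archive. Falls back to ./archive for this illustration.
DEST="${SM_ARCHIVE_DIR:-./archive}"
mkdir -p "$DEST"
cp -r results "$DEST"/ 2>/dev/null || echo "nothing to archive"
echo "hook finished"
```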

Kill & softkill files

Kill files terminate SmartMeter.io from the outside and are therefore useful in Non-GUI mode or with CI tools (for example Jenkins). Usage is simple: just create an empty file with the proper name in the utils folder.

  • The kill file immediately terminates SmartMeter.io. It is useful for terminating Smart Proxy Recorder from CI. It is not recommended while a test is running unless you do not care about the results.
  • The softkill file first stops the test and then terminates SmartMeter.io. It is useful for interrupting a test in Non-GUI mode or from a CI tool. Report generation is skipped but can be done afterwards.
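For example, to stop a running Non-GUI test gracefully from a CI job, run the following from the SmartMeter home folder:

```shell
#!/bin/sh
# An empty file named "softkill" in the utils folder stops the test
# and then terminates SmartMeter.io; use "kill" instead for a hard stop.
mkdir -p utils        # utils sits in the SmartMeter home folder
touch utils/softkill
```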

Components

SmartMeter.io includes original JMeter components documentation, extra components from JMeter plugins, and its own components prefixed with et@sm (etnetera@smartmeter).

Thread Groups

et@sm - Distributed Concurrency Thread Group

This thread group is based on bzm - Concurrency Thread Group and adds support for automatic distribution of virtual users among load generators. For example, if it is set to 5 000 VUs and there are 5 load generators, each generator will run 1 000 VUs by default. Another feature is the "Run only on generators" property, which defines the subset of generators that will run this thread group. This is useful for simulating load from various geographical locations or for testing from both the cloud and your own infrastructure at the same time.

Test will not end until all virtual users finish their journey.

Lazy Conc Group  

  • Run only on generators - In some scenarios it is necessary to have more control over load distribution. For example to distribute the load unequally from different locations.

Example

  • Need to simulate 1000 VUs
  • 3 generators available - 2 located somewhere in Cloud (10.241.102.30 and 10.241.102.31) and 1 located in LAN of the tested system (161.12.37.58).
  • Simulate 10 % of the load from LAN and 90 % from the Cloud

Solution

Use 2 Lazy Concurrency Thread Groups (let's name them Cloud Users and LAN Users).

Set the Cloud Users Run only on generators to 10.241.102.30, 10.241.102.31 and number of VUs to 900.

Run only on generators

Accordingly, set the LAN Users Run only on generators to 161.12.37.58 and number of VUs to 100.

et@sm - Distributed Lazy Stepping Thread Group

This thread group is very similar to et@sm - Distributed Concurrency Thread Group; one key difference is that virtual users do not finish their entire journey once the test ends.

Logic controllers

et@sm - Static Resources Transaction Controller

Placeholder for static resources. Enables quick enable/disable of all static resources.

Timers

et@sm - ThinkTime

Unlike standard JMeter timers, ThinkTime stops the thread at the precise position in the test plan tree where it is placed. This makes it possible to place it between transaction controllers, where it is visible at first glance.

It also allows randomization of the waiting period (uniformly distributed) by setting min and max boundaries in percent.

Note: ThinkTime is based on a sampler, not timer, therefore it cannot be placed inside another sampler!

Pre Processors

et@sm - DynaTrace Header

Integration with Dynatrace APM - this pre-processor adds the "x-dynaTrace" header to each HTTP request in scope.

Header consists of:

  • TE - Test Name is the name of the entire load test. It uniquely identifies a test run.
  • SN - Script Name. This groups a set of requests that make up a multi-step transaction, for example making an online purchase.
  • NA - Transaction Name. Metrics for requests with the same name will be aggregated (required).
  • VU - The unique number of the Virtual User that sends the request.
  • ID - The unique request ID (serial number). This string should be unique for one web request or a set of web requests that together make up a step/transaction execution.

Example
x-dynaTrace: VU=1-1;ID=1-1.48780505499371;TE=test.jmx_1464006951111;SN=VirtualUsers;NA=www.smartmeter.io/

Post Processors

et@sm - Boundary Body Extractor

Designed for easy and fast extraction of dynamic values which can be found using the left and right boundaries (e.g. token="DYNAMIC-VALUE").

Boundary Body Extractor

  • Extract from - Extract from HTTP response body or headers.
  • Occurrence - Look for the first, last or random occurrence.
  • Reference Name - Name of a variable which will hold the extracted value.
  • Left Boundary - Left boundary of the extracted value. Avoid using special characters such as end of line (\n).
  • Right Boundary - Right boundary of the extracted value. Avoid using special characters such as end of line (\n).
  • Search Limit - The number of the occurrence to extract. If the value is greater than 1, all previous occurrences are stored in the variables REFERENCE_NAME_1..x (for example token_1, token_2 and so on). REFERENCE_NAME itself holds the value according to the "Occurrence" setting (First/Last/Random).
  • Default Value (f! = Fail if not found) - Default value used if there is no extracted value. If set to "f!" and no value is extracted, the parent sampler is labeled as erroneous.
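As an illustration of the idea (not SmartMeter's implementation), extracting a value enclosed by the left boundary token=" and the right boundary " can be sketched with standard shell tools:

```shell
#!/bin/sh
# Given a response body containing token="DYNAMIC-VALUE", keep the text
# between the left boundary (token=") and the right boundary (").
BODY='<a href="/login"> token="abc123" </a>'
TOKEN=$(echo "$BODY" | sed -n 's/.*token="\([^"]*\)".*/\1/p')
echo "$TOKEN"    # prints abc123
```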

et@sm - CSV Extractor

Used for quickly obtaining multiple values from a row with a uniform separator. Ideal, for instance, for saving values from DataServer.

et@sm - CSV Extractor

  • Variables [example1; Example2] - names of variables for extracted values from CSV record.
  • Delimiter [;] - separator (the default value is a semicolon)

et@sm - Header Value Extractor

Extracts header values by header names.

HeaderValueExtractor

  • Variable - name of the JMeter variable which will hold the extracted value
  • Header name - header name

et@sm - Links Extractor

Extracts links from HTML and saves them to indexed variables.

Links extractor

Assertions

et@sm - Better Response Assertion

This element, an extension of the standard Response Assertion, verifies whether a returned message contains an expected string. It adds the Range Text Response and Fast Text Response functions. In both cases, a Contains pattern is always used - i.e. it checks whether the response contains the demanded string. The main benefit of these functions is the speed at which they verify the existence of the searched text. The algorithm used is much faster than the default options, so it does not overload the generators - neither their processors nor their memory. Other functions are unchanged and work in the same way as in the Response Assertion component.

Range Text Response searches only the end of the response. The size of the searched range can be set with the etn_range_assert_limit parameter in the Configurator. The default value is 72 bytes.

et@sm - Better Response Assertion

  • Response Field to Test - the search algorithm used
  • Patterns to Test - the text to be searched for. For the Range Text Response and Fast Text Response functions, only the first string is ever searched for.

Config Elements

et@sm - Acceptance Criterion

Acceptance criteria enable auto-evaluation of test results. The evaluation is not real-time; it is performed at the end of the test. If any of the acceptance criteria fail, the whole test run is considered a failure and exit code 2 is returned. Most continuous integration tools will interpret this as a build failure.

The location of a criterion component in the test plan does not matter.
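In a CI pipeline, the exit code can then be checked after a Non-GUI run. A minimal sketch, where run_test is a placeholder standing in for the real call (./SmartMeter.sh runTestNonGui my-test.jmx):

```shell
#!/bin/sh
# Placeholder for the real call: ./SmartMeter.sh runTestNonGui my-test.jmx
run_test() { return 2; }    # here it simulates failed acceptance criteria

status=0
run_test || status=$?
if [ "$status" -eq 2 ]; then
  echo "acceptance criteria failed - marking build as failed"
elif [ "$status" -ne 0 ]; then
  echo "test run error (exit code $status)"
else
  echo "test passed"
fi
```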

Acceptance Criterion

Source

The source of the data to be evaluated. Imagine it as a sequence of samples, each with a timestamp and a metric value. For example, for response times:

[1477310129 - 1788 ms; 1477310987 - 1566 ms; 1477311229 - 1669 ms; ...

  • Source
    • Samplers - Evaluate upon each sampler result, transactions are ignored.
    • Transactions - Evaluate upon each transaction, samples are ignored.
    • System resources - Evaluate upon et@sm - PerfMon Metrics Collector samples.
  • Metric
    • ​Samplers & Transactions
      • Errors - Failed sample is represented as 1, passed sample is represented as 0
      • Response time - Response time in milliseconds
      • Hits - No value
    • System resources
  • Listener
    • For System resources, one of et@sm - Synchronized PerfMon Metrics Collector located in the test script
Filter

Samples not accepted by the filter are excluded.

  • All - Filter is off. All samples are accepted.
  • Equals - Sample name must be equal to the given value.
  • Matches (regex) - Sample name must match the given regular expression.
  • Not matches (regex) - Sample name must not match the given regular expression.
Aggregation

Samples are first put into groups (by name, by timestamp or by both). Then the aggregation function is applied to each group.

  • Group samples by label? - If checked, samples are grouped according to their names.
  • Group samples by time frames (in seconds) - If set, samples are grouped into slices of the given size according to their timestamps. If not set, all samples are put into one slice (from test start to test end). Slicing makes sure that even short-term deviations are not missed. The last incomplete slice is discarded.

Slicing

  • Aggregation function
    • AVG - Average of all sample values from the given group
    • PERCENTAGE - Average * 100
    • SUM - Sum of all sample values from the given group
    • TOTAL - Number of samples in the given group
    • MEDIAN - Median of all sample values from the given group
    • PERCENTILE(index) - Percentile of [INDEX] of all sample values from the given group.
Comparison

Compares the result of the aggregation (left operand) with the given threshold (right operand). The result is either FAIL or PASS.

  • Fail if empty? - Some aggregation functions may yield empty results (for example the average of an empty group). If not checked, empty groups are evaluated as PASS.

et@sm - Firefox Driver Config

To use a Selenium test (in the Monitor script) and to measure the actual load time of a page in a separate browser window, the driver configuration must be defined. This component extends jp@gc - Firefox Driver Config with evaluation of JavaScript errors in the request. In order to evaluate the errors, the et@sm - Web Driver Sampler should be used for Selenium script calls.

et@sm - Chrome Driver Config

To use a Selenium test (in the Monitor script) and to measure the actual load time of a page in a separate browser window, the driver configuration must be defined. This component extends jp@gc - Chrome Driver Config with evaluation of JavaScript errors in the request. In order to evaluate the errors, the et@sm - Web Driver Sampler should be used for Selenium script calls.

et@sm - Selendroid Driver Config

This component allows running a Selenium test on Android smartphones.

et@sm - SmartMeter Hook

Since 1.3.0

This is a non-functional component which helps to easily hook into the test life cycle. Right now, it is only useful in distributed mode when the controller and generators do not pair automatically.

Samplers

et@sm - HTTP Request (deprecated)

Deprecated since 1.2.0, use standard HTTP request.

This component extends the default HTTP Request. When using the Recorder, this component is used as the default for HTTP requests.

http_request

  1. Implementation - the default implementations do not allow defining multiple download speeds in a single test. When the etn_httpsampler=SmartMeterHttpClient4 parameter is set in the Configurator, a SmartMeter implementation of the HTTP client will be used (based on Apache HttpClient 4). The component's appearance will be extended by the "Connection speed [kbps]" field.
  2. Connection speed [kbps] - when the SmartMeterHttpClient4 implementation is set (see the previous point), an input field for the speed at which responses are downloaded from the server is displayed. This makes it possible to simulate several download speeds during a single test. The value is entered in kbps.

et@sm - Web Driver Sampler

This sampler extends the jp@gc - Web Driver Sampler component with JavaScript error evaluation. This function is supported in the Firefox browser only, whose driver must be set using the et@sm - Firefox Driver Config component. If JavaScript fails during a request, this information is added to the response report and recorded in the Monitor log, and the request is labeled as failed.

et@sm - AMF Request

AMF is a protocol used by Adobe Flash and Flex. This component allows viewing and even editing of AMF requests. While the test runs, the AMF request is translated into binary form, as required by the AMF protocol. AMF requests are captured automatically while recording a web page built with Flash or Flex technology; request bodies are automatically converted from binary to text form.

Java Requests - cz.etnetera.jmeter.request.*

Java requests are specialized components serving various purposes. Usually, a request accepts some parameters, performs its action and returns the result. Add requests like any other component - select Sampler / JavaRequest and choose the proper implementation.

DataServerImportRequest

This request imports data to DataServer. It is a convenient replacement for a hand-made HTTP GET request. See Data population for a description of the arguments.

DataServerGetRequest

This request retrieves and parses data from DataServer. It is a convenient replacement for a hand-made HTTP GET request and a CSV extractor. See Data retrieval for a description of the arguments.

Note: There is one extra parameter variable names from DataServer [true/false]. If set to true, variable names will be obtained from the first line of the data source (CSV header).

Non-test elements

Components that can be added to "WorkBench" but are not directly related to a test execution.

et@sm - Recording log

See Recording log feature.

et@sm - Report Generator

Generates a test report from a log file (*.jtl).

Report generator

  • Test results - log file in XML format, by default located in SMARTMETER_PATH/results folder
  • Test script - optional, source of additional information for test report, especially a source of acceptance criteria definition
  • Time filter - select only a subinterval of test run
    • Example 1 - offset start 600, offset end 3600 - select the subinterval from 10th to 60th minute
    • Example 2 - offset start 600, offset end -600 - select the subinterval from 10th minute to test end minus 10 minutes

et@sm - Analyzer Summary Report

This component enables a detailed analysis of an already finished test. It reads the test log and fills all open graph and statistics windows with its data. This makes it possible to adjust the displayed information to find details about the behavior of the tested system.

analyzer

  1. Filename - input field for the path to the test log (.jtl).
  2. Summary Report - table filled with values from the log. At the same time, every open window with graphs and statistics is filled.
  3. Loading result - an information window that appears after all the data from the log is loaded.

et@sm - Smart Proxy Recorder

See Smart Proxy Recorder feature.

Listener

et@sm - PerfMon Metrics Collector

This listener monitors system resources (CPU, RAM, I/O, disk usage, ...). If configured, it may also monitor system resources on generators, the tested system or basically any other environment. The monitored system must have a PerfMon agent installed and running. This listener must always be present in the Monitor script (Pro version only).

perfmon

  1. Name - if the output is inserted into the generated report, the graph will be named according to the value of this input field.
  2. Open in separate Window - a button which opens the graph in its own new window. The position and size of the window are saved, and windows opened during the last test will be re-opened after the next test.

et@sm - Controller Summary Report

This listener shows aggregated results and must always be present in each test script. In GUI mode, it is added automatically; however, when running in Non-GUI mode, make sure you have added it to your test plan.

Controller Summary Report

et@sm - Controller [...]

SmartMeter.io provides a further collection of listeners whose names begin with the et@sm - Controller prefix. These are extended listeners that do not use distributed generators or communication between the generators and the Controller during the test.

Known Issues

Unidentified developer on OS X 

The SmartMeter.io package for OS X is not yet signed with an Apple developer certificate (we are working on that). Until then, it is treated as an application from an "unidentified developer". Please read Open an app from an unidentified developer.

You need to add permission for SmartMeter.command and for programs/Recorder-darwin-x64/Recorder. Press the Control key, then click the app icon, choose Open from the shortcut menu and click Open.

OS X add permission

Changelog

SNAPSHOT

See changelog for nightly builds.

STABLE

v1.4.0 (2017-04-25)

  • [GENERAL]
    • FIX: Fix SmartMeter.sh script to resolve path if symlinked
    • UPDATE: Update Java from 8u102 to 8u121
    • UPDATE: Update Chromium (Windows only) from 54 to 57
    • CHANGE: Remove versions file
    • NEW: Add JMeter JSON plugins plugin
  • [CORE]
  • [CHROME-SCRIPT-RECORDER]
    • FIX: Fix concurrency issue
  • [ELECTRON-SCRIPT-RECORDER]
    • FIX: Fix converting to static resources iterator with '?' in transaction name
    • CHANGE: Preserve symlinks in Electron Script Recorder build to decrease build size
    • CHANGE: Add .ttf to list of automatically detected static resources
    • NEW: Automatic correlations for Vaadin framework
    • UPDATE: Update Recorder to 1.1.4
      • FIX: Several bug fixes in requests recording (especially in redirects)
      • NEW: You can switch between automatic / manual transactions creation in the Recorder settings dialogue
      • NEW: You can choose to record all requests in the Recorder settings dialogue (default option now)
      • NEW: Recorder on OSX is correctly signed by developer certificate
  • [WELCOME SCREEN]
    • NEW: Add file dialog for selecting licence file
    • NEW: Automatic download of new licence (if available)
    • NEW: GUI improvements
  • [CONFIGURATOR]

v1.3.0 (2016-12-14)

  • [GENERAL]
    • UPDATE: JMeter updated to 3.1 from 3.0
    • NEW: Add global configuration properties
    • NEW: Add auto-updates
    • CHANGE: Rename build package and folder to 'SmartMeter_VERSION_platform'
    • CHANGE: Update run scripts to always 'cd' into SmartMeter Home folder first
    • CHANGE: Change default thread group to et@sm - DistributedConcurrencyThreadGroup
    • CHANGE: Remove 'data' folder, it is recommended to keep test data together with test scripts
  • [CORE]
    • FIX: Fix disabled button after repeated test run from Editor
    • NEW: Add acceptance criteria definition and evaluation
    • NEW: Start Server Agent for local tests
    • NEW: Print test configuration (active thread groups settings) to report
    • NEW: Add links to documentation for SmartMeter (et@sm) components
    • NEW: Improve DataServerImportRequest to search for file relatively from test script
    • NEW: Add result-collector.jmx to easily collect results from crashed tests
    • NEW: Stop the test once the old log files are collected
    • CHANGE: Change HTTP Cookie Manager from strict to standard (as recommended since JMeter 3.0)
    • CHANGE: Generate report after test in Non-GUI
    • CHANGE: Support et@sm - Synchronized PerfMon Collector in non-distributed tests
    • CHANGE: Append script name to report folder name
    • CHANGE: Add default metrics on Runner tab (for the first launch)
    • CHANGE: Hide stacktrace output after test run in Non GUI mode
    • CHANGE: Remove example Java requests for Oracle
  • [SMART PROXY RECORDER]
    • FIX: Fix missing 'modifiers_list' property to smart-proxy-recorder.jmx
    • NEW: Add statically defined modifiers for Smart Proxy Recorder
  • [WELCOME SCREEN]
    • FIX: Fix FileNotFoundException if custom/smartmeter.properties does not exist
    • CHANGE: Redesign test selectors to use standard file system dialog

v1.2.0 (2016-08-16)

  • [GENERAL]
    • UPDATE: JMeter updated to 3.0 from 2.13​
    • UPDATE: Java updated from 8u66 to 8u102
    • UPDATE: Chromium updated from 48.0.2564.8 to 54.0.2824.2
    • CHANGE: generator JVM's NewSize set to 64m and MaxNewSize set to 128m
    • CHANGE: remove Selendroid-Standalone from build
    • NEW: added brand New Recorder which is based on Electron by GitHub
    • NEW: pre-installed JMeter Plugins Manager
    • NEW: custom icons for SmartMeter applications
  • [CORE]
    • FIX: ConcurrentModificationException while adding new listeners during the test run
    • FIX: ConcurrentModificationException in ControllerConcurrencyOverTimeGui
    • FIX: path to recording log file (.jtl) is relative to test file (.jmx) now
    • CHANGE: removed original SmartMeter test timer
    • CHANGE: unset httpclient4.idletimeout=5000 (keep-alive)
    • CHANGE: removed Concurrency related graphs from report's default setting
    • CHANGE: deprecated et@sm - HttpRequest
    • NEW: Recorder tries to run "chromium", "chromium-browser" or "chrome" (Linux only)
    • NEW: free space info sent during log collection
    • NEW: queue size info (how many sample batches are queued on generator) sent during test
    • NEW: added details of each error in reports (data/errors)
    • NEW: added "Errors overview" table report
    • NEW: custom/libs added to classpath for external libraries
  • [SMARTMETER PROFI]
    • FIX: troublesome empty spaces in path to home folder and in names of test scripts
    • CHANGE: generator exits after test if run on localhost (127.0.0.1)
    • NEW: automatic forwarding of predefined parameters to Controller and generators
  • [RECORDER]
    • FIX: requests served from Chrome's on-disk cache are excluded now
  • [WELCOME SCREEN]
    • FIX: follow symlinks while loading test scripts

v1.1.1 (2016-05-23)

  • [GENERAL]
    • CHANGE: updated bin and monitor scripts
    • CHANGE: renamed top level folder to match the archive name
  • [CORE]
    • FIX: results data are pruned for each successful request (not only for successful transactions)
    • FIX: fixed occasional AWT deadlock in ShowInfoWindow
    • FIX: UNDO/REDO function
    • FIX: Response Times Percentiles graphs X axis description (after changing properties)
    • CHANGE: added et@sm - DynaTrace Header preprocessor, removed etn_add_dynatrace_header parameter
    • CHANGE: set httpclient4.idletimeout=5000 (keep-alive)
    • NEW: added FIRST/LAST/RANDOM options for et@sm - Boundary Body Extractor
  • [SMARTMETER PROFI]
    • NEW: added PerfMon graphs configuration
    • NEW: better support for load distribution (run thread groups on selected generators only)
  • [CONFIGURATOR]
    • NEW: added support for defining proxy (etn_proxy_host and etn_proxy_port)

v1.1.0 (2015-12-21)

Read blog post about v1.1.0 update.

  • [SMARTMETER.IO PRO]
    • FIX: recounting users for specific generators
    • FIX: stopping the test does not corrupt generators (error message "Engine is busy")
    • FIX: BAT scripts are now executable from any folder
    • CHANGE: runGenerator script (without any further parameters) starts generator with auto-detected RMI server hostname
    • CHANGE: Monitor does not automatically close after test (configurable, for better integration with CI tools)
    • NEW: new command line parameters runDistTest and runDistTestNonGui for running a distributed test
    • NEW: running tests without monitor script (as SmartMeter.io Light does)
    • NEW: running distributed test from Smart Proxy Recorder (see RunTest command)
    • NEW: added runLocalGenerator script (RMI server hostname set to 127.0.0.1)
    • NEW: added ComboBox for selecting monitor and test script separately
    • NEW: added DataServerImportRequest and DataServerGetRequest Java request
    • NEW: forwarding parameters from Monitor to Controller (use prefix -Jetnc_)
  • [GENERAL]
    • FIX: logging from libraries
    • FIX: properly resized SmartMeter.io icons
    • UPDATE: Java updated from 8u45 to 8u66
    • UPDATE: JMeter plugins updated from 1.3.0 to 1.3.1
    • UPDATE: Chromium updated from 39.0.2150.5 to 48.0.2564.8
    • CHANGE: graphs HitsPerSecond, TransactionsPerSecond and ControllerBytesThroughputOverTime smoothed in final report
    • NEW: extra parameters from CLI are forwarded to SmartMeter, especially useful for setting JMeter properties (-J prefix)
  • [CORE]
    • FIX: real-time results (Runner tab) do not add timers to response times
    • FIX: SmartMeter.io HTTP client ignores MIME type of uploaded files
    • FIX: etn_generator_xmx parameter is ignored
    • FIX: Response Times Percentiles graphs X axis description
    • FIX: static resources extensions are checked against URL path without query string
    • CHANGE: Launcher does not terminate until SmartMeter.io exits (useful for CI tools)
    • CHANGE: SmartMeter.io editor automatically opens last run (or default) test
    • NEW: added shutdown hook (script which is called after test when SmartMeter.io exits)
    • NEW: kill & softkill support from outside (useful for Non-Gui mode, CI tools, etc.)
    • NEW: generating report from command line
    • NEW: Header Value Extractor component
  • [CONFIGURATOR]
    • FIX: saving trailing spaces in Configurator
    • NEW: added Reset to defaults option
    • NEW: configurable maximum memory allocation pool for JVM (-Xmx) for Editor (etn_editor_xmx)
    • NEW: configurable static resources extensions and types (etn_recorder_static_resource_extensions and etn_recorder_static_resource_types)
    • NEW: configurable auto opening of report after its creation (etn_report_auto_open)

v1.0.5 (2015-09-09)

Read blog post about v1.0.5 update.

  • [SMARTMETER.IO PRO]
    • FIX: performance monitoring agent is started just once
    • CHANGE: CMDDoCommandSynchronized renamed to CMDDoDeferredCommand
    • NEW: new commands (RunLocalGenerator and SSHRunGenerator) for easier starting of generators
    • NEW: new option for starting selected thread groups on selected generators only
    • NEW: templates for monitor scripts for each common installation pattern
  • [GENERAL]
    • FIX: SmartMeter.command on OS X does not work with spaces in the path to SmartMeter folder
    • CHANGE: Apache JMeter updated to 2.13 from 2.12
    • CHANGE: JMeter plugins updated to 1.3.0 from 1.2.0
    • CHANGE: jmeter.log renamed to editor.log
    • CHANGE: distribution packages contain a top-level folder
    • NEW: added dock icon for OS X
    • NEW: icons in multiple sizes (16, 32, 64, 128 and 256 pixels)
  • [CORE]
    • FIX: data for the "failed requests" table in the HTML report are correctly reset between tests
    • FIX: rendering of failed requests in report images
    • FIX: et@sm ThinkTime component works correctly if min and max deviation are equal
    • FIX: generator is not started in NO-GUI mode anymore (it was unused)
    • FIX: HTTP PUT and DELETE requests have empty bodies if using et@sm - HTTP Sampler
    • CHANGE: all files related to reports moved to their own folder programs/SmartMeter/extras/report
    • NEW: Smart Proxy Recorder
    • NEW: integration with JMeter-WebSocketSampler plugin (https://github.com/maciejzaleski/JMeter-WebSocketSampler/releases) for testing web sockets
    • NEW: new button for starting the test directly from Editor
    • NEW: graphs in report are automatically added to HTML report (no need to modify the report.html template)
  • [RECORDER]
    • CHANGE: characters are no longer automatically escaped when defining boundaries for replacers and automatic correlations
    • CHANGE: RecorderInterfaces.jar moved to programs/SmartMeter/extras
    • NEW: new input box for easier setting of include pattern
    • NEW: source codes for Recorder extension API (RecorderInterfaces.jar) added to build
  • [WELCOME SCREEN]
    • NEW: Light/Pro version is displayed next to build version
    • NEW: recursive loading of tests from the tests folder
    • NEW: tests are reloaded from the file system if the Settings panel is shown/hidden
  • [CONFIGURATOR]
    • FIX: fixed typos
    • CHANGE: property 'etn_report_create_graphs' was removed, it is no longer necessary
    • NEW: new field for extra configuration of report graphs

v1.0.4 (2015-06-09)

Read blog post about v1.0.4 update.

  • [CORE]
    • FIX: fixed Runner initialization which occasionally failed
    • FIX: fixed thread counting - only those threads which exceed the whole thread run time are counted
    • NEW: generator arguments can be added / overridden from the command line (Editor)
  • [GENERAL]
    • CHANGE: updated to Java8 (update 45)
  • [RECORDER]
    • FIX: fixed the loading of Recorder plugins in Chrome / Chromium on Linux and OS X if the browser was previously started by user
    • CHANGE: changed Recorder server STOP port from 8010 (possibly occupied by XMPP) to 27222
    • NEW: captures bodies of HTTP PUT requests
  • [CONFIGURATOR]
    • NEW: separate configuration of Light and Pro version
    • NEW: easier customization of graphs
    • NEW: new configurable -Xmx parameter for both generator and controller (Light version)
    • NEW: copyable default values and examples
    • NEW: optimized look in Linux