18 Introduction
18.1 Samplers
- FTP Request
- HTTP Request
- JDBC Request
- Java Request
- LDAP Request
- LDAP Extended Request
- Access Log Sampler
- BeanShell Sampler
- JSR223 Sampler
- TCP Sampler
- JMS Publisher
- JMS Subscriber
- JMS Point-to-Point
- JUnit Request
- Mail Reader Sampler
- Flow Control Action (was: Test Action)
- SMTP Sampler
- OS Process Sampler
- MongoDB Script (DEPRECATED)
- Bolt Request
18.2 Logic Controllers
- Simple Controller
- Loop Controller
- Once Only Controller
- Interleave Controller
- Random Controller
- Random Order Controller
- Throughput Controller
- Runtime Controller
- If Controller
- While Controller
- Switch Controller
- ForEach Controller
- Module Controller
- Include Controller
- Transaction Controller
- Recording Controller
- Critical Section Controller
18.3 Listeners
- Sample Result Save Configuration
- Graph Results
- Assertion Results
- View Results Tree
- Aggregate Report
- View Results in Table
- Simple Data Writer
- Aggregate Graph
- Response Time Graph
- Mailer Visualizer
- BeanShell Listener
- Summary Report
- Save Responses to a file
- JSR223 Listener
- Generate Summary Results
- Comparison Assertion Visualizer
- Backend Listener
18.4 Configuration Elements
- CSV Data Set Config
- FTP Request Defaults
- DNS Cache Manager
- HTTP Authorization Manager
- HTTP Cache Manager
- HTTP Cookie Manager
- HTTP Request Defaults
- HTTP Header Manager
- Java Request Defaults
- JDBC Connection Configuration
- Keystore Configuration
- Login Config Element
- LDAP Request Defaults
- LDAP Extended Request Defaults
- TCP Sampler Config
- User Defined Variables
- Random Variable
- Counter
- Simple Config Element
- MongoDB Source Config (DEPRECATED)
- Bolt Connection Configuration
18.5 Assertions
18.6 Timers
18.7 Pre Processors
18.8 Post-Processors
18.9 Miscellaneous Features
18 Introduction¶
18.1 Samplers¶
Samplers perform the actual work of JMeter. Each sampler (except Flow Control Action) generates one or more sample results. The sample results have various attributes (success/fail, elapsed time, data size etc.) and can be viewed in the various listeners.
FTP Request¶
Latency is set to the time it takes to login.
Parameters ¶
HTTP Request¶
This sampler lets you send an HTTP/HTTPS request to a web server. It also lets you control whether or not JMeter parses HTML files for images and other embedded resources and sends HTTP requests to retrieve them. The following types of embedded resource are retrieved:
- images
- applets
- stylesheets (CSS) and resources referenced from those files
- external scripts
- frames, iframes
- background images (body, table, TD, TR)
- background sound
The default parser is org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser. This can be changed by using the property "htmlparser.className" - see jmeter.properties for details.
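For example, to switch to the JTidy-based parser (one of the parsers shipped with JMeter; shown here purely as an illustration, check the jmeter.properties of your version for the parsers actually available), you could add the following to user.properties:
htmlparser.className=org.apache.jmeter.protocol.http.parser.JTidyHTMLParser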
If you are going to send multiple requests to the same web server, consider using an HTTP Request Defaults Configuration Element so you do not have to enter the same information for each HTTP Request.
Or, instead of manually adding HTTP Requests, you may want to use JMeter's HTTP(S) Test Script Recorder to create them. This can save you time if you have a lot of HTTP requests or requests with many parameters.
There are three different test elements used to define the samplers:
- AJP/1.3 Sampler
- uses the Tomcat mod_jk protocol (allows testing of Tomcat in AJP mode without needing Apache httpd). The AJP Sampler does not support multiple file upload; only the first file will be used.
- HTTP Request
-
this has an implementation drop-down box, which selects the HTTP protocol implementation to be used:
- Java
- uses the HTTP implementation provided by the JVM. This has some limitations in comparison with the HttpClient implementations - see below.
- HTTPClient4
- uses Apache HttpComponents HttpClient 4.x.
- Blank Value
- does not set implementation on HTTP Samplers, so relies on HTTP Request Defaults if present or on jmeter.httpsampler property defined in jmeter.properties
- GraphQL HTTP Request
-
this is a GUI variation of the HTTP Request to provide more convenient UI elements
to view or edit GraphQL Query, Variables and Operation Name, while converting them into HTTP Arguments automatically under the hood
using the same sampler.
This hides or customizes the following UI elements as they are less convenient for or irrelevant to GraphQL over HTTP/HTTPS requests:
- Method: Only POST and GET methods are available, conforming to the GraphQL over HTTP specification. The POST method is selected by default.
- Parameters and Post Body tabs: you may view or edit parameter content through Query, Variables and Operation Name UI elements instead.
- File Upload tab: irrelevant to GraphQL queries.
- Embedded Resources from HTML Files section in the Advanced tab: irrelevant in GraphQL JSON responses.
The Java HTTP implementation has some limitations:
- There is no control over how connections are re-used. When a connection is released by JMeter, it may or may not be re-used by the same thread.
- The API is best suited to single-threaded usage - various settings are defined via system properties, and therefore apply to all connections.
- No support for Kerberos authentication
- It does not support client based certificate testing with Keystore Config.
- It does not allow fine-grained control of the retry mechanism (the HttpClient4 implementation provides better control)
- It does not support virtual hosts.
- It supports only the following methods: GET, POST, HEAD, OPTIONS, PUT, DELETE and TRACE
- It does not offer the control of DNS caching provided by the DNS Cache Manager (only available with HttpClient4)
If the request requires server or proxy login authorization (i.e. where a browser would create a pop-up dialog box), you will also have to add an HTTP Authorization Manager Configuration Element. For normal logins (i.e. where the user enters login information in a form), you will need to work out what the form submit button does, and create an HTTP request with the appropriate method (usually POST) and the appropriate parameters from the form definition. If the page uses HTTP, you can use the JMeter Proxy to capture the login sequence.
A separate SSL context is used for each thread. If you want to use a single SSL context (not the standard behaviour of browsers), set the JMeter property:
https.sessioncontext.shared=true
By default, since version 5.0, the SSL context is retained during a Thread Group iteration and reset for each test iteration. If in your test plan the same user iterates multiple times, then you should set the following property to false:
httpclient.reset_state_on_thread_group_iteration=true
JMeter defaults to the SSL protocol level TLS. If the server needs a different level, e.g. SSLv3, change the JMeter property, for example:
https.default.protocol=SSLv3
JMeter also allows one to enable additional protocols, by changing the property https.socket.protocols.
If the request uses cookies, then you will also need an HTTP Cookie Manager. You can add either of these elements to the Thread Group or the HTTP Request. If you have more than one HTTP Request that needs authorizations or cookies, then add the elements to the Thread Group. That way, all HTTP Request controllers will share the same Authorization Manager and Cookie Manager elements.
If the request uses a technique called "URL Rewriting" to maintain sessions, then see section 6.1 Handling User Sessions With URL Rewriting for additional configuration steps.
Parameters ¶
The Server Name or IP field is required unless:
- it is provided by HTTP Request Defaults
- or a full URL including scheme, host and port (scheme://host:port) is set in the Path field
A Duration Assertion can be used to detect responses that take too long to complete.
More methods can be pre-defined for the HttpClient4 by using the JMeter property httpsampler.user_defined_methods.
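For example, the following (illustrative) entry in user.properties would add two extra WebDAV methods to the method drop-down:
httpsampler.user_defined_methods=VERSION-CONTROL,REPORT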
"Redirect requested but followRedirects is disabled"This can be ignored.
JMeter will collapse paths of the form '/../segment' in both absolute and relative redirect URLs. For example http://host/one/../two will be collapsed into http://host/two. If necessary, this behaviour can be suppressed by setting the JMeter property httpsampler.redirect.removeslashdotdot=false
Additionally, you can specify whether each parameter should be URL encoded. If you are not sure what this means, it is probably best to select it. Encoding is usually required if your values contain any of the following characters:
- ASCII Control Chars
- Non-ASCII characters
- Reserved characters: URLs use some characters for special use in defining their syntax. When these characters are not used in their special role inside a URL, they need to be encoded, example: '$', '&', '+', ',' , '/', ':', ';', '=', '?', '@'
- Unsafe characters: Some characters present the possibility of being misunderstood within URLs for various reasons. These characters should also always be encoded, example: ' ', '<', '>', '#', '%', …
When MIME Type is empty, JMeter will try to guess the MIME type of the given file.
If it is a POST or PUT or PATCH request and there is a single file whose 'Parameter name' attribute (below) is omitted, then the file is sent as the entire body of the request, i.e. no wrappers are added. This allows arbitrary bodies to be sent. This functionality is present for POST requests, and also for PUT requests. See below for some further information on parameter handling.
To specify how the source address value should be interpreted, select its type:
- Select IP/Hostname to use a specific IP address or a (local) hostname
- Select Device to pick the first available address for that interface, which may be either IPv4 or IPv6
- Select Device IPv4 to select the IPv4 address of the device name (like eth0, lo, em0, etc.)
- Select Device IPv6 to select the IPv6 address of the device name (like eth0, lo, em0, etc.)
This property is used to enable IP Spoofing. It overrides the default local IP address for this sample. The JMeter host must have multiple IP addresses (i.e. IP aliases, network interfaces, devices). The value can be a host name, IP address, or a network interface device such as "eth0" or "lo" or "wlan0".
If the property httpclient.localaddress is defined, that is used for all HttpClient requests.
The following parameters are available only for GraphQL HTTP Request:
Parameters ¶
Parameter Handling:
For the POST and PUT method, if there is no file to send, and the name(s) of the parameter(s) are omitted, then the body is created by concatenating all the value(s) of the parameters. Note that the values are concatenated without adding any end-of-line characters; these can be added by using the __char() function in the value fields. This allows arbitrary bodies to be sent. The values are encoded if the encoding flag is set. See also the MIME Type above for how you can control the content-type request header that is sent.
For other methods, if the name of the parameter is missing, then the parameter is ignored. This allows the use of optional parameters defined by variables.
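For example, appending ${__char(13,10)} to a value inserts a CR LF pair between two concatenated values.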
You have the option to switch to Body Data tab when a request has only unnamed parameters (or no parameters at all). This option is useful in the following cases (amongst others):
- GWT RPC HTTP Request
- JSON REST HTTP Request
- XML REST HTTP Request
- SOAP HTTP Request
In Body Data mode, each line will be sent with CRLF appended, apart from the last line. To send a CRLF after the last line of data, just ensure that there is an empty line following it. (This cannot be seen, except by noting whether the cursor can be placed on the subsequent line.)
Method Handling:
The GET, DELETE, POST, PUT and PATCH request methods work similarly, except that as of 3.1, only POST method supports multipart requests
or file upload.
The PUT and PATCH method body must be provided as one of the following:
- define the body as a file with empty Parameter name field; in which case the MIME Type is used as the Content-Type
- define the body as parameter value(s) with no name
- use the Body Data tab
The GET, DELETE and POST methods have an additional way of passing parameters by using the Parameters tab. GET, DELETE, PUT and PATCH require a Content-Type. If not using a file, attach a Header Manager to the sampler and define the Content-Type there.
JMeter scans responses for embedded resources. It uses the property HTTPResponse.parsers, which is a list of parser ids, e.g. htmlParser, cssParser and wmlParser. For each id found, JMeter checks two further properties:
- id.types - a list of content types
- id.className - the parser to be used to extract the embedded resources
See the jmeter.properties file for the details of the settings. If the HTTPResponse.parsers property is not set, JMeter reverts to the previous behaviour, i.e. only text/html responses will be scanned.
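A minimal sketch of what such property definitions look like (values are illustrative; consult the jmeter.properties shipped with your version for the exact defaults):
HTTPResponse.parsers=htmlParser cssParser
htmlParser.types=text/html application/xhtml+xml
htmlParser.className=org.apache.jmeter.protocol.http.parser.LagartoBasedHtmlParser
cssParser.types=text/css
cssParser.className=org.apache.jmeter.protocol.http.parser.CssParser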
Emulating slow connections: HttpClient4 and the Java Sampler support emulation of slow connections; see the following entries in jmeter.properties:
# Define characters per second > 0 to emulate slow connections
#httpclient.socket.http.cps=0
#httpclient.socket.https.cps=0
However the Java sampler only supports slow HTTPS connections.
Response size calculation
The HttpClient4 implementation does include the overhead in the response body size, so the value may be greater than the number of bytes in the response content.
Retry handling
By default retry has been set to 0 for both HttpClient4 and Java implementations, meaning no retry is attempted.
For HttpClient4, the retry count can be overridden by setting the relevant JMeter property, for example:
httpclient4.retrycount=3
Retry of requests that have already been sent is disabled by default for HttpClient4; to enable it, set:
httpclient4.request_sent_retry_enabled=true
For the Java implementation, the retry count can be changed by setting:
http.java.sampler.retries=3
Note: Certificates does not conform to algorithm constraints
You may encounter the following error: java.security.cert.CertificateException: Certificates does not conform to algorithm constraints
if you run an HTTPS request against a web site whose SSL certificate (itself or one of the SSL certificates in its chain of trust) uses a signature algorithm based on MD2 (like md2WithRSAEncryption) or has a key size lower than 1024 bits.
This error is related to increased security in Java 8.
To allow your HTTPS request to proceed, you can downgrade the security of your Java installation by editing the Java jdk.certpath.disabledAlgorithms property. Remove the MD2 value or the constraint on key size, depending on your case.
This property is in this file:
JAVA_HOME/jre/lib/security/java.security
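A hypothetical before/after edit (the exact default value varies between Java versions; this only illustrates the kind of change involved, and it weakens security, so restrict it to test machines):
# before
jdk.certpath.disabledAlgorithms=MD2, MD5, SHA1 jdkCA & usage TLSServer, RSA keySize < 1024
# after (MD2 constraint removed)
jdk.certpath.disabledAlgorithms=MD5, SHA1 jdkCA & usage TLSServer, RSA keySize < 1024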
See Bug 56357 for details.
JDBC Request¶
This sampler lets you send a JDBC Request (an SQL query) to a database.
Before using this you need to set up a JDBC Connection Configuration element.
If the Variable Names list is provided, then for each row returned by a Select statement, the variables are set up with the value of the corresponding column (if a variable name is provided), and the count of rows is also set up. For example, if the Select statement returns 2 rows of 3 columns, and the variable list is A,,C, then the following variables will be set up:
A_#=2 (number of rows)
A_1=column 1, row 1
A_2=column 1, row 2
C_#=2 (number of rows)
C_1=column 3, row 1
C_2=column 3, row 2
If the Select statement returns zero rows, then the A_# and C_# variables would be set to 0, and no other variables would be set.
Old variables are cleared if necessary - e.g. if the first select retrieves six rows and a second select returns only three rows, the additional variables for rows four, five and six will be removed.
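As an illustration, a minimal JSR223 (Groovy) sketch, assuming the variable list A,,C from the example above, could read the stored variables back like this:
// Read the variables created by the JDBC sampler
int rows = Integer.parseInt(vars.get("A_#"))   // number of rows returned by the Select
if (rows > 0) {
    log.info("first row: A=" + vars.get("A_1") + ", C=" + vars.get("C_1"))
}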
Parameters ¶
- Select Statement
- Update Statement - use this for Inserts and Deletes as well
- Callable Statement
- Prepared Select Statement
- Prepared Update Statement - use this for Inserts and Deletes as well
- Commit
- Rollback
- Autocommit(false)
- Autocommit(true)
- Edit - this should be a variable reference that evaluates to one of the above
- select * from t_customers where id=23
-
CALL SYSCS_UTIL.SYSCS_EXPORT_TABLE (null, ?, ?, null, null, null)
- Parameter values: tablename,filename
- Parameter types: VARCHAR,VARCHAR
The list must be enclosed in double-quotes if any of the values contain a comma or double-quote, and any embedded double-quotes must be doubled-up, for example:
"Dbl-Quote: "" and Comma: ,"
These are defined as fields in the class java.sql.Types, see for example:
Javadoc for java.sql.Types.
If not specified, "IN" is assumed, i.e. "DATE" is the same as "IN DATE".
If the type is not one of the fields found in java.sql.Types, JMeter also accepts the corresponding integer number, e.g. since OracleTypes.CURSOR == -10, you can use "INOUT -10".
There must be as many types as there are placeholders in the statement.
If a Result Variable Name is specified, the result rows can be accessed from a script as a list of maps keyed by column name, for example:
columnValue = vars.getObject("resultObject").get(0).get("Column Name");
- Store As String (default) - All variables on Variable Names list are stored as strings, will not iterate through a ResultSet when present on the list. CLOBs will be converted to Strings. BLOBs will be converted to Strings as if they were an UTF-8 encoded byte-array. Both CLOBs and BLOBs will be cut off after jdbcsampler.max_retain_result_size bytes.
- Store As Object - Variables of ResultSet type on Variables Names list will be stored as Object and can be accessed in subsequent tests/scripts and iterated, will not iterate through the ResultSet. CLOBs will be handled as if Store As String was selected. BLOBs will be stored as a byte array. Both CLOBs and BLOBs will be cut off after jdbcsampler.max_retain_result_size bytes.
- Count Records - Variables of ResultSet types will be iterated through showing the count of records as result. Variables will be stored as Strings. For BLOBs the size of the object will be stored.
Java Request¶
This sampler lets you control a java class that implements the org.apache.jmeter.protocol.java.sampler.JavaSamplerClient interface. By writing your own implementation of this interface, you can use JMeter to harness multiple threads, input parameter control, and data collection.
The pull-down menu provides the list of all such implementations found by JMeter in its classpath. The parameters can then be specified in the table below - as defined by your implementation. Two simple examples (JavaTest and SleepTest) are provided.
The JavaTest example sampler can be useful for checking test plans, because it allows one to set values in almost all the fields. These can then be used by Assertions, etc. The fields allow variables to be used, so the values of these can readily be seen.
Parameters ¶
The following parameters apply to the SleepTest and JavaTest implementations:
Parameters ¶
The sleep time is calculated as follows:
totalSleepTime = SleepTime + (System.currentTimeMillis() % SleepMask)
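For example, with SleepTime=500 and SleepMask=63, the added component is System.currentTimeMillis() % 63, i.e. between 0 and 62 ms, so the total sleep time varies between 500 and 562 ms.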
The following parameters apply additionally to the JavaTest implementation:
Parameters ¶
LDAP Request¶
If you are going to send multiple requests to the same LDAP server, consider using an LDAP Request Defaults Configuration Element so you do not have to enter the same information for each LDAP Request.
In the same way, the Login Config Element can also be used for the Login and Password.
There are two ways to create test cases for testing an LDAP Server.
- Inbuilt Test cases.
- User defined Test cases.
There are four test scenarios for testing LDAP. The tests are given below:
- Add Test
  - Inbuilt test: This will add a pre-defined entry in the LDAP Server and calculate the execution time. After execution of the test, the created entry will be deleted from the LDAP Server.
  - User defined test: This will add the entry in the LDAP Server. The user has to enter all the attributes in the table. The entries are collected from the table to add. The execution time is calculated. The created entry will not be deleted after the test.
- Modify Test
  - Inbuilt test: This will create a pre-defined entry first, then will modify the created entry in the LDAP Server, and calculate the execution time. After execution of the test, the created entry will be deleted from the LDAP Server.
  - User defined test: This will modify the entry in the LDAP Server. The user has to enter all the attributes in the table. The entries are collected from the table to modify. The execution time is calculated. The entry will not be deleted from the LDAP Server.
- Search Test
  - Inbuilt test: This will create the entry first, then will search if the attributes are available. It calculates the execution time of the search query. At the end of the execution, the created entry will be deleted from the LDAP Server.
  - User defined test: This will search the user defined entry (Search filter) in the Search base (again, defined by the user). The entries should be available in the LDAP Server. The execution time is calculated.
- Delete Test
  - Inbuilt test: This will create a pre-defined entry first, then it will be deleted from the LDAP Server. The execution time is calculated.
  - User defined test: This will delete the user-defined entry in the LDAP Server. The entries should be available in the LDAP Server. The execution time is calculated.
Parameters ¶
LDAP Extended Request¶
If you are going to send multiple requests to the same LDAP server, consider using an LDAP Extended Request Defaults Configuration Element so you do not have to enter the same information for each LDAP Request.
There are nine test operations defined. These operations are given below:
- Thread bind
-
Any LDAP request is part of an LDAP session, so the first thing that should be done is starting a session to the LDAP server. For starting this session a thread bind is used, which is equal to the LDAP "bind" operation. The user is requested to give a username (Distinguished name) and password, which will be used to initiate a session. When no password, or the wrong password is specified, an anonymous session is started. Take care, omitting the password will not fail this test, a wrong password will. (N.B. this is stored unencrypted in the test plan)
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Servername | The name (or IP-address) of the LDAP server. | Yes |
Port | The port number that the LDAP server is listening to. If this is omitted JMeter assumes the LDAP server is listening on the default port (389). | No |
DN | The distinguished name of the base object that will be used for any subsequent operation. It can be used as a starting point for all operations. You cannot start any operation on a higher level than this DN! | No |
Username | Full distinguished name of the user as which you want to bind. | No |
Password | Password for the above user. If omitted it will result in an anonymous bind. If it is incorrect, the sampler will return an error and revert to an anonymous bind. (N.B. this is stored unencrypted in the test plan) | No |
Connection timeout (in milliseconds) | Timeout for connection, if exceeded connection will be aborted | No |
Use Secure LDAP Protocol | Use ldaps:// scheme instead of ldap:// | No |
Trust All Certificates | Trust all certificates, only used if Use Secure LDAP Protocol is checked | No |
- Thread unbind
-
This is simply the operation to end a session. It is equal to the LDAP "unbind" operation.
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
- Single bind/unbind
-
This is a combination of the LDAP "bind" and "unbind" operations. It can be used for an authentication request/password check for any user. It will open a new session, just to check the validity of the user/password combination, and end the session again.
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Username | Full distinguished name of the user as which you want to bind. | Yes |
Password | Password for the above user. If omitted it will result in an anonymous bind. If it is incorrect, the sampler will return an error. (N.B. this is stored unencrypted in the test plan) | No |
- Rename entry
-
This is the LDAP "moddn" operation. It can be used to rename an entry, but also for moving an entry or a complete subtree to a different place in the LDAP tree.
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Old entry name | The current distinguished name of the object you want to rename or move, relative to the given DN in the thread bind operation. | Yes |
New distinguished name | The new distinguished name of the object you want to rename or move, relative to the given DN in the thread bind operation. | Yes |
- Add test
-
This is the LDAP "add" operation. It can be used to add any kind of object to the LDAP server.
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Entry DN | Distinguished name of the object you want to add, relative to the given DN in the thread bind operation. | Yes |
Add test | A list of attributes and their values you want to use for the object. If you need to add a multiple value attribute, you need to add the same attribute with their respective values several times to the list. | Yes |
- Delete test
-
This is the LDAP "delete" operation, it can be used to delete an object from the LDAP tree
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Delete | Distinguished name of the object you want to delete, relative to the given DN in the thread bind operation. | Yes |
- Search test
-
This is the LDAP "search" operation, and will be used for defining searches.
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Search base | Distinguished name of the subtree you want your search to look in, relative to the given DN in the thread bind operation. | No |
Search Filter | searchfilter, must be specified in LDAP syntax. | Yes |
Scope | Use 0 for baseobject-, 1 for onelevel- and 2 for a subtree search. (Default=0) | No |
Size Limit | Specify the maximum number of results you want back from the server. (default=0, which means no limit.) When the sampler hits the maximum number of results, it will fail with errorcode 4 | No |
Time Limit | Specify the maximum amount of (cpu)time (in milliseconds) that the server can spend on your search. Take care, this does not say anything about the response time. (default is 0, which means no limit) | No |
Attributes | Specify the attributes you want to have returned, separated by a semicolon. An empty field will return all attributes | No |
Return object | Whether the object will be returned (true) or not (false). Default=false | No |
Dereference aliases | If true, it will dereference aliases, if false, it will not follow them (default=false) | No |
Parse the search results? | If true, the search results will be added to the response data. If false, a marker - whether results were found or not - will be added to the response data. | No |
- Modification test
-
This is the LDAP "modify" operation. It can be used to modify an object. It can be used to add, delete or replace values of an attribute.
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Entry name | Distinguished name of the object you want to modify, relative to the given DN in the thread bind operation | Yes |
Modification test | The attribute-value-opCode triples. The opCode can be any valid LDAP operationCode (add, delete, remove or replace). If you don't specify a value with a delete operation, all values of the given attribute will be deleted. If you do specify a value in a delete operation, only the given value will be deleted. If this value is non-existent, the sampler will fail the test. | Yes |
- Compare
-
This is the LDAP "compare" operation. It can be used to compare the value of a given attribute with some already known value. In reality this is mostly used to check whether a given person is a member of some group. In such a case you can compare the DN of the user as a given value, with the values in the attribute "member" of an object of the type groupOfNames. If the compare operation fails, this test fails with errorcode 49.
Parameters
Attribute | Description | Required |
---|---|---|
Name | Descriptive name for this sampler that is shown in the tree. | No |
Entry DN | The current distinguished name of the object of which you want to compare an attribute, relative to the given DN in the thread bind operation. | Yes |
Compare filter | In the form "attribute=value" | Yes |
Access Log Sampler¶
AccessLogSampler was designed to read access logs and generate http requests. For those not familiar with the access log, it is the log the webserver maintains of every request it accepted. This means every image, CSS file, JavaScript file, html file, …
Tomcat uses the common format for access logs. This means any webserver that uses the common log format can use the AccessLogSampler. Servers that use the common log format include: Tomcat, Resin, Weblogic, and SunOne. The common log format looks like this:
127.0.0.1 - - [21/Oct/2003:05:37:21 -0500] "GET /index.jsp?%2Findex.jsp= HTTP/1.1" 200 8343
For the future, it might be nice to filter out entries that do not have a response code of 200. Extending the sampler should be fairly simple. There are two interfaces you have to implement:
- org.apache.jmeter.protocol.http.util.accesslog.LogParser
- org.apache.jmeter.protocol.http.util.accesslog.Generator
The current implementation of AccessLogSampler uses the generator to create a new HTTPSampler. The servername, port and get images are set by AccessLogSampler. Next, the parser is called with integer 1, telling it to parse one entry. After that, HTTPSampler.sample() is called to make the request.
samp = (HTTPSampler) GENERATOR.generateRequest();
samp.setDomain(this.getDomain());
samp.setPort(this.getPort());
samp.setImageParser(this.isImageParser());
PARSER.parse(1);
res = samp.sample();
res.setSampleLabel(samp.toString());
The required methods in LogParser are:
- setGenerator(Generator)
- parse(int)
Classes implementing Generator interface should provide concrete implementation for all the methods. For an example of how to implement either interface, refer to StandardGenerator and TCLogParser.
(Beta Code)
Parameters ¶
The TCLogParser processes the access log independently for each thread. The SharedTCLogParser and OrderPreservingLogParser share access to the file, i.e. each thread gets the next entry in the log.
The SessionFilter is intended to handle Cookies across threads. It does not filter out any entries, but modifies the cookie manager so that the cookies for a given IP are processed by a single thread at a time. If two threads try to process samples from the same client IP address, then one will be forced to wait until the other has completed.
The LogFilter is intended to allow access log entries to be filtered by filename and regex, as well as allowing for the replacement of file extensions. However, it is not currently possible to configure this via the GUI, so it cannot really be used.
BeanShell Sampler¶
This sampler allows you to write a sampler using the BeanShell scripting language.
For full details on using BeanShell, please see the BeanShell website.
The test element supports the ThreadListener and TestListener interface methods. These must be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
The BeanShell sampler also supports the Interruptible interface. The interrupt() method can be defined in the script or the init file.
Parameters ¶
- Parameters
- string containing the parameters as a single variable
- bsh.args
- String array containing parameters, split on white-space
If the property "beanshell.sampler.init" is defined, it is passed to the Interpreter as the name of a sourced file. This can be used to define common methods and variables. There is a sample init file in the bin directory: BeanShellSampler.bshrc.
If a script file is supplied, that will be used; otherwise the Script text field will be used.
BeanShell does not currently support Java 5 syntax such as generics and the enhanced for loop.
Before invoking the script, some variables are set up in the BeanShell interpreter:
The contents of the Parameters field is put into the variable "Parameters". The string is also split into separate tokens using a single space as the separator, and the resulting list is stored in the String array bsh.args.
The full list of BeanShell variables that is set up is as follows:
- log - the Logger
- Label - the Sampler label
- FileName - the file name, if any
- Parameters - text from the Parameters field
- bsh.args - the parameters, split as described above
- SampleResult - pointer to the current SampleResult
- ResponseCode defaults to 200
- ResponseMessage defaults to "OK"
- IsSuccess defaults to true
- ctx - JMeterContext
-
vars - JMeterVariables - e.g.
vars.get("VAR1"); vars.put("VAR2","value"); vars.remove("VAR3"); vars.putObject("OBJ1",new Object());
-
props - JMeterProperties (class java.util.Properties) - e.g.
props.get("START.HMS"); props.put("PROP1","1234");
When the script completes, control is returned to the Sampler, and it copies the contents of the following script variables into the corresponding variables in the SampleResult:
- ResponseCode - for example 200
- ResponseMessage - for example "OK"
- IsSuccess - true or false
The SampleResult ResponseData is set from the return value of the script. If the script returns null, it can set the response directly, by using the method SampleResult.setResponseData(data), where data is either a String or a byte array. The data type defaults to "text", but can be set to binary by using the method SampleResult.setDataType(SampleResult.BINARY).
The SampleResult variable gives the script full access to all the fields and methods in the SampleResult. For example, the script has access to the methods setStopThread(boolean) and setStopTest(boolean). Here is a simple (not very useful!) example script:
if (bsh.args[0].equalsIgnoreCase("StopThread")) {
    log.info("Stop Thread detected!");
    SampleResult.setStopThread(true);
}
return "Data from sample with Label "+Label;
//or
SampleResult.setResponseData("My data");
return null;
Another example:
ensure that the property beanshell.sampler.init=BeanShellSampler.bshrc is defined in jmeter.properties.
The following script will show the values of all the variables in the ResponseData field:
return getVariables();
For details on the methods available for the various classes (JMeterVariables, SampleResult etc.) please check the Javadoc or the source code. Beware however that misuse of any methods can cause subtle faults that may be difficult to find.
JSR223 Sampler¶
The JSR223 Sampler allows JSR223 script code to be used to perform a sample or some computation required to create/update variables.
SampleResult.setIgnore();
This call will have the following impact:
- SampleResult will not be delivered to SampleListeners like View Results Tree, Summariser ...
- SampleResult will not be evaluated in Assertions nor PostProcessors
- SampleResult will still be evaluated when computing the last sample status (${JMeterThread.last_sample_ok}) and for the ThreadGroup "Action to be taken after a Sampler error" (since JMeter 5.4)
The JSR223 test elements have a feature (compilation) that can significantly increase performance. To benefit from this feature:
- Use Script files instead of inlining them. This will make JMeter compile them if this feature is available on ScriptEngine and cache them.
-
Or Use Script Text and check Cache compiled script if available property.
When using this feature, ensure your script code does not use JMeter variables or JMeter function calls directly in the script code, as caching would only cache the first replacement; instead, use script parameters (a short sketch of this pattern follows below).
To benefit from caching and compilation, the language engine used for scripting must implement the JSR223 Compilable interface (Groovy is one of these; java, beanshell and javascript are not).
When using Groovy as the scripting language and not checking Cache compiled script if available (while caching is recommended), you should set the JVM property -Dgroovy.use.classvalue=true because of a Groovy memory leak as of version 2.4.6 (see the related Groovy bug reports for details).
The cache size is controlled by the following JMeter property:
jsr223.compiled_scripts_cache_size=100
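As a minimal sketch of the recommended pattern (the variable name myVar is hypothetical), pass ${myVar} in the sampler's Parameters field instead of embedding it in the script body, then read it in the script (Groovy shown here):
// Parameters field of the sampler: ${myVar}
// args holds the Parameters field split on white-space
def value = args[0]
log.info("parameter value: " + value)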
props.get("START.HMS"); props.put("PROP1","1234");
Parameters ¶
Notice that some languages such as Velocity may use a different syntax for JSR223 variables, e.g. for Velocity:
$log.debug("Hello " + $vars.get("a"));
If a script file is supplied, that will be used; otherwise the Script text field will be used.
Before invoking the script, some variables are set up. Note that these are JSR223 variables - i.e. they can be used directly in the script.
- log - the Logger
- Label - the Sampler label
- FileName - the file name, if any
- Parameters - text from the Parameters field
- args - the parameters, split as described above
- SampleResult - pointer to the current SampleResult
- sampler - (Sampler) - pointer to current Sampler
- ctx - JMeterContext
-
vars - JMeterVariables - e.g.
vars.get("VAR1"); vars.put("VAR2","value"); vars.remove("VAR3"); vars.putObject("OBJ1",new Object());
-
props - JMeterProperties (class java.util.Properties) - e.g.
props.get("START.HMS"); props.put("PROP1","1234");
- OUT - System.out - e.g. OUT.println("message")
The SampleResult ResponseData is set from the return value of the script. If the script returns null, it can set the response directly, by using the method SampleResult.setResponseData(data), where data is either a String or a byte array. The data type defaults to "text", but can be set to binary by using the method SampleResult.setDataType(SampleResult.BINARY).
The SampleResult variable gives the script full access to all the fields and methods in the SampleResult. For example, the script has access to the methods setStopThread(boolean) and setStopTest(boolean).
Unlike the BeanShell Sampler, the JSR223 Sampler does not set the ResponseCode, ResponseMessage and sample status via script variables. Currently the only way to change these is via the SampleResult methods:
- SampleResult.setSuccessful(true/false)
- SampleResult.setResponseCode("code")
- SampleResult.setResponseMessage("message")
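A minimal Groovy sketch pulling these together (illustrative only; the returned string becomes the ResponseData as described above):
// Build a small response and set status via SampleResult methods
def payload = 'hello from JSR223'
SampleResult.setResponseCode('200')
SampleResult.setResponseMessage('OK')
SampleResult.setSuccessful(true)
return payload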
TCP Sampler¶
The TCP Sampler opens a TCP/IP connection to the specified server. It then sends the text, and waits for a response.
If "Re-use connection" is selected, connections are shared between Samplers in the same thread, provided that the exact same host name string and port are used. Different hosts/port combinations will use different connections, as will different threads. If both of "Re-use connection" and "Close connection" are selected, the socket will be closed after running the sampler. On the next sampler, another socket will be created. You may want to close a socket at the end of each thread loop.
If an error is detected - or "Re-use connection" is not selected - the socket is closed. Another socket will be reopened on the next sample.
The following properties can be used to control its operation:
- tcp.status.prefix
- text that precedes a status number
- tcp.status.suffix
- text that follows a status number
- tcp.status.properties
- name of property file to convert status codes to messages
- tcp.handler
- Name of TCP Handler class (default TCPClientImpl) - only used if not specified on the GUI
Users can provide their own implementation. The class must extend org.apache.jmeter.protocol.tcp.sampler.TCPClient.
The following implementations are currently provided.
- TCPClientImpl
- BinaryTCPClientImpl
- LengthPrefixedBinaryTCPClientImpl
- TCPClientImpl
- This implementation is fairly basic. When reading the response, it reads until the end of line byte, if this is defined by setting the property tcp.eolByte, otherwise until the end of the input stream. You can control charset encoding by setting tcp.charset, which will default to Platform default encoding.
- BinaryTCPClientImpl
- This implementation converts the GUI input, which must be a hex-encoded string, into binary, and performs the reverse when reading the response. When reading the response, it reads until the end of message byte, if this is defined by setting the property tcp.BinaryTCPClient.eomByte, otherwise until the end of the input stream.
- LengthPrefixedBinaryTCPClientImpl
- This implementation extends BinaryTCPClientImpl by prefixing the binary message data with a binary length byte. The length prefix defaults to 2 bytes. This can be changed by setting the property tcp.binarylength.prefix.length.
- Timeout handling
- If the timeout is set, the read will be terminated when this expires. So if you are using an eolByte/eomByte, make sure the timeout is sufficiently long, otherwise the read will be terminated early.
- Response handling
-
If tcp.status.prefix is defined, then the response message is searched for the text following
that up to the suffix. If any such text is found, it is used to set the response code.
The response message is then fetched from the properties file (if provided).
Usage of pre- and suffix¶
For example, if the prefix = "[" and the suffix = "]", then the following response:
[J28] XI123,23,GBP,CR
would have the response code J28.
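A sketch of the matching property settings (illustrative values, typically placed in user.properties):
tcp.status.prefix=[
tcp.status.suffix=]
# optional: stop reading at a line-feed byte when using TCPClientImpl
tcp.eolByte=10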
Sockets are disconnected at the end of a test run.
Parameters ¶
JMS Publisher¶
JMS Publisher will publish messages to a given destination (topic/queue). For those not familiar with JMS, it is the J2EE specification for messaging. There are numerous JMS servers on the market and several open source options.
Parameters ¶
- From File
- means the referenced file will be read and reused by all samples. If the file name changes, it is reloaded (since JMeter 3.0)
- Random File from folder specified below
- means a random file will be selected from folder specified below, this folder must contain either files with extension .dat for Bytes Messages, or files with extension .txt or .obj for Object or Text messages
- Text area
- The Message to use either for Text or Object message
- RAW:
- No variable substitution is applied; the file is loaded with the default system charset.
- DEFAULT:
- Load the file with the default system encoding, except for XML, which relies on the XML prolog. If the file contains variables, they will be processed.
- Standard charsets:
- The specified encoding (valid or not) is used for reading the file and processing variables
For the MapMessage type, JMeter reads the source as lines of text. Each line must have 3 fields, delimited by commas. The fields are:
- Name of entry
- Object class name, e.g. "String" (assumes java.lang package if not specified)
- Object string value
name,String,Example
size,Integer,1234
- Put the JAR that contains your object and its dependencies in jmeter_home/lib/ folder
- Serialize your object as XML using XStream
- Either put result in a file suffixed with .txt or .obj or put XML content directly in Text Area
The following table shows some values which may be useful when configuring JMS:
Apache ActiveMQ | Value(s) | Comment |
---|---|---|
Context Factory | org.apache.activemq.jndi.ActiveMQInitialContextFactory | . |
Provider URL | vm://localhost | |
Provider URL | vm:(broker:(vm://localhost)?persistent=false) | Disable persistence |
Queue Reference | dynamicQueues/QUEUENAME | Dynamically define the QUEUENAME to JNDI |
Topic Reference | dynamicTopics/TOPICNAME | Dynamically define the TOPICNAME to JNDI |
JMS Subscriber¶
JMS Subscriber will subscribe to messages in a given destination (topic or queue). For those not familiar with JMS, it is the J2EE specification for messaging. There are numerous JMS servers on the market and several open source options.
Parameters ¶
- MessageConsumer.receive()
- calls receive() for every requested message. Retains the connection between samples, but does not fetch messages unless the sampler is active. This is best suited to Queue subscriptions.
- MessageListener.onMessage()
- establishes a Listener that stores all incoming messages on a queue. The listener remains active after the sampler completes. This is best suited to Topic subscriptions.
JMS Point-to-Point¶
This sampler sends and optionally receives JMS Messages through point-to-point connections (queues). It is different from pub/sub messages and is generally used for handling transactions.
request_only will typically be used to put load on a JMS System.
request_reply will be used when you want to test response time of a JMS service that processes messages sent to the Request Queue as this mode will wait for the response on the Reply queue sent by this service.
browse returns the current queue depth, i.e. the number of messages on the queue.
read reads a message from the queue (if any).
clear clears the queue, i.e. remove all messages from the queue.
JMeter uses the properties java.naming.security.[principal|credentials] - if present - when creating the Queue Connection. If this behaviour is not desired, set the JMeter property JMSSampler.useSecurity.properties=false
Parameters ¶
- Request Only
- will only send messages and will not monitor replies. As such it can be used to put load on a system.
- Request Response
- will send messages and monitor the replies it receives. Behaviour depends on the value of the JNDI Name Reply Queue. If JNDI Name Reply Queue has a value, this queue is used to monitor the results. Matching of request and reply is done with the message id of the request and the correlation id of the reply. If the JNDI Name Reply Queue is empty, then temporary queues will be used for the communication between the requestor and the server. This is very different from the fixed reply queue. With temporary queues the sending thread will block until the reply message has been received. With Request Response mode, you need to have a Server that listens to messages sent to Request Queue and sends replies to queue referenced by message.getJMSReplyTo().
- Read
- will read a message from an outgoing queue which has no listeners attached. This can be convenient for testing purposes. This method can be used if you need to handle queues without a binding file (in case the jmeter-jms-skip-jndi library is used), which only works with the JMS Point-to-Point sampler. In case binding files are used, one can also use the JMS Subscriber Sampler for reading from a queue.
- Browse
- will determine the current queue depth without removing messages from the queue, returning the number of messages on the queue.
- Clear
- will clear the queue, i.e. remove all messages from the queue.
- Use Request Message Id
- if selected, the request JMSMessageID will be used, otherwise the request JMSCorrelationID will be used. In the latter case the correlation id must be specified in the request.
- Use Response Message Id
- if selected, the response JMSMessageID will be used, otherwise the response JMSCorrelationID will be used.
- JMS Correlation ID Pattern
- i.e. match request and response on their correlation Ids => deselect both checkboxes, and provide a correlation id.
- JMS Message ID Pattern
- i.e. match request message id with response correlation id => select "Use Request Message Id" only.
JUnit Request¶
- rather than use JMeter's test interface, it scans the jar files for classes extending JUnit's TestCase class. That includes any class or subclass.
- JUnit test jar files should be placed in jmeter/lib/junit instead of /lib directory. You can also use the "user.classpath" property to specify where to look for TestCase classes.
- JUnit sampler does not use name/value pairs for configuration like the Java Request. The sampler assumes setUp and tearDown will configure the test correctly.
- The sampler measures the elapsed time only for the test method and does not include setUp and tearDown.
- Each time the test method is called, JMeter will pass the result to the listeners.
- Support for oneTimeSetUp and oneTimeTearDown is done as a method. Since JMeter is multi-threaded, we cannot call oneTimeSetUp/oneTimeTearDown the same way Maven does it.
- The sampler reports unexpected exceptions as errors. There are some important differences between standard JUnit test runners and JMeter's implementation. Rather than make a new instance of the class for each test, JMeter creates 1 instance per sampler and reuses it. This can be changed with checkbox "Create a new instance per sample".
Empty constructor:
public class myTestCase {
    public myTestCase() {}
}
String constructor:
public class myTestCase {
    public myTestCase(String text) {
        super(text);
    }
}
General Guidelines
If you use setUp and tearDown, make sure the methods are declared public. If you do not, the test may not run properly.
Here are some general guidelines for writing JUnit tests so they work well with JMeter. Since JMeter runs multi-threaded, it is important to keep certain things in mind.
- Write the setUp and tearDown methods so they are thread safe. This generally means avoid using static members.
- Make the test methods discrete units of work and not long sequences of actions. By keeping the test method to a discrete operation, it makes it easier to combine test methods to create new test plans.
- Avoid making test methods depend on each other. Since JMeter allows arbitrary sequencing of test methods, the runtime behavior is different than the default JUnit behavior.
- If a test method is configurable, be careful about where the properties are stored. Reading the properties from the Jar file is recommended.
- Each sampler creates an instance of the test class, so write your test so the setup happens in oneTimeSetUp and oneTimeTearDown.
Parameters ¶
The following JUnit4 annotations are recognised:
- @Test
- used to find test methods and classes. The "expected" and "timeout" attributes are supported.
- @Before
- treated the same as setUp() in JUnit3
- @After
- treated the same as tearDown() in JUnit3
- @BeforeClass, @AfterClass
- treated as test methods so they can be run independently as required
Mail Reader Sampler¶
The Mail Reader Sampler can read (and optionally delete) mail messages using POP3(S) or IMAP(S) protocols.
Parameters ¶
Failing that, against the directory containing the test script (JMX file).
Messages are stored as subsamples of the main sampler. Multipart message parts are stored as subsamples of the message.
Special handling for "file" protocol:
The file JavaMail provider can be used to read raw messages from files.
The server field is used to specify the path to the parent of the folder.
Individual message files should be stored with the name n.msg,
where n is the message number.
Alternatively, the server field can be the name of a file which contains a single message.
The current implementation is quite basic, and is mainly intended for debugging purposes.
Flow Control Action (was: Test Action)¶
This sampler can also be useful in conjunction with the Transaction Controller, as it allows pauses to be included without needing to generate a sample. For variable delays, set the pause time to zero, and add a Timer as a child.
The "Stop" action stops the thread or test after completing any samples that are in progress. The "Stop Now" action stops the test without waiting for samples to complete; it will interrupt any active samples. If some threads fail to stop within the 5 second time-limit, a message will be displayed in GUI mode. You can try using the Stop command to see if this will stop the threads, but if not, you should exit JMeter. In CLI mode, JMeter will exit if some threads fail to stop within the 5 second time limit.
Parameters ¶
SMTP Sampler¶
The SMTP Sampler can send mail messages using SMTP/SMTPS protocol.
It is possible to set security protocols for the connection (SSL and TLS), as well as user authentication.
If a security protocol is used, verification of the server certificate will occur.
Two alternatives to handle this verification are available:
- Trust all certificates
- This will ignore certificate chain verification
- Use a local truststore
- With this option the certificate chain will be validated against the local truststore file.
Parameters ¶
Failing that, against the directory containing the test script (JMX file).
OS Process Sampler¶
The OS Process Sampler is a sampler that can be used to execute commands on the local machine.
It should allow execution of any command that can be run from the command line.
Validation of the return code can be enabled, and the expected return code can be specified.
Note that OS shells generally provide command-line parsing. This varies between OSes, but generally the shell will split parameters on white-space. Some shells expand wild-card file names; some don't. The quoting mechanism also varies between OSes. The sampler deliberately does not do any parsing or quote handling. The command and its parameters must be provided in the form expected by the executable. This means that the sampler settings will not be portable between OSes.
Many OSes have some built-in commands which are not provided as separate executables. For example the Windows DIR command is part of the command interpreter (CMD.EXE). These built-ins cannot be run as independent programs, but have to be provided as arguments to the appropriate command interpreter.
For example, the Windows command-line: DIR C:\TEMP needs to be specified as follows:
- Command:
- CMD
- Param 1:
- /C
- Param 2:
- DIR
- Param 3:
- C:\TEMP
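A comparable example for a Unix-like system (shown only for illustration), where a shell built-in or pipeline must likewise be run via the shell:
- Command:
- /bin/sh
- Param 1:
- -c
- Param 2:
- ls -l /tmp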
Parameters ¶
MongoDB Script (DEPRECATED)¶
This sampler lets you send a Request to a MongoDB.
Before using this you need to set up a MongoDB Source Config configuration element.
Parameters ¶
Bolt Request¶
This sampler allows you to run Cypher queries through the Bolt protocol.
Before using this you need to set up a Bolt Connection Configuration
Every request uses a connection acquired from the pool and returns it to the pool when the sampler completes. The connection pool size defaults to 100 and is configurable.
The measured response time corresponds to the "full" query execution, including both the time to execute the cypher query AND the time to consume the results sent back by the database.
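As an illustration, a (hypothetical) Cypher query placed in the sampler's query field might be:
MATCH (p:Person {name: $name}) RETURN p LIMIT 10
with the corresponding parameter values supplied as a JSON map, for example {"name": "Alice"}, if your JMeter version provides a parameters field for this sampler.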
Parameters ¶
18.2 Logic Controllers¶
Logic Controllers determine the order in which Samplers are processed.
Simple Controller¶
The Simple Logic Controller lets you organize your Samplers and other Logic Controllers. Unlike other Logic Controllers, this controller provides no functionality beyond that of a storage device.
Parameters ¶
Download this example (see Figure 6). In this example, we created a Test Plan that sends two Ant HTTP requests and two Log4J HTTP requests. We grouped the Ant and Log4J requests by placing them inside Simple Logic Controllers. Remember, the Simple Logic Controller has no effect on how JMeter processes the controller(s) you add to it. So, in this example, JMeter sends the requests in the following order: Ant Home Page, Ant News Page, Log4J Home Page, Log4J History Page.
Note, the File Reporter is configured to store the results in a file named "simple-test.dat" in the current directory.
Loop Controller¶
If you add Generative or Logic Controllers to a Loop Controller, JMeter will loop through them a certain number of times, in addition to the loop value you specified for the Thread Group. For example, if you add one HTTP Request to a Loop Controller with a loop count of two, and configure the Thread Group loop count to three, JMeter will send a total of 2 * 3 = 6 HTTP Requests.
Parameters ¶
The value -1 is equivalent to checking the Forever toggle.
Special Case: The Loop Controller embedded in the Thread Group element behaves slightly differently. Unless set to forever, it stops the test after the given number of iterations have been done.
Download this example (see Figure 4). In this example, we created a Test Plan that sends a particular HTTP Request only once and sends another HTTP Request five times.
We configured the Thread Group for a single thread and a loop count value of one. Instead of letting the Thread Group control the looping, we used a Loop Controller. You can see that we added one HTTP Request to the Thread Group and another HTTP Request to a Loop Controller. We configured the Loop Controller with a loop count value of five.
JMeter will send the requests in the following order: Home Page, News Page, News Page, News Page, News Page, and News Page.
Once Only Controller¶
The Once Only Logic Controller tells JMeter to process the controller(s) inside it only once per Thread, and pass over any requests under it during further iterations through the test plan.
The Once Only Controller will now always execute during the first iteration of any looping parent controller. Thus, if the Once Only Controller is placed under a Loop Controller specified to loop 5 times, then the Once Only Controller will execute only on the first iteration through the Loop Controller (i.e. once every 5 loop iterations).
Note this means the Once Only Controller will still behave as previously expected if put under a Thread Group (runs only once per test per Thread), but now the user has more flexibility in the use of the Once Only Controller.
For testing that requires a login, consider placing the login request in this controller since each thread only needs to login once to establish a session.
Parameters ¶
Download this example (see Figure 5). In this example, we created a Test Plan that has two threads that send HTTP request. Each thread sends one request to the Home Page, followed by three requests to the Bug Page. Although we configured the Thread Group to iterate three times, each JMeter thread only sends one request to the Home Page because this request lives inside a Once Only Controller.
Each JMeter thread will send the requests in the following order: Home Page, Bug Page, Bug Page, Bug Page.
Note, the File Reporter is configured to store the results in a file named "loop-test.dat" in the current directory.
Interleave Controller¶
If you add Generative or Logic Controllers to an Interleave Controller, JMeter will alternate among each of the other controllers for each loop iteration.
Parameters ¶
Download this example (see Figure 1). In this example, we configured the Thread Group to have two threads and a loop count of five, for a total of ten requests per thread. See the table below for the sequence JMeter sends the HTTP Requests.
Loop Iteration | Each JMeter Thread Sends These HTTP Requests |
---|---|
1 | News Page |
1 | Log Page |
2 | FAQ Page |
2 | Log Page |
3 | Gump Page |
3 | Log Page |
4 | Because there are no more requests in the controller, JMeter starts over and sends the first HTTP Request, which is the News Page. |
4 | Log Page |
5 | FAQ Page |
5 | Log Page |
Download another example (see Figure 2). In this example, we configured the Thread Group to have a single thread and a loop count of eight. Notice that the Test Plan has an outer Interleave Controller with two Interleave Controllers inside of it.
The outer Interleave Controller alternates between the two inner ones. Then, each inner Interleave Controller alternates between each of the HTTP Requests. Each JMeter thread will send the requests in the following order: Home Page, Interleaved, Bug Page, Interleaved, CVS Page, Interleaved, and FAQ Page, Interleaved.
Note, the File Reporter is configured to store the results in a file named "interleave-test2.dat" in the current directory.
If the two interleave controllers under the main interleave controller were instead simple controllers, then the order would be: Home Page, CVS Page, Interleaved, Bug Page, FAQ Page, Interleaved.
However, if "ignore sub-controller blocks" was checked on the main interleave controller, then the order would be: Home Page, Interleaved, Bug Page, Interleaved, CVS Page, Interleaved, and FAQ Page, Interleaved.
Random Controller¶
The Random Logic Controller acts similarly to the Interleave Controller, except that instead of going in order through its sub-controllers and samplers, it picks one at random at each pass.
Parameters ¶
Random Order Controller¶
The Random Order Controller is much like a Simple Controller in that it will execute each child element at most once, but the order of execution of the nodes will be random.
Parameters ¶
Throughput Controller¶
The Throughput Controller allows the user to control how often it is executed. There are two modes:
- Percent executions - causes the controller to execute a certain percentage of the iterations through the test plan.
- Total executions - causes the controller to stop executing after a certain number of executions have occurred.
Parameters ¶
Runtime Controller¶
The Runtime Controller controls how long its children are allowed to run. It will run its children until the configured Runtime (in seconds) is exceeded.
Parameters ¶
If Controller¶
The If Controller allows the user to control whether the test elements below it (its children) are run or not.
By default, the condition is evaluated only once on initial entry, but you have the option to have it evaluated for every runnable element contained in the controller.
The best option (and the default) is to check Interpret Condition as Variable Expression?; then in the condition field you have 2 options:
-
Option 1: Use a variable that contains true or false
If you want to test if last sample was successful, you can use ${JMeterThread.last_sample_ok}
- Option 2: Use a function (${__jexl3()} is advised) to evaluate an expression that must return true or false
"${myVar}" == "\${myVar}"Or use:
"${myVar}" != "\${myVar}"to test if a variable is defined and is not null.
Parameters ¶
- ${COUNT} < 10
- "${VAR}" == "abcd"
When using __groovy, take care not to use variable replacement inside the script string; if the script references a variable whose value changes, the script cannot be cached. Instead, fetch the variable with vars.get("myVar"). See the Groovy examples below.
- ${__groovy(vars.get("myVar") != "Invalid" )} (Groovy check myVar is not equal to Invalid)
- ${__groovy(vars.get("myInt").toInteger() <=4 )} (Groovy check myInt is less then or equal to 4)
- ${__groovy(vars.get("myMissing") != null )} (Groovy check if the myMissing variable is not set)
- ${__jexl3(${COUNT} < 10)}
- ${RESULT}
- ${JMeterThread.last_sample_ok} (check if the last sample succeeded)
While Controller¶
The While Controller runs its children until the condition is "false".
Possible condition values:
- blank - exit loop when last sample in loop fails
- LAST - exit loop when last sample in loop fails. If the last sample just before the loop failed, don't enter loop.
- Otherwise - exit (or don't enter) the loop when the condition is equal to the string "false"
For example:
- ${VAR} - where VAR is set to false by some other test element
- ${__jexl3(${C}==10)}
- ${__jexl3("${VAR2}"=="abcd")}
- ${__P(property)} - where property is set to "false" somewhere else
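As an illustrative sketch (the variable name done and the marker text COMPLETED are examples, not part of JMeter), a JSR223 PostProcessor on the last sampler inside the loop could maintain the variable used by the condition:
// JSR223 PostProcessor (Java/Groovy syntax) placed on the last sampler inside the While Controller.
// "prev" is that sampler's SampleResult; "vars" gives access to JMeter variables.
String body = prev.getResponseDataAsString();
vars.put("done", Boolean.toString(body.contains("COMPLETED")));
The While Controller condition could then be ${__jexl3("${done}" == "false")}, so the loop keeps running while done is "false" and exits once the marker text appears in a response.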
Parameters ¶
Switch Controller¶
The Switch Controller acts like the Interleave Controller in that it runs one of the subordinate elements on each iteration, but rather than run them in sequence, the controller runs the element defined by the switch value.
If the switch value is out of range, it will run the zeroth element, which therefore acts as the default for the numeric case. It also runs the zeroth element if the value is the empty string.
If the value is non-numeric (and non-empty), then the Switch Controller looks for the element with the same name (case is significant). If none of the names match, then the element named "default" (case not significant) is selected. If there is no default, then no element is selected, and the controller will not run anything.
Parameters ¶
ForEach Controller¶
A ForEach controller loops through the values of a set of related variables. When you add samplers (or controllers) to a ForEach controller, every sample (or controller) is executed one or more times, where during every loop the variable has a new value. The input should consist of several variables, each extended with an underscore and a number. Each such variable must have a value. So for example when the input variable has the name inputVar, the following variables should have been defined:
- inputVar_1 = wendy
- inputVar_2 = charles
- inputVar_3 = peter
- inputVar_4 = john
Note: the "_" separator is now optional.
When the return variable is given as "returnVar", the collection of samplers and controllers under the ForEach controller will be executed 4 consecutive times, with the return variable having the respective above values, which can then be used in the samplers.
It is especially suited for running with the regular expression post-processor. This can "create" the necessary input variables out of the result data of a previous request. By omitting the "_" separator, the ForEach Controller can be used to loop through the groups by using the input variable refName_g, and can also loop through all the groups in all the matches by using an input variable of the form refName_${C}_g, where C is a counter variable.
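As a minimal sketch (the variable names follow the inputVar/returnVar example above; a JSR223 PreProcessor is just one convenient way to create them), the input variables could also be defined by script instead of a Regular Expression Extractor:
// JSR223 PreProcessor (Java/Groovy syntax) placed before the ForEach Controller.
// Creates the numbered input variables used in the example above.
vars.put("inputVar_1", "wendy");
vars.put("inputVar_2", "charles");
vars.put("inputVar_3", "peter");
vars.put("inputVar_4", "john");
// With "inputVar" as the input variable prefix and "returnVar" as the output variable name,
// samplers inside the ForEach Controller can then reference ${returnVar}.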
Parameters ¶
Download this example (see Figure 7). In this example, we created a Test Plan that sends a particular HTTP Request only once and sends another HTTP Request to every link that can be found on the page.
We configured the Thread Group for a single thread and a loop count value of one. You can see that we added one HTTP Request to the Thread Group and another HTTP Request to the ForEach Controller.
After the first HTTP request, a Regular Expression Extractor is added, which extracts all the HTML links out of the returned page and puts them into variables prefixed with inputVar.
In the ForEach loop, a HTTP sampler is added which requests all the links that were extracted from the first returned HTML page.
Here is another example you can download. This has two Regular Expressions and ForEach Controllers. The first RE matches, but the second does not, so no samples are run by the second ForEach Controller.
The Thread Group has a single thread and a loop count of two.
Sample 1 uses the JavaTest Sampler to return the string "a b c d".
The Regex Extractor uses the expression (\w)\s which matches a letter followed by a space, and returns the letter (not the space). Any matches are prefixed with the string "inputVar".
The ForEach Controller extracts all variables with the prefix "inputVar_", and executes its sample, passing the value in the variable "returnVar". In this case it will set the variable to the values "a" "b" and "c" in turn.
The For 1 Sampler is another Java Sampler which uses the return variable "returnVar" as part of the sample Label and as the sampler Data.
Sample 2, Regex 2 and For 2 are almost identical, except that the Regex has been changed to "(\w)\sx", which clearly won't match. Thus the For 2 Sampler will not be run.
Module Controller¶
The Module Controller provides a mechanism for substituting test plan fragments into the current test plan at run-time.
A test plan fragment consists of a Controller and all the test elements (samplers etc.) contained in it. The fragment can be located in any Thread Group. If the fragment is located in a Thread Group, then its Controller can be disabled to prevent the fragment being run except by the Module Controller. Or you can store the fragments in a dummy Thread Group, and disable the entire Thread Group.
There can be multiple fragments, each with a different series of samplers under them. The module controller can then be used to easily switch between these multiple test cases simply by choosing the appropriate controller in its drop down box. This provides convenience for running many alternate test plans quickly and easily.
A fragment name is made up of the Controller name and all its parent names. For example:
Test Plan / Protocol: JDBC / Control / Interleave Controller (Module1)
Any fragments used by the Module Controller must have a unique name, as the name is used to find the target controller when a test plan is reloaded. For this reason it is best to ensure that the Controller name is changed from the default - as shown in the example above - otherwise a duplicate may be accidentally created when new elements are added to the test plan.
Parameters ¶
Include Controller¶
The include controller is designed to use an external JMX file. To use it, create a Test Fragment underneath the Test Plan and add any desired samplers, controllers etc. below it. Then save the Test Plan. The file is now ready to be included as part of other Test Plans.
For convenience, a Thread Group can also be added in the external JMX file for debugging purposes. A Module Controller can be used to reference the Test Fragment. The Thread Group will be ignored during the include process.
If the test uses a Cookie Manager or User Defined Variables, these should be placed in the top-level test plan, not the included file, otherwise they are not guaranteed to work.
If the property includecontroller.prefix is defined, its contents are used to prefix the pathname.
If the file cannot be found at the location given by prefix+Filename, then the controller attempts to open the Filename relative to the JMX launch directory.
Transaction Controller¶
The Transaction Controller generates an additional sample which measures the overall time taken to perform the nested test elements.
There are two modes of operation:
- additional sample is added after the nested samples
- additional sample is added as a parent of the nested samples
The generated sample time includes all the times for the nested samplers excluding by default (since 2.11) timers and processing time of pre/post processors unless checkbox "Include duration of timer and pre-post processors in generated sample" is checked. Depending on the clock resolution, it may be slightly longer than the sum of the individual samplers plus timers. The clock might tick after the controller recorded the start time but before the first sample starts. Similarly at the end.
The generated sample is only regarded as successful if all its sub-samples are successful.
In parent mode, the individual samples can still be seen in the Tree View Listener, but no longer appear as separate entries in other Listeners. Also, the sub-samples do not appear in CSV log files, but they can be saved to XML files.
Parameters ¶
Recording Controller¶
The Recording Controller is a placeholder indicating where the proxy server should record samples to. During a test run, it has no effect, similar to the Simple Controller. But during recording using the HTTP(S) Test Script Recorder, all recorded samples will by default be saved under the Recording Controller.
Parameters ¶
Critical Section Controller¶
The Critical Section Controller ensures that its children elements (samplers/controllers, etc.) are executed by only one thread at a time, as a named lock is taken before executing the controller's children.
The figure below shows an example of using the Critical Section Controller; in it, two Critical Section Controllers ensure that:
- DS2-${__threadNum} is executed only by one thread at a time
- DS4-${__threadNum} is executed only by one thread at a time
Parameters ¶
18.3 Listeners¶
Most of the listeners perform several roles in addition to "listening" to the test results. They also provide means to view, save, and read saved test results.
Note that Listeners are processed at the end of the scope in which they are found.
The saving and reading of test results is generic. The various listeners have a panel whereby one can specify the file to which the results will be written (or read from). By default, the results are stored as XML files, typically with a ".jtl" extension. Storing as CSV is the most efficient option, but is less detailed than XML (the other available option).
Listeners do not process sample data in CLI mode, but the raw data will be saved if an output file has been configured. In order to analyse the data generated by a CLI run, you need to load the file into the appropriate Listener.
If you want to clear any current data before loading a new file, use the menu item Run → Clear or Run → Clear All before loading the file.
Results can be read from XML or CSV format files. When reading from CSV results files, the header (if present) is used to determine which fields are present. In order to interpret a header-less CSV file correctly, the appropriate properties must be set in jmeter.properties.
Listeners can use a lot of memory if there are a lot of samples. Most of the listeners currently keep a copy of every sample in their scope, apart from:
- Simple Data Writer
- BeanShell/JSR223 Listener
- Mailer Visualizer
- Summary Report
The following Listeners no longer need to keep copies of every single sample. Instead, samples with the same elapsed time are aggregated. Less memory is now needed, especially if most samples only take a second or two at most.
- Aggregate Report
- Aggregate Graph
To minimise the amount of memory needed, use the Simple Data Writer, and use the CSV format.
For full details on setting up the default items to be saved see the Listener Default Configuration documentation. For details of the contents of the output files, see the CSV log format or the XML log format.
The figure below shows an example of the result file configuration panel
Parameters
Sample Result Save Configuration¶
Listeners can be configured to save different items to the result log files (JTL) by using the Config popup as shown below. The defaults are defined as described in the Listener Default Configuration documentation. Items with (CSV) after the name only apply to the CSV format; items with (XML) only apply to XML format. CSV format cannot currently be used to save any items that include line-breaks.
Note that cookies, method and the query string are saved as part of the "Sampler Data" option.
Graph Results¶
The Graph Results listener generates a simple graph that plots all sample times. Along the bottom of the graph, the current sample (black), the current average of all samples (blue), the current standard deviation (red), and the current throughput rate (green) are displayed in milliseconds.
The throughput number represents the actual number of requests/minute the server handled. This calculation includes any delays you added to your test and JMeter's own internal processing time. The advantage of doing the calculation like this is that this number represents something real - your server in fact handled that many requests per minute, and you can increase the number of threads and/or decrease the delays to discover your server's maximum throughput. Whereas if you made calculations that factored out delays and JMeter's processing, it would be unclear what you could conclude from that number.
The following table briefly describes the items on the graph. Further details on the precise meaning of the statistical terms can be found on the web - e.g. Wikipedia - or by consulting a book on statistics.
- Data - plot the actual data values
- Average - plot the Average
- Median - plot the Median (midway value)
- Deviation - plot the Standard Deviation (a measure of the variation)
- Throughput - plot the number of samples per unit of time
The individual figures at the bottom of the display are the current values. "Latest Sample" is the current elapsed sample time, shown on the graph as "Data".
The value displayed at the top left of the graph is the maximum of the 90th percentile of response time.
Assertion Results¶
The Assertion Results visualizer shows the Label of each sample taken. It also reports failures of any Assertions that are part of the test plan.
View Results Tree¶
There are several ways to view the response, selectable by a drop-down box at the bottom of the left hand panel.
Renderer | Description |
---|---|
CSS/JQuery Tester | The CSS/JQuery Tester only works for text responses. It shows the plain text in the upper panel. The "Test" button allows the user to apply the CSS/JQuery expression to the upper panel; the results are displayed in the lower panel. The CSS/JQuery expression engine can be JSoup or Jodd; the syntax of these two implementations differs slightly. For example, the selector a[class=sectionlink] with attribute href applied to the current JMeter functions page gives the following output: Match count: 74 Match[1]=#functions Match[2]=#what_can_do Match[3]=#where Match[4]=#how Match[5]=#function_helper Match[6]=#functions Match[7]=#__regexFunction Match[8]=#__regexFunction_parms Match[9]=#__counter … and so on … |
Document | The Document view shows the extracted text from various types of documents such as Microsoft Office (Word, Excel, PowerPoint 97-2003, 2007-2010 (openxml)), Apache OpenOffice (Writer, Calc, Impress), HTML, gzip, jar/zip files (list of content), and some metadata from "multimedia" files like mp3, mp4, flv, etc. The complete list of supported formats is available on the Apache Tika format page. To use the Document view, download the Apache Tika binary package (tika-app-x.x.jar) and put it in the JMETER_HOME/lib directory. If the document is larger than 10 MB, it will not be displayed. To change this limit, set the JMeter property document.max_size (in bytes), or set it to 0 to remove the limit. |
HTML | The HTML view attempts to render the response as HTML. The rendered HTML is likely to compare poorly to the view one would get in any web browser; however, it does provide a quick approximation that is helpful for initial result evaluation. Images, style-sheets, etc. aren't downloaded. |
HTML (download resources) | If the HTML (download resources) view option is selected, the renderer may download images, style-sheets, etc. referenced by the HTML code. |
HTML Source formatted | If the HTML Source formatted view option is selected, the renderer displays the HTML source code formatted and cleaned by Jsoup. |
JSON | The JSON view shows the response in tree style (also handles JSON embedded in JavaScript). |
JSON Path Tester | The JSON Path Tester view lets you test your JSON-PATH expressions and see the extracted data from a particular response. |
JSON JMESPath Tester | The JSON JMESPath Tester view lets you test your JMESPath expressions and see the extracted data from a particular response. |
Regexp Tester | The Regexp Tester view only works for text responses. It shows the plain text in the upper panel. The "Test" button allows the user to apply the Regular Expression to the upper panel; the results are displayed in the lower panel. The regular expression engine is the same as that used in the Regular Expression Extractor. For example, the RE (JMeter\w*).* applied to the current JMeter home page gives the following output: Match count: 26 Match[1][0]=JMeter - Apache JMeter</title> Match[1][1]=JMeter Match[2][0]=JMeter" title="JMeter" border="0"/></a> Match[2][1]=JMeter Match[3][0]=JMeterCommitters">Contributors</a> Match[3][1]=JMeterCommitters … and so on … The first number in [] is the match number; the second number is the group. Group [0] is whatever matched the whole RE. Group [1] is whatever matched the 1st group, i.e. (JMeter\w*) in this case. See Figure 9b (below). |
Text | The default Text view shows all of the text contained in the response. Note that this only works if the response content-type is considered to be text. If the content-type begins with image/, audio/ or video/, it is considered binary; otherwise it is considered text. |
XML | The XML view shows the response in tree style. Any DTD or Prolog nodes will not show up in the tree, although the response may contain them. You can right-click on any node and expand or collapse all nodes below it. |
XPath Tester | The XPath Tester only works for text responses. It shows the plain text in the upper panel. The "Test" button allows the user to apply the XPath query to the upper panel; the results are displayed in the lower panel. |
Boundary Extractor Tester | The Boundary Extractor Tester only works for text responses. It shows the plain text in the upper panel. The "Test" button allows the user to apply the Boundary Extractor to the upper panel; the results are displayed in the lower panel. |
The Scroll automatically? option causes the tree to scroll so that the most recently added node is displayed and selected.
With the Search option, most of the views also allow the displayed data to be searched; the results of the search are highlighted in the display above. For example, the Control panel screenshot below shows one result of searching for "Java". Note that the search operates on the visible text, so you may get different results when searching the Text and HTML views.
Note: The regular expression uses the Java engine (not ORO engine like the Regular Expression Extractor or Regexp Tester view).
If there is no content-type provided, then the content will not be displayed in any of the Response Data panels. You can use Save Responses to a file to save the data in this case. Note that the response data will still be available in the sample result, so it can still be accessed using Post-Processors.
If the response data is larger than 200K, then it won't be displayed. To change this limit, set the JMeter property view.results.tree.max_size. You can also save the entire response to a file using Save Responses to a file.
Additional renderers can be created. The class must implement the interface org.apache.jmeter.visualizers.ResultRenderer and/or extend the abstract class org.apache.jmeter.visualizers.SamplerResultTab, and the compiled code must be available to JMeter (e.g. by adding it to the lib/ext directory).
The Control Panel (above) shows an example of an HTML display.
Figure 9 (below) shows an example of an XML display.
Figure 9a (below) shows an example of a Regexp tester display.
Figure 9b (below) shows an example of a Document display.
Aggregate Report¶
The throughput is calculated from the point of view of the sampler target (e.g. the remote server in the case of HTTP samples). JMeter takes into account the total time over which the requests have been generated. If other samplers and timers are in the same thread, these will increase the total time, and therefore reduce the throughput value. So two identical samplers with different names will have half the throughput of two samplers with the same name. It is important to choose the sampler names correctly to get the best results from the Aggregate Report.
Calculation of the Median and 90 % Line (90th percentile) values requires additional memory. JMeter now combines samples with the same elapsed time, so far less memory is used. However, for samples that take more than a few seconds, the probability is that fewer samples will have identical times, in which case more memory will be needed. Note you can use this listener afterwards to reload a CSV or XML results file which is the recommended way to avoid performance impacts. See the Summary Report for a similar Listener that does not store individual samples and so needs constant memory.
The percentile values reported can be changed via the following JMeter properties:
- aggregate_rpt_pct1: defaults to 90th percentile
- aggregate_rpt_pct2: defaults to 95th percentile
- aggregate_rpt_pct3: defaults to 99th percentile
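For example (the values shown are arbitrary), the reported percentiles could be changed in user.properties:
# Override the three Aggregate Report percentile columns
aggregate_rpt_pct1=75
aggregate_rpt_pct2=95
aggregate_rpt_pct3=99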
- Label - The label of the sample. If "Include group name in label?" is selected, then the name of the thread group is added as a prefix. This allows identical labels from different thread groups to be collated separately if required.
- # Samples - The number of samples with the same label
- Average - The average time of a set of results
- Median - The median is the time in the middle of a set of results. 50 % of the samples took no more than this time; the remainder took at least as long.
- 90% Line - 90 % of the samples took no more than this time. The remaining samples took at least as long as this. (90th percentile)
- 95% Line - 95 % of the samples took no more than this time. The remaining samples took at least as long as this. (95th percentile)
- 99% Line - 99 % of the samples took no more than this time. The remaining samples took at least as long as this. (99th percentile)
- Min - The shortest time for the samples with the same label
- Max - The longest time for the samples with the same label
- Error % - Percent of requests with errors
- Throughput - the Throughput is measured in requests per second/minute/hour. The time unit is chosen so that the displayed rate is at least 1.0. When the throughput is saved to a CSV file, it is expressed in requests/second, i.e. 30.0 requests/minute is saved as 0.5.
- Received KB/sec - The throughput measured in received Kilobytes per second
- Sent KB/sec - The throughput measured in sent Kilobytes per second
Times are in milliseconds.
The figure below shows an example of selecting the "Include group name" checkbox.
View Results in Table¶
By default, it only displays the main (parent) samples; it does not display the sub-samples (child samples). JMeter has a "Child Samples?" check-box. If this is selected, then the sub-samples are displayed instead of the main samples.
Simple Data Writer¶
Aggregate Graph¶
The figure below shows an example of settings to draw this graph.
Parameters ¶
- Columns to display: Choose the column(s) to display in the graph.
- Rectangles color: Clicking on the colored rectangle at the right opens a popup dialog to choose a custom color for the column.
- Foreground color: Allows changing the value text color.
- Value font: Allows defining font settings for the text.
- Draw outlines bar?: Whether to draw the border line on the bar chart.
- Show number grouping?: Whether to show number grouping in the Y-axis labels.
- Value labels vertical?: Changes the orientation of value labels (default is horizontal).
- Column label selection: Filter by result label. A regular expression can be used, for example .*Transaction.*. Before displaying the graph, click the Apply filter button to refresh the internal data.
Response Time Graph¶
The figure below shows an example of settings to draw this graph.
Parameters ¶
Mailer Visualizer¶
The mailer visualizer can be set up to send email if a test run receives too many failed responses from the server.
Parameters ¶
BeanShell Listener¶
The BeanShell Listener allows the use of BeanShell for processing samples for saving etc.
For full details on using BeanShell, please see the BeanShell website.
The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
Parameters ¶
- Parameters
- string containing the parameters as a single variable
- bsh.args
- String array containing parameters, split on white-space
Before invoking the script, some variables are set up in the BeanShell interpreter:
- log - (Logger) - can be used to write to the log file
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
- props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- sampleResult, prev - (SampleResult) - gives access to the previous SampleResult
- sampleEvent (SampleEvent) gives access to the current sample event
For details of all the methods available on each of the above variables, please check the Javadoc
If the property beanshell.listener.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.
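As an illustrative sketch (the log message format is arbitrary), a BeanShell Listener script using the variables described above might look like this:
// BeanShell Listener script: log the label, elapsed time and status of each sample as it arrives.
String label = sampleResult.getSampleLabel();
long elapsed = sampleResult.getTime();
log.info("Sample '" + label + "' took " + elapsed + " ms, success=" + sampleResult.isSuccessful());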
Summary Report¶
The throughput is calculated from the point of view of the sampler target (e.g. the remote server in the case of HTTP samples). JMeter takes into account the total time over which the requests have been generated. If other samplers and timers are in the same thread, these will increase the total time, and therefore reduce the throughput value. So two identical samplers with different names will have half the throughput of two samplers with the same name. It is important to choose the sampler labels correctly to get the best results from the Report.
- Label - The label of the sample. If "Include group name in label?" is selected, then the name of the thread group is added as a prefix. This allows identical labels from different thread groups to be collated separately if required.
- # Samples - The number of samples with the same label
- Average - The average elapsed time of a set of results
- Min - The lowest elapsed time for the samples with the same label
- Max - The longest elapsed time for the samples with the same label
- Std. Dev. - the Standard Deviation of the sample elapsed time
- Error % - Percent of requests with errors
- Throughput - the Throughput is measured in requests per second/minute/hour. The time unit is chosen so that the displayed rate is at least 1.0. When the throughput is saved to a CSV file, it is expressed in requests/second, i.e. 30.0 requests/minute is saved as 0.5.
- Received KB/sec - The throughput measured in received Kilobytes per second
- Sent KB/sec - The throughput measured in sent Kilobytes per second
- Avg. Bytes - average size of the sample response in bytes.
Times are in milliseconds.
The figure below shows an example of selecting the "Include group name" checkbox.
Save Responses to a file¶
This test element can be placed anywhere in the test plan. For each sample in its scope, it will create a file of the response Data. The primary use for this is in creating functional tests, but it can also be useful where the response is too large to be displayed in the View Results Tree Listener. The file name is created from the specified prefix, plus a number (unless this is disabled, see below). The file extension is created from the document type, if known. If not known, the file extension is set to 'unknown'. If numbering is disabled, and adding a suffix is disabled, then the file prefix is taken as the entire file name. This allows a fixed file name to be generated if required. The generated file name is stored in the sample response, and can be saved in the test log output file if required.
The current sample is saved first, followed by any sub-samples (child samples). If a variable name is provided, then the names of the files are saved in the order that the sub-samples appear. See below.
Parameters ¶
If the parent folders in the prefix do not exist, JMeter will create them, and will stop the test if creation fails.
JSR223 Listener¶
The JSR223 Listener allows JSR223 script code to be applied to sample results.
Parameters ¶
- Parameters
- string containing the parameters as a single variable
- args
- String array containing parameters, split on white-space
Before invoking the script, some variables are set up. Note that these are JSR223 variables - i.e. they can be used directly in the script.
- log
- (Logger) - can be used to write to the log file
- Label
- the String Label
- FileName
- the script file name (if any)
- Parameters
- the parameters (as a String)
- args
- the parameters as a String array (split on whitespace)
- ctx
- (JMeterContext) - gives access to the context
- vars
-
(JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2");
- props
- (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- sampleResult, prev
- (SampleResult) - gives access to the SampleResult
- sampleEvent
- (SampleEvent) - gives access to the SampleEvent
- sampler
- (Sampler)- gives access to the last sampler
- OUT
- System.out - e.g. OUT.println("message")
For details of all the methods available on each of the above variables, please check the Javadoc
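As an illustrative sketch (the choice of logging only failures is an example, not a requirement), a JSR223 Listener script using the variables above could report failed samples in the JMeter log:
// JSR223 Listener script (Java/Groovy syntax): log details of failed samples only.
if (!prev.isSuccessful()) {
    log.warn("FAILED: " + prev.getSampleLabel() + " -> " + prev.getResponseCode() + " " + prev.getResponseMessage());
}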
Generate Summary Results¶
# Define the following property to automatically start a summariser with that name
# (applies to CLI mode only)
#summariser.name=summary
#
# interval between summaries (in seconds) default 3 minutes
#summariser.interval=30
#
# Write messages to log file
#summariser.log=true
#
# Write messages to System.out
#summariser.out=true
This element is mainly intended for batch (CLI) runs. The output looks like the following:
label + 16 in 0:00:12 = 1.3/s Avg: 1608 Min: 1163 Max: 2009 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label + 82 in 0:00:30 = 2.7/s Avg: 1518 Min: 1003 Max: 2020 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 98 in 0:00:42 = 2.3/s Avg: 1533 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 85 in 0:00:30 = 2.8/s Avg: 1505 Min: 1008 Max: 2005 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 183 in 0:01:13 = 2.5/s Avg: 1520 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 79 in 0:00:30 = 2.7/s Avg: 1578 Min: 1089 Max: 2012 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 262 in 0:01:43 = 2.6/s Avg: 1538 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 80 in 0:00:30 = 2.7/s Avg: 1531 Min: 1013 Max: 2014 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 342 in 0:02:12 = 2.6/s Avg: 1536 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 83 in 0:00:31 = 2.7/s Avg: 1512 Min: 1003 Max: 1982 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 425 in 0:02:43 = 2.6/s Avg: 1531 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 83 in 0:00:29 = 2.8/s Avg: 1487 Min: 1023 Max: 2013 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 508 in 0:03:12 = 2.6/s Avg: 1524 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 78 in 0:00:30 = 2.6/s Avg: 1594 Min: 1013 Max: 2016 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 586 in 0:03:43 = 2.6/s Avg: 1533 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 80 in 0:00:30 = 2.7/s Avg: 1516 Min: 1013 Max: 2005 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 666 in 0:04:12 = 2.6/s Avg: 1531 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 86 in 0:00:30 = 2.9/s Avg: 1449 Min: 1004 Max: 2017 Err: 0 (0.00%) Active: 5 Started: 5 Finished: 0
label = 752 in 0:04:43 = 2.7/s Avg: 1522 Min: 1003 Max: 2020 Err: 0 (0.00%)
label + 65 in 0:00:24 = 2.7/s Avg: 1579 Min: 1007 Max: 2003 Err: 0 (0.00%) Active: 0 Started: 5 Finished: 5
label = 817 in 0:05:07 = 2.7/s Avg: 1526 Min: 1003 Max: 2020 Err: 0 (0.00%)
The "label" is the name of the element. The "+" means that the line is a delta line, i.e. it shows the changes since the last output.
The "=" means that the line is a total line, i.e. it shows the running total.
Entries in the JMeter log file also include time-stamps. The example "817 in 0:05:07 = 2.7/s" means that there were 817 samples recorded in 5 minutes and 7 seconds, and that works out at 2.7 samples per second.
The Avg (Average), Min (Minimum) and Max (Maximum) times are in milliseconds.
"Err" means number of errors (also shown as percentage).
The last two lines will appear at the end of a test. They will not be synchronised to the appropriate time boundary. Note that the initial and final deltas may be for less than the interval (in the example above this is 30 seconds). The first delta will generally be lower, as JMeter synchronizes to the interval boundary. The last delta will be lower, as the test will generally not finish on an exact interval boundary.
The label is used to group sample results together. So if you have multiple Thread Groups and want to summarize across them all, then use the same label - or add the summariser to the Test Plan (so all thread groups are in scope). Different summary groupings can be implemented by using suitable labels and adding the summarisers to appropriate parts of the test plan.
This is not a bug but a design choice that allows summarizing across thread groups.
Parameters ¶
Comparison Assertion Visualizer¶
Parameters ¶
Backend Listener¶
Parameters ¶
The following parameters apply to the GraphiteBackendListenerClient implementation:
Parameters ¶
See also Real-time results for more details.
Since JMeter 3.2, there is an implementation that writes directly to InfluxDB with a custom schema. It is called InfluxdbBackendListenerClient. The following parameters apply to the InfluxdbBackendListenerClient implementation:
Parameters ¶
See also Real-time results and Influxdb annotations in Grafana for more details. There is also a subsection on configuring the listener for InfluxDB v2.
Since JMeter 5.4, there is an implementation that writes all sample results to InfluxDB. It is called InfluxDBRawBackendListenerClient. It is worth noting that this will use more resources than the InfluxdbBackendListenerClient, on both the JMeter and InfluxDB sides, due to the increase in data and individual writes. The following parameters apply to the InfluxDBRawBackendListenerClient implementation:
Parameters ¶
18.4 Configuration Elements¶
Configuration elements can be used to set up defaults and variables for later use by samplers. Note that these elements are processed at the start of the scope in which they are found, i.e. before any samplers in the same scope.
CSV Data Set Config¶
CSV Data Set Config is used to read lines from a file, and split them into variables. It is easier to use than the __CSVRead() and __StringFromFile() functions. It is well suited to handling large numbers of variables, and is also useful for testing with "random" and unique values.
Generating unique random values at run-time is expensive in terms of CPU and memory, so just create the data in advance of the test. If necessary, the "random" data from the file can be used in conjunction with a run-time parameter to create different sets of values from each run - e.g. using concatenation - which is much cheaper than generating everything at run-time.
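As a sketch of pre-generating such data (the file name tokens.csv, the column name and the use of UUIDs are all illustrative), a small Java program run before the test could produce a file suitable for CSV Data Set Config:
import java.io.PrintWriter;
import java.util.UUID;

public class GenerateCsvData {
    public static void main(String[] args) throws Exception {
        try (PrintWriter out = new PrintWriter("tokens.csv")) {
            out.println("token");                   // header line; can double as the variable name
            for (int i = 0; i < 10000; i++) {
                out.println(UUID.randomUUID());     // unique "random" values generated ahead of the test
            }
        }
    }
}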
JMeter allows values to be quoted; this allows the value to contain a delimiter. If "allow quoted data" is enabled, a value may be enclosed in double-quotes. These are removed. To include double-quotes within a quoted field, use two double-quotes. For example:
1,"2,3","4""5" => 1 2,3 4"5
JMeter supports CSV files which have a header line defining the column names. To enable this, leave the "Variable Names" field empty. The correct delimiter must be provided.
JMeter supports CSV files with quoted data that includes new-lines.
By default, the file is only opened once, and each thread will use a different line from the file. However the order in which lines are passed to threads depends on the order in which they execute, which may vary between iterations. Lines are read at the start of each test iteration. The file name and mode are resolved in the first iteration.
See the description of the Share mode below for additional options. If you want each thread to have its own set of values, then you will need to create a set of files, one for each thread. For example test1.csv, test2.csv, …, testn.csv. Use the filename test${__threadNum}.csv and set the "Sharing mode" to "Current thread".
As a special case, the string "\t" (without quotes) in the delimiter field is treated as a Tab.
When the end of file (EOF) is reached, and the recycle option is true, reading starts again with the first line of the file.
If the recycle option is false, and stopThread is false, then all the variables are set to <EOF> when the end of file is reached. This value can be changed by setting the JMeter property csvdataset.eofstring.
If the Recycle option is false, and Stop Thread is true, then reaching EOF will cause the thread to be stopped.
Parameters ¶
- All threads - (the default) the file is shared between all the threads.
- Current thread group - each file is opened once for each thread group in which the element appears
- Current thread - each file is opened separately for each thread
- Identifier - all threads sharing the same identifier share the same file. So for example if you have 4 thread groups, you could use a common id for two or more of the groups to share the file between them. Or you could use the thread number to share the file between the same thread numbers in different thread groups.
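Putting the above together, a small illustrative data file users.csv (the names and values are made up) might contain:
username,password
alice,secret1
bob,secret2
With the "Variable Names" field left empty, the header line supplies the variable names, and samplers in scope can then reference ${username} and ${password}; by default each thread picks up the next line on each iteration.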
DNS Cache Manager¶
The DNS Cache Manager element allows testing of applications that have several servers behind load balancers (CDN, etc.), where users receive content from different IPs. By default JMeter uses the JVM DNS cache; that is why only one server from the cluster receives load. The DNS Cache Manager resolves names for each thread separately on each iteration and saves the results of resolution in its internal DNS cache, which is independent of both the JVM and OS DNS caches.
A mapping for static hosts can be used to simulate something like an /etc/hosts file. These entries are preferred over the custom resolver. Use custom DNS resolver has to be enabled if you want to use this mapping.
Say you have a test server that you want to reach with a name that is not (yet) set up in your DNS servers. For our example, this would be www.example.com for the server name, which you want to reach at the IP of the server a123.another.example.org.
You could modify your workstation and add an entry to your /etc/hosts file - or the equivalent for your OS - or you could add an entry to the Static Host Table of the DNS Cache Manager.
You would type www.example.com into the first column (Host) and a123.another.example.org into the second column (Hostname or IP address). As the name of the second column implies, you could even use the IP address of your test server there.
The IP address for the test server will be looked up by using the custom DNS resolver. When none is given, the system DNS resolver will be used.
Now you can use www.example.com in your HTTPClient4 samplers and the requests will be made against a123.another.example.org with all headers set to www.example.com.
Parameters ¶
HTTP Authorization Manager¶
The Authorization Manager lets you specify one or more user logins for web pages that are restricted using server authentication. You see this type of authentication when you use your browser to access a restricted page, and your browser displays a login dialog box. JMeter transmits the login information when it encounters this type of page.
The Authorization headers may not be shown in the Tree View Listener "Request" tab. The Java implementation does pre-emptive authentication, but it does not return the Authorization header when JMeter fetches the headers. The HttpComponents (HC 4.5.X) implementation defaults to pre-emptive since 3.2 and the header will be shown. To disable this, set the values as below, in which case authentication will only be performed in response to a challenge.
In the file jmeter.properties set httpclient4.auth.preemptive=false
Parameters ¶
- Java
- BASIC
- HttpClient 4
- BASIC, DIGEST and Kerberos
Kerberos Configuration:
To configure Kerberos you need to setup at least two JVM system properties:
- -Djava.security.krb5.conf=krb5.conf
- -Djava.security.auth.login.config=jaas.conf
You can also configure those two properties in the file bin/system.properties. Look at the two sample configuration files (krb5.conf and jaas.conf) located in the JMeter bin folder for references to more documentation, and tweak them to match your Kerberos configuration.
Delegation of credentials is disabled by default for SPNEGO. If you want to enable it, you can do so by setting the property kerberos.spnego.delegate_cred to true.
When generating a SPN for Kerberos SPNEGO authentication IE and Firefox will omit the port number from the URL. Chrome has an option (--enable-auth-negotiate-port) to include the port number if it differs from the standard ones (80 and 443). That behavior can be emulated by setting the following JMeter property as below.
In jmeter.properties or user.properties, set:
- kerberos.spnego.strip_port=false
Controls:
- Add Button - Add an entry to the authorization table.
- Delete Button - Delete the currently selected table entry.
- Load Button - Load a previously saved authorization table and add the entries to the existing authorization table entries.
- Save As Button - Save the current authorization table to a file.
Download this example. In this example, we created a Test Plan on a local server that sends three HTTP requests, two requiring a login and one open to everyone. See figure 10 for the makeup of our Test Plan. On our server, we have a restricted directory named "secret", which contains two files, "index.html" and "index2.html". We created a login id named "kevin", which has a password of "spot". So, in our Authorization Manager, we created an entry for the restricted directory and a username and password (see figure 11). The two HTTP requests named "SecretPage1" and "SecretPage2" make requests to "/secret/index.html" and "/secret/index2.html". The other HTTP request, named "NoSecretPage", makes a request to "/index.html".
When we run the Test Plan, JMeter looks in the Authorization table for the URL it is requesting. If the Base URL matches the URL, then JMeter passes this information along with the request.
HTTP Cache Manager¶
The HTTP Cache Manager is used to add caching functionality to HTTP requests within its scope, to simulate the browser cache. Each Virtual User thread has its own cache. By default, the Cache Manager will store up to 5000 items in the cache per Virtual User thread, using an LRU algorithm. Use the property "maxSize" to modify this value. Note that the more you increase this value, the more memory the HTTP Cache Manager will consume, so be sure to adapt the -Xmx JVM option accordingly.
If a sample is successful (i.e. has response code 2xx) then the Last-Modified and Etag (and Expires, if relevant) values are saved for the URL. Before executing the next sample, the sampler checks to see if there is an entry in the cache, and if so, the If-Modified-Since and If-None-Match conditional headers are set for the request.
Additionally, if the "Use Cache-Control/Expires header" option is selected, then the Cache-Control/Expires value is checked against the current time. If the request is a GET request, and the timestamp is in the future, then the sampler returns immediately, without requesting the URL from the remote server. This is intended to emulate browser behaviour. Note that if Cache-Control header is "no-cache", the response will be stored in cache as pre-expired, so will generate a conditional GET request. If Cache-Control has any other value, the "max-age" expiry option is processed to compute entry lifetime, if missing then expire header will be used, if also missing entry will be cached as specified in RFC 2616 section 13.2.4 using Last-Modified time and response Date.
Parameters ¶
HTTP Cookie Manager¶
The Cookie Manager element has two functions:
First, it stores and sends cookies just like a web browser. If you have an HTTP Request and
the response contains a cookie, the Cookie Manager automatically stores that cookie and will
use it for all future requests to that particular web site. Each JMeter thread has its own
"cookie storage area". So, if you are testing a web site that uses a cookie for storing
session information, each JMeter thread will have its own session.
Note that such cookies do not appear on the Cookie Manager display, but they can be seen using
the View Results Tree Listener.
JMeter checks that received cookies are valid for the URL. This means that cross-domain cookies are not stored. If you see buggy behaviour, or want cross-domain cookies to be used, define the JMeter property "CookieManager.check.cookies=false".
Received cookies can be stored as JMeter thread variables. To save cookies as variables, define the property "CookieManager.save.cookies=true". Also, cookie names are prefixed with "COOKIE_" before they are stored (this avoids accidental corruption of local variables). To revert to the original behaviour, define the property "CookieManager.name.prefix= " (one or more spaces). If enabled, the value of a cookie with the name TEST can be referred to as ${COOKIE_TEST}.
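For example (a sketch combining the properties described above), in user.properties:
# Save received cookies as JMeter thread variables
CookieManager.save.cookies=true
# Only set this if cross-domain cookies are really needed
#CookieManager.check.cookies=false
A received cookie named TEST could then be referenced in samplers as ${COOKIE_TEST}.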
Second, you can manually add a cookie to the Cookie Manager. However, if you do this, the cookie will be shared by all JMeter threads.
Note that such Cookies are created with an Expiration time far in the future
Cookies with null values are ignored by default. This can be changed by setting the JMeter property: CookieManager.delete_null_cookies=false. Note that this also applies to manually defined cookies - any such cookies will be removed from the display when it is updated. Note also that the cookie name must be unique - if a second cookie is defined with the same name, it will replace the first.
Parameters ¶
[Note: If you have a website to test with IPv6 address, choose HC4CookieHandler (IPv6 compliant)]
The "domain" is the hostname of the server (without http://); the port is currently ignored.
HTTP Request Defaults¶
This element lets you set default values that your HTTP Request controllers use. For example, if you are creating a Test Plan with 25 HTTP Request controllers and all of the requests are being sent to the same server, you could add a single HTTP Request Defaults element with the "Server Name or IP" field filled in. Then, when you add the 25 HTTP Request controllers, leave the "Server Name or IP" field empty. The controllers will inherit this field value from the HTTP Request Defaults element.
Parameters ¶
HTTP Header Manager¶
The Header Manager lets you add or override HTTP request headers.
JMeter now supports multiple Header Managers. The header entries are merged to form the list for the sampler. If an entry to be merged matches an existing header name, it replaces the previous entry. This allows one to set up a default set of headers and apply adjustments to particular samplers. Note that an empty value for a header does not remove an existing header; it just replaces its value.
Parameters ¶
Java Request Defaults¶
The Java Request Defaults component lets you set default values for Java testing. See the Java Request.
JDBC Connection Configuration¶
Parameters ¶
If you really want to use shared pooling (why?), then set the max count to the same as the number of threads to ensure threads don't wait on each other.
The list of validation queries can be configured with the jdbc.config.check.query property; the defaults are:
- hsqldb
- select 1 from INFORMATION_SCHEMA.SYSTEM_USERS
- Oracle
- select 1 from dual
- DB2
- select 1 from sysibm.sysdummy1
- MySQL or MariaDB
- select 1
- Microsoft SQL Server (MS JDBC driver)
- select 1
- PostgreSQL
- select 1
- Ingres
- select 1
- Derby
- values 1
- H2
- select 1
- Firebird
- select 1 from rdb$database
- Exasol
- select 1
The list of preconfigured JDBC driver classes can be configured with the jdbc.config.jdbc.driver.class property; the defaults are:
- hsqldb
- org.hsqldb.jdbc.JDBCDriver
- Oracle
- oracle.jdbc.OracleDriver
- DB2
- com.ibm.db2.jcc.DB2Driver
- MySQL
- com.mysql.jdbc.Driver
- Microsoft SQL Server (MS JDBC driver)
- com.microsoft.sqlserver.jdbc.SQLServerDriver or com.microsoft.jdbc.sqlserver.SQLServerDriver
- PostgreSQL
- org.postgresql.Driver
- Ingres
- com.ingres.jdbc.IngresDriver
- Derby
- org.apache.derby.jdbc.ClientDriver
- H2
- org.h2.Driver
- Firebird
- org.firebirdsql.jdbc.FBDriver
- Apache Derby
- org.apache.derby.jdbc.ClientDriver
- MariaDB
- org.mariadb.jdbc.Driver
- SQLite
- org.sqlite.JDBC
- Sybase ASE
- net.sourceforge.jtds.jdbc.Driver
- Exasol
- com.exasol.jdbc.EXADriver
Different databases and JDBC drivers require different JDBC settings. The Database URL and JDBC Driver class are defined by the provider of the JDBC implementation.
Some possible settings are shown below. Please check the exact details in the JDBC driver documentation.
If JMeter reports No suitable driver, then this could mean either:
- The driver class was not found. In this case, there will be a log message such as DataSourceElement: Could not load driver: {classname} java.lang.ClassNotFoundException: {classname}
- The driver class was found, but the class does not support the connection string. This could be because of a syntax error in the connection string, or because the wrong classname was used.
If the database server is not running or is not accessible, then JMeter will report a java.net.ConnectException.
Some examples for databases and their parameters are given below.
- MySQL
-
- Driver class
- com.mysql.jdbc.Driver
- Database URL
- jdbc:mysql://host[:port]/dbname
- PostgreSQL
-
- Driver class
- org.postgresql.Driver
- Database URL
- jdbc:postgresql:{dbname}
- Oracle
-
- Driver class
- oracle.jdbc.OracleDriver
- Database URL
- jdbc:oracle:thin:@//host:port/service OR jdbc:oracle:thin:@(description=(address=(host={mc-name})(protocol=tcp)(port={port-no}))(connect_data=(sid={sid})))
- Ingres (2006)
-
- Driver class
- ingres.jdbc.IngresDriver
- Database URL
- jdbc:ingres://host:port/db[;attr=value]
- Microsoft SQL Server (MS JDBC driver)
-
- Driver class
- com.microsoft.sqlserver.jdbc.SQLServerDriver
- Database URL
- jdbc:sqlserver://host:port;DatabaseName=dbname
- Apache Derby
-
- Driver class
- org.apache.derby.jdbc.ClientDriver
- Database URL
- jdbc:derby://server[:port]/databaseName[;URLAttributes=value[;…]]
- MariaDB
-
- Driver class
- org.mariadb.jdbc.Driver
- Database URL
- jdbc:mariadb://host[:port]/dbname[;URLAttributes=value[;…]]
- Exasol (see also JDBC driver documentation)
-
- Driver class
- com.exasol.jdbc.EXADriver
- Database URL
- jdbc:exa:host[:port][;schema=SCHEMA_NAME][;prop_x=value_x]
Keystore Configuration¶
The Keystore Config Element lets you configure how the keystore will be loaded and which keys it will use. This component is typically used in HTTPS scenarios where you don't want to take keystore initialization into account in the response time.
To use this element, you first need to set up a Java Key Store with the client certificates you want to test. To do that:
- Create your certificates either with Java keytool utility or through your PKI
- If created by PKI, import your keys in Java Key Store by converting them to a format acceptable by JKS
-
Then reference the keystore file through the two JVM properties (or add them in system.properties):
- -Djavax.net.ssl.keyStore=path_to_keystore
- -Djavax.net.ssl.keyStorePassword=password_of_keystore
To use PKCS11 as the source for the store, you need to set javax.net.ssl.keyStoreType to PKCS11 and javax.net.ssl.keyStore to NONE.
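For example (the path and password are placeholders), the two properties above could be added to bin/system.properties:
# Client certificate keystore used by the Keystore Config element
javax.net.ssl.keyStore=/path/to/client-certificates.jks
javax.net.ssl.keyStorePassword=changeit
For a PKCS11 token, javax.net.ssl.keyStoreType=PKCS11 and javax.net.ssl.keyStore=NONE would be used instead, as noted above.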
Parameters ¶
- https.use.cached.ssl.context=false is set in jmeter.properties or user.properties
- You use HTTPClient 4 implementation for HTTP Request
Login Config Element¶
The Login Config Element lets you add or override username and password settings in samplers that use username and password as part of their setup.
Parameters ¶
LDAP Request Defaults¶
The LDAP Request Defaults component lets you set default values for LDAP testing. See the LDAP Request.
LDAP Extended Request Defaults¶
The LDAP Extended Request Defaults component lets you set default values for extended LDAP testing. See the LDAP Extended Request.
TCP Sampler Config¶
The TCP Sampler Config provides default data for the TCP Sampler
Parameters ¶
User Defined Variables¶
The User Defined Variables element lets you define an initial set of variables, just as in the Test Plan.
UDVs should not be used with functions that generate different results each time they are called. Only the result of the first function call will be saved in the variable. However, UDVs can be used with functions such as __P(), for example:
HOST ${__P(host,localhost)}
which would define the variable "HOST" to have the value of the JMeter property "host", defaulting to "localhost" if not defined.
For defining variables during a test run, see User Parameters. UDVs are processed in the order they appear in the Plan, from top to bottom.
For simplicity, it is suggested that UDVs are placed only at the start of a Thread Group (or perhaps under the Test Plan itself).
Once the Test Plan and all UDVs have been processed, the resulting set of variables is copied to each thread to provide the initial set of variables.
If a runtime element such as a User Parameters Pre-Processor or Regular Expression Extractor defines a variable with the same name as one of the UDV variables, then this will replace the initial value, and all other test elements in the thread will see the updated value.
Parameters ¶
Random Variable¶
The Random Variable Config Element is used to generate random numeric strings and store them in a variable for later use. It's simpler than using User Defined Variables together with the __Random() function.
The output variable is constructed by using the random number generator, and then the resulting number is formatted using the format string. The number is calculated using the formula minimum+Random.nextInt(maximum-minimum+1). Random.nextInt() requires a positive integer, so the range (maximum-minimum) must be less than 2147483647; however, the minimum and maximum values can be any long values provided the range constraint is satisfied.
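For illustration, the computation roughly corresponds to the following sketch (illustrative only, not the actual implementation), assuming a minimum of 1, a maximum of 9999 and the output format 0000:
import java.util.Random; import java.text.DecimalFormat;
long minimum = 1, maximum = 9999;
long value = minimum + new Random().nextInt((int) (maximum - minimum + 1));   // uniform in [minimum, maximum]
String output = new DecimalFormat("0000").format(value);                      // e.g. "0042"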
Parameters ¶
Counter¶
Allows the user to create a counter that can be referenced anywhere in the Thread Group. The counter config lets the user configure a starting point, a maximum, and the increment. The counter will loop from the start to the max, then start over with the start value, continuing in this way until the test ends.
The counter uses a long to store the value, so the range is from -2^63 to 2^63-1.
Parameters ¶
Simple Config Element¶
The Simple Config Element lets you add or override arbitrary values in samplers. You can choose the name of the value and the value itself. Although some adventurous users might find a use for this element, it's here primarily for developers as a basic GUI that they can use while developing new JMeter components.
Parameters ¶
MongoDB Source Config (DEPRECATED)¶
You can then access com.mongodb.DB object in Beanshell or JSR223 Test Elements through the element MongoDBHolder using this code
import com.mongodb.DB;
import org.apache.jmeter.protocol.mongodb.config.MongoDBHolder;
DB db = MongoDBHolder.getDBFromSource("value of property MongoDB Source",
        "value of property Database Name");
…
Parameters ¶
There is a maximum amount of time to keep retrying, which is 15 s by default.
This can be useful to avoid some exceptions being thrown when a server is down temporarily by blocking the operations.
It can also be useful to smooth the transition to a new primary node (so that a new primary node is elected within the retry time).
- for a replica set, the driver will try to connect to the old primary node for that time, instead of failing over to the new one right away
- this does not prevent exceptions from being thrown in read/write operations on the socket, which must be handled by the application.
It is used solely when establishing a new connection, i.e. Socket.connect(java.net.SocketAddress, int)
Default is 0 and means no timeout.
Default is 0, which means to use the default 15s if autoConnectRetry is on.
Default is 120,000.
Default is 0 and means no timeout.
Default is false.
All further threads will get an exception right away.
For example if connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up to 50 threads can wait for a connection.
Default is 5.
If w, wtimeout, fsync or j are specified, this setting is ignored.
Default is false.
Default is false.
Default is false.
Default is 0.
Default is 0.
Bolt Connection Configuration¶
Parameters ¶
18.5 Assertions¶
Assertions are used to perform additional checks on samplers, and are processed after every sampler in the same scope. To ensure that an Assertion is applied only to a particular sampler, add it as a child of the sampler.
Assertions can be applied to either the main sample, the sub-samples or both. The default is to apply the assertion to the main sample only. If the Assertion supports this option, its GUI provides a choice of scope (main sample only, sub-samples only, both, or a JMeter variable).
If a sub-sampler fails and the main sample is successful, then the main sample will be set to failed status and an Assertion Result will be added. If the JMeter variable option is used, it is assumed to relate to the main sample, and any failure will be applied to the main sample only.
Response Assertion¶
The response assertion control panel lets you add pattern strings to be compared against various fields of the request or response. The pattern strings are:
- Contains, Matches: Perl5-style regular expressions
- Equals, Substring: plain text, case-sensitive
A summary of the pattern matching characters can be found at ORO Perl5 regular expressions.
You can also choose whether the strings will be expected to match the entire response, or if the response is only expected to contain the pattern. You can attach multiple assertions to any controller for additional flexibility.
Note that the pattern string should not include the enclosing delimiters, i.e. use Price: \d+ not /Price: \d+/.
By default, the pattern is in multi-line mode, which means that the "." meta-character does not match newline. In multi-line mode, "^" and "$" match the start or end of any line anywhere within the string - not just the start and end of the entire string. Note that \s does match new-line. Case is also significant. To override these settings, one can use the extended regular expression syntax. For example:
- (?i)
- ignore case
- (?s)
- treat target as single line, i.e. "." matches new-line
- (?is)
- both the above
- (?i)apple(?-i) Pie
- matches "ApPLe Pie", but not "ApPLe pIe"
- (?s)Apple.+?Pie
- matches Apple followed by Pie, which may be on a subsequent line.
- Apple(?s).+?Pie
- same as above, but it's probably clearer to use the (?s) at the start.
Parameters ¶
- Main sample only - only applies to the main sample
- Sub-samples only - only applies to the sub-samples
- Main sample and sub-samples - applies to both.
- JMeter Variable Name to use - assertion is to be applied to the contents of the named variable
- Text Response - the response text from the server, i.e. the body, excluding any HTTP headers.
- Request data - the request text sent to the server, i.e. the body, excluding any HTTP headers.
- Response Code - e.g. 200
- Response Message - e.g. OK
- Response Headers, including Set-Cookie headers (if any)
- Request Headers
- URL sampled
- Document (text) - the text extracted from various types of documents via Apache Tika (see the View Results Tree Document view section).
The overall success of the sample is determined by combining the result of the assertion with the existing Response status. When the Ignore Status checkbox is selected, the Response status is forced to successful before evaluating the Assertion.
HTTP Responses with statuses in the 4xx and 5xx ranges are normally regarded as unsuccessful. The "Ignore status" checkbox can be used to set the status successful before performing further checks.
- Contains - true if the text contains the regular expression pattern
- Matches - true if the whole text matches the regular expression pattern
- Equals - true if the whole text equals the pattern string (case-sensitive)
- Substring - true if the text contains the pattern string (case-sensitive)
The pattern is a Perl5-style regular expression, but without the enclosing brackets.
Duration Assertion¶
The Duration Assertion tests that each response was received within a given amount of time. Any response that takes longer than the given number of milliseconds (specified by the user) is marked as a failed response.
Parameters ¶
Size Assertion¶
The Size Assertion tests that each response contains the right number of bytes in it. You can specify that the size be equal to, greater than, less than, or not equal to a given number of bytes.
Parameters ¶
- Main sample only - assertion only applies to the main sample
- Sub-samples only - assertion only applies to the sub-samples
- Main sample and sub-samples - assertion applies to both.
- JMeter Variable Name to use - assertion is to be applied to the contents of the named variable
XML Assertion¶
The XML Assertion tests that the response data consists of a formally correct XML document. It does not validate the XML based on a DTD or schema or do any further validation.
Parameters ¶
BeanShell Assertion¶
The BeanShell Assertion allows the user to perform assertion checking using a BeanShell script.
For full details on using BeanShell, please see the BeanShell website.
Note that a different Interpreter is used for each independent occurrence of the assertion in each thread in a test script, but the same Interpreter is used for subsequent invocations. This means that variables persist across calls to the assertion.
All Assertions are called from the same thread as the sampler.
If the property "beanshell.assertion.init" is defined, it is passed to the Interpreter as the name of a sourced file. This can be used to define common methods and variables. There is a sample init file in the bin directory: BeanShellAssertion.bshrc
The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- bsh.args - String array containing parameters, split on white-space
There's a sample script you can try.
Before invoking the script, some variables are set up in the BeanShell interpreter. These are strings unless otherwise noted:
- log - the Logger Object. (e.g.) log.warn("Message"[,Throwable])
- SampleResult, prev - the SampleResult Object; read-write
- Response - the response Object; read-write
- Failure - boolean; read-write; used to set the Assertion status
- FailureMessage - String; read-write; used to set the Assertion message
- ResponseData - the response body (byte [])
- ResponseCode - e.g. 200
- ResponseMessage - e.g. OK
- ResponseHeaders - contains the HTTP headers
- RequestHeaders - contains the HTTP headers sent to the server
- SampleLabel
- SamplerData - data that was sent to the server
- ctx - JMeterContext
-
vars - JMeterVariables - e.g.
vars.get("VAR1"); vars.put("VAR2","value"); vars.putObject("OBJ1",new Object());
-
props - JMeterProperties (class java.util.Properties) - e.g.
props.get("START.HMS"); props.put("PROP1","1234");
The following methods of the Response object may be useful:
- setStopThread(boolean)
- setStopTest(boolean)
- String getSampleLabel()
- setSampleLabel(String)
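As a simple sketch using the variables above (the expected text "Welcome" is a hypothetical value), a BeanShell Assertion script could be:
String body = new String(ResponseData);
if (body.indexOf("Welcome") < 0) {
    Failure = true;                                                   // mark the assertion as failed
    FailureMessage = "Response body did not contain the expected text";
}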
MD5Hex Assertion¶
The MD5Hex Assertion allows the user to check the MD5 hash of the response data.
Parameters ¶
HTML Assertion¶
The HTML Assertion allows the user to check the HTML syntax of the response data using JTidy.
Parameters ¶
XPath Assertion¶
The XPath Assertion tests a document for well-formedness, with the option of validating it against a DTD, or of putting the document through JTidy, and then tests for an XPath. If that XPath exists, the Assertion is true. Using "/" will match any well-formed document, and is the default XPath Expression. The assertion also supports boolean expressions, such as "count(//*error)=2". See http://www.w3.org/TR/xpath for more information on XPath.
Some sample expressions:
- //title[text()='Text to match'] - matches <title>Text to match</title> anywhere in the response
- /title[text()='Text to match'] - matches <title>Text to match</title> at root level in the response
Parameters ¶
As a workaround for the namespace limitations of the Xalan XPath parser (the implementation on which JMeter is based) you need to:
-
provide a Properties file (if for example your file is named namespaces.properties) which contains mappings for the namespace prefixes:
prefix1=http\://foo.apache.org
prefix2=http\://toto.apache.org
…
-
reference this file in user.properties file using the property:
xpath.namespace.config=namespaces.properties
XPath2 Assertion¶
The XPath2 Assertion tests a document for well formedness. Using "/" will match any well-formed document, and is the default XPath2 Expression. The assertion also supports boolean expressions, such as "count(//*error)=2".
Some sample expressions:
- //title[text()='Text to match'] - matches <title>Text to match</title> anywhere in the response
- /title[text()='Text to match'] - matches <title>Text to match</title> at root level in the response
Parameters ¶
XML Schema Assertion¶
The XML Schema Assertion allows the user to validate a response against an XML Schema.
Parameters ¶
JSR223 Assertion¶
The JSR223 Assertion allows JSR223 script code to be used to check the status of the previous sample.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- args - String array containing parameters, split on white-space
The following variables are set up for use by the script:
- log - (Logger) - can be used to write to the log file
- Label - the String Label
- Filename - the script file name (if any)
- Parameters - the parameters (as a String)
- args - the parameters as a String array (split on whitespace)
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2");
-
props - (JMeterProperties - class java.util.Properties) - e.g.
props.get("START.HMS"); props.put("PROP1","1234");
- SampleResult, prev - (SampleResult) - gives access to the previous SampleResult (if any)
- sampler - (Sampler) - gives access to the current sampler
- OUT - System.out - e.g. OUT.println("message")
- AssertionResult - (AssertionResult) - the assertion result
The script can check various aspects of the SampleResult. If an error is detected, the script should use AssertionResult.setFailureMessage("message") and AssertionResult.setFailure(true).
For further details of all the methods available on each of the above variables, please check the Javadoc
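For example, a minimal JSR223 Assertion script (Java-style syntax, which is also valid Groovy; the expected fragment is a hypothetical value) might be:
String body = prev.getResponseDataAsString();
if (!body.contains("\"status\":\"OK\"")) {
    AssertionResult.setFailure(true);                                      // mark the assertion as failed
    AssertionResult.setFailureMessage("Expected status OK not found in response");
}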
Compare Assertion¶
Parameters ¶
SMIME Assertion¶
The SMIME Assertion requires the following BouncyCastle jars to be available to JMeter (e.g. in the lib directory):
- bcmail-xxx.jar (BouncyCastle SMIME/CMS)
- bcprov-xxx.jar (BouncyCastle Provider)
If using the Mail Reader Sampler, please ensure that you select "Store the message using MIME (raw)" otherwise the Assertion won't be able to process the message correctly.
Parameters ¶
JSON Assertion¶
This component allows you to perform validations of JSON documents. First, it will parse the JSON and fail if the data is not valid JSON. Second, it will search for the specified path, using the syntax from Jayway JsonPath 1.2.0; if the path is not found, it will fail. Third, if the JSON path was found in the document and validation against an expected value was requested, it will perform that validation. For a null value there is a special checkbox in the GUI. Note that if the path returns an array, it will be iterated over, and the assertion will succeed if the expected value is found. To validate an empty array, use the string []. Also, if the path returns a dictionary object, it will be converted to a string before comparison.
Since JMeter 5.5, the assertion will fail if an indefinite path is given, an empty list is extracted, and no assertion value is set.
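A worked example with hypothetical data: against the body {"status":"OK","errors":[]}, the JSON Path $.status with expected value OK passes, $.errors with expected value [] passes (empty array), and $.missing fails because the path does not exist.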
Parameters ¶
JSON JMESPath Assertion¶
This component allows you to perform assertion on JSON documents content using JMESPath.
First, it will parse the JSON and fail if the data is not JSON.
Second, it will search for the specified path, using JMESPath syntax.
If the path is not found, it will fail.
Third, if JMES path was found in the document, and validation against expected value was requested, it will perform this additional check.
If you want to check for nullity, use the Expect null checkbox.
Note that the path cannot be null, as the JMESPath expression would then not be compiled and an error would occur.
Even if you expect an empty or null response, you must put a valid JMESPath expression.
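A worked example with hypothetical data: against the body {"people":[{"name":"a"},{"name":"b"}]}, the JMESPath expression people[0].name with expected value a passes, while people[*].name extracts ["a","b"].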
Parameters ¶
18.6 Timers¶
You can apply a multiplication factor to the sleep delays computed by the random timers by setting the property timer.factor=<float>, where <float> is a positive decimal number.
JMeter will multiply the computed sleep delay by this factor. The feature applies to the random timers (Gaussian, Poisson and Uniform Random Timers).
Timers are only processed in conjunction with a sampler. A timer which is not in the same scope as a sampler will not be processed at all.
To apply a timer to a single sampler, add the timer as a child element of the sampler. The timer will be applied before the sampler is executed. To apply a timer after a sampler, either add it to the next sampler, or add it as the child of a Flow Control Action Sampler.
Constant Timer¶
If you want to have each thread pause for the same amount of time between requests, use this timer.
Parameters ¶
Gaussian Random Timer¶
This timer pauses each thread request for a random amount of time, with most of the time intervals occurring near a particular value. The total delay is the sum of the Gaussian-distributed value (with mean 0.0 and standard deviation 1.0) times the deviation you specify, and the offset value. Put another way, in the Gaussian Random Timer the variation around the constant offset follows a Gaussian (normal) distribution.
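As a rough sketch of the computation (illustrative only, not the actual implementation), assuming a deviation of 100 ms and a constant delay offset of 300 ms:
import java.util.Random;
long deviation = 100;   // ms, the "Deviation" field
long offset = 300;      // ms, the "Constant Delay Offset" field
long delay = (long) (new Random().nextGaussian() * deviation + offset);   // most delays fall near 300 ms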
Parameters ¶
Uniform Random Timer¶
This timer pauses each thread request for a random amount of time, with each time interval having the same probability of occurring. The total delay is the sum of the random value and the offset value.
Parameters ¶
Constant Throughput Timer¶
This timer introduces variable pauses, calculated to keep the total throughput (in terms of samples per minute) as close as possible to a given figure. Of course the throughput will be lower if the server is not capable of handling it, or if other timers or time-consuming test elements prevent it.
N.B. although the Timer is called the Constant Throughput timer, the throughput value does not need to be constant. It can be defined in terms of a variable or function call, and the value can be changed during a test. The value can be changed in various ways:
- using a counter variable
- using a __jexl3, __groovy function to provide a changing value
- using the remote BeanShell server to change a JMeter property
See Best Practices for further details.
Parameters ¶
- this thread only - each thread will try to maintain the target throughput. The overall throughput will be proportional to the number of active threads.
- all active threads in current thread group - the target throughput is divided amongst all the active threads in the group. Each thread will delay as needed, based on when it last ran.
- all active threads - the target throughput is divided amongst all the active threads in all Thread Groups. Each thread will delay as needed, based on when it last ran. In this case, each other Thread Group will need a Constant Throughput timer with the same settings.
- all active threads in current thread group (shared) - as above, but each thread is delayed based on when any thread in the group last ran.
- all active threads (shared) - as above; each thread is delayed based on when any thread last ran.
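For example, with a target of 60.0 samples per minute: in this thread only mode, four active threads together produce roughly 240 samples per minute, whereas in the all active threads modes the same four threads share the 60 samples per minute between them.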
The shared and non-shared algorithms both aim to generate the desired throughput, and will produce similar results.
The shared algorithm should generate a more accurate overall transaction rate.
The non-shared algorithm should generate a more even spread of transactions across threads.
Precise Throughput Timer¶
This timer introduces variable pauses, calculated to keep the total throughput (e.g. in terms of samples per minute) as close as possible to a given figure. The timer does not create threads, so the resulting throughput will be lower if the server cannot handle the load, if other timers add too-long delays, if there are not enough threads, or if time-consuming test elements prevent it.
Although the Timer is called Precise Throughput Timer, it does not aim to produce precisely the same number of samples over one-second intervals during the test.
The timer works best for rates under 36000 requests/hour; however, your mileage might vary (see the monitoring section below if your goals are vastly different).
Best location of a Precise Throughput Timer in a Test Plan
As you might know, timers are inherited by all the siblings and their child elements. That is why one of the best places for a Precise Throughput Timer is under the first element in a test loop. For instance, you might add a dummy sampler at the beginning and place the timer under that dummy sampler.
Produced schedule
Precise Throughput Timer models a Poisson arrival schedule. Such schedules often occur in real life, so it makes sense to use one for load testing. For instance, a Poisson schedule naturally generates samples that are close together, which may reveal concurrency issues. Even if you managed to generate Poisson arrivals with the Poisson Random Timer, it would be susceptible to the issues listed below: for instance, true Poisson arrivals might include an indefinitely long pause, which is not practical for load testing, and "regular" Poisson arrivals at a rate of 1 per second might end up with only 50 samples over a 60-second test.
Constant Throughput Timer converges to the specified rate, however it tends to produce samples at even intervals.
Ramp-up and startup spike
You might use "ramp-up" or similar approaches to avoid a spike at the test start. For instance, if you configure a Thread Group to have 100 threads and set the Ramp-up Period to 0 (or to a small number), then all the threads start at the same time, producing an unwanted load spike. On the other hand, if you set the Ramp-up Period too high, there might be too few threads available at the very beginning to achieve the required load.
Precise Throughput Timer schedules executions in a random way, so it can be used to generate constant load, and it is recommended to set both Ramp-up Period and Delay to 0.
Multiple thread groups starting at the same time
A variation of the ramp-up issue might appear when the Test Plan includes multiple Thread Groups. To mitigate it, one typically adds a "random" delay to each Thread Group so that threads start at different times.
Precise Throughput Timer avoids that issue since it schedules executions in a random way, so you do not need to add extra random delays to mitigate a startup spike.
Number of iterations per hour
One of the basic requirements is to issue N samples per M minutes. Let it be 60 iterations per hour. Business customers would not understand if you reported load test results with 57 executions "just because the random was random". In order to generate 60 iterations per hour, configure as follows (other parameters can be left at their default values):
- Target throughput (samples): 60
- Throughput period (seconds): 3600
- Test duration (seconds): 3600
The first two options set the throughput. Even though 60/3600, 30/1800, and 120/7200 represent exactly the same load level, pick the one that best represents the business requirement. For instance, if the requirement is to test for "60 samples per hour", then set 60/3600. If the requirement is to test "1 sample per minute", then set 1/60.
Test duration (seconds) is there so the timer can ensure the exact number of samples for a given test duration. Precise Throughput Timer creates a schedule for the samples at test startup. For instance, if you wish to perform a 5-minute test with a throughput of 60 per hour, you would set Test duration (seconds) to 300. This makes it possible to configure throughput in a business-friendly way. Note: Test duration (seconds) does not limit the test duration; it is just a hint for the timer.
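As a worked example, the expected number of samples is Test duration × Target throughput / Throughput period; with a Target throughput of 60, a Throughput period of 3600 and a Test duration of 300, the timer schedules 300 × 60 / 3600 = 5 samples for that window.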
Number of threads and think times
One of the common pitfalls is to adjust the number of threads and the think times in order to end up with the desired throughput. Even though it might work, that approach results in a lot of time spent on test runs, and it might require adjusting threads and delays again when a new application version arrives.
Precise Throughput Timer lets you set a throughput goal and pursue it regardless of how well the application performs. In order to do that, Precise Throughput Timer creates a schedule at test startup and then uses that schedule to release threads. The main driver for the think times and number of threads should be the business requirements, not the desire to match a throughput somehow.
For instance, suppose your application is used by support engineers in a call center, there are 2 engineers in the call center, and the target throughput is 1 per minute. Suppose it takes 4 minutes for an engineer to read and review the web page. In that case you should set 2 threads in the group, use 4 minutes for the think-time delays, and specify 1 per minute in Precise Throughput Timer. Of course this would result in something around 2 samples / 4 minutes = 0.5 per minute, and the result of such a test means "you need more support engineers in the call center" or "you need to reduce the time it takes an engineer to fulfil a task".
Testing low rates and repeatable tests
Testing at low rates (e.g. 60 per hour) requires knowing the desired test profile. For instance, if you need to inject load at even intervals (e.g. 60 seconds apart) then you are better off with the Constant Throughput Timer. However, if you need a randomized schedule (e.g. to model real users running reports), then Precise Throughput Timer is your friend.
When comparing outcomes of multiple load tests, it is useful to be able to repeat exactly the same test profile. For instance, if action X (e.g. "Profit Report") is invoked after 5 minutes of the test start, then it would be nice to replicate that pattern for subsequent test executions. Replicating the same load pattern simplifies analysis of the test results (e.g. CPU% chart).
Random seed (change from 0 to random) controls the seed value used by Precise Throughput Timer. By default it is initialized to 0, which means a random seed is used for each test execution. If you need a repeatable load pattern, change Random seed to some non-zero value. The general advice is to use a non-zero seed; the convention that "0 means random" is an implementation limitation.
Note: when using multiple thread groups with the same throughput rates and the same non-zero seed, the samples might fire at the same time, which is usually unwanted.
Testing high rates and/or long test durations
Precise Throughput Timer generates the schedule and keeps it in memory. In most cases this should not be a problem; however, remember that you might want to keep the schedule shorter than 1'000'000 samples. It takes ~200 ms to generate a schedule for 1'000'000 samples, and the schedule consumes 8 megabytes of heap. A schedule for 10 million entries takes 1-2 seconds to build and consumes 80 megabytes of heap.
For instance, if you want to perform a 2-week long test at a rate of 5'000 per hour, then you probably want exactly 5'000 samples for each hour. You can set the timer's Test duration (seconds) property to 1 hour. The timer would then create a schedule of 5'000 samples for an hour, and when that schedule is exhausted, it would generate a schedule for the next hour.
At the same time, you could set Test duration (seconds) to 2 weeks, and the timer would generate a schedule with 1'680'000 samples = 2 weeks * 5'000 samples/hour = 2*7*24*5'000. Based on the figures above, such a schedule would take a few hundred milliseconds to generate and consume roughly 13 megabytes of heap.
Bursty load
There might be cases when all the samples should come in pairs, triples, etc. Certain cases can be solved via the Synchronizing Timer; however, Precise Throughput Timer has a native way to issue requests in packs. This behavior is disabled by default, and it is controlled with the "Batched departures" settings:
- Number of threads in the batch (threads). Specifies the number of samples in a batch. Note the overall number of samples will still be in line with Target Throughput
- Delay between threads in the batch (ms). For instance, if set to 42, and the batch size is 3, then threads will depart at x, x+42ms, x+84ms
Variable load rate
Even though property values (e.g. throughput) can be defined via expressions, it is recommended to keep the value more or less the same throughout the test, as it takes time to recompute the new schedule to adapt to new values.
Monitoring
As the next schedule is generated, Precise Throughput Timer logs a message to jmeter.log:
2018-01-04 17:34:03,635 INFO o.a.j.t.ConstantPoissonProcessGenerator: Generated 21 timings (... 20 required, rate 1.0, duration 20, exact lim 20000, i21) in 0 ms. First 15 events will be fired at: 1.1869653574244292 (+1.1869653574244292), 1.4691340403043207 (+0.2821686828798915), 3.638151706179226 (+2.169017665874905), 3.836357090410566 (+0.19820538423134026), 4.709330071408575 (+0.8729729809980085), 5.61330076999953 (+0.903970698590955), ...
This shows that schedule generation took 0 ms, and it shows absolute timestamps in seconds. In the case above, the rate was set to 1 per second, and the actual timestamps became 1.2 sec, 1.5 sec, 3.6 sec, 3.8 sec, 4.7 sec, and so on.
Parameters ¶
Synchronizing Timer¶
The purpose of the SyncTimer is to block threads until X number of threads have been blocked, and then they are all released at once. A SyncTimer can thus create large instant loads at various points of the test plan.
Parameters ¶
BeanShell Timer¶
The BeanShell Timer can be used to generate a delay.
For full details on using BeanShell, please see the BeanShell website.
The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- bsh.args - String array containing parameters, split on white-space
Before invoking the script, some variables are set up in the BeanShell interpreter:
- log - (Logger) - can be used to write to the log file
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
- props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- prev - (SampleResult) - gives access to the previous SampleResult (if any)
For details of all the methods available on each of the above variables, please check the Javadoc
If the property beanshell.timer.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.
JSR223 Timer¶
The JSR223 Timer can be used to generate a delay using a JSR223 scripting language.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- args - String array containing parameters, split on white-space
Before invoking the script, some variables are set up in the script interpreter:
- log - (Logger) - can be used to write to the log file
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
- props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- sampler - (Sampler) - the current Sampler
- Label - the name of the Timer
- FileName - the file name (if any)
- OUT - System.out
For details of all the methods available on each of the above variables, please check the Javadoc
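For example, a minimal JSR223 timer script (Java-style syntax, also valid Groovy) returning a random delay between 100 and 400 ms (hypothetical values); the value returned by the script is used as the delay in milliseconds:
// return the delay to apply, in milliseconds
return 100 + new java.util.Random().nextInt(301);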
Poisson Random Timer¶
This timer pauses each thread request for a random amount of time, with most of the time intervals occurring near a particular value. The total delay is the sum of the Poisson distributed value, and the offset value.
Note: if you want to model Poisson arrivals, consider using Precise Throughput Timer instead.
Parameters ¶
18.7 Pre Processors¶
Preprocessors are used to modify the Samplers in their scope.
HTML Link Parser¶
This modifier parses HTML response from the server and extracts links and forms. A URL test sample that passes through this modifier will be examined to see if it "matches" any of the links or forms extracted from the immediately previous response. It would then replace the values in the URL test sample with appropriate values from the matching link or form. Perl-type regular expressions are used to find matches.
Consider a simple example: let's say you wanted JMeter to "spider" through your site, hitting link after link parsed from the HTML returned from your server (this is not actually the most useful thing to do, but it serves as a good example). You would create a Simple Controller, and add the "HTML Link Parser" to it. Then, create an HTTP Request, and set the domain to ".*", and the path likewise. This will cause your test sample to match with any link found on the returned pages. If you wanted to restrict the spidering to a particular domain, then change the domain value to the one you want. Then, only links to that domain will be followed.
A more useful example: given a web polling application, you might have a page with several poll options as radio buttons for the user to select. Let's say the values of the poll options are very dynamic - maybe user generated. If you wanted JMeter to test the poll, you could either create test samples with hardcoded values chosen, or you could let the HTML Link Parser parse the form, and insert a random poll option into your URL test sample. To do this, follow the above example, except, when configuring your Web Test controller's URL options, be sure to choose "POST" as the method. Put in hard-coded values for the domain, path, and any additional form parameters. Then, for the actual radio button parameter, put in the name (let's say it's called "poll_choice"), and then ".*" for the value of that parameter. When the modifier examines this URL test sample, it will find that it "matches" the poll form (and it shouldn't match any other form, given that you've specified all the other aspects of the URL test sample), and it will replace your form parameters with the matching parameters from the form. Since the regular expression ".*" will match with anything, the modifier will probably have a list of radio buttons to choose from. It will choose at random, and replace the value in your URL test sample. Each time through the test, a new random value will be chosen.
HTTP URL Re-writing Modifier¶
This modifier works similarly to the HTML Link Parser, except it has a specific purpose for which it is easier to use than the HTML Link Parser, and more efficient. For web applications that use URL Re-writing to store session ids instead of cookies, this element can be attached at the ThreadGroup level, much like the HTTP Cookie Manager. Simply give it the name of the session id parameter, and it will find it on the page and add the argument to every request of that ThreadGroup.
Alternatively, this modifier can be attached to select requests and it will modify only them. Clever users will even determine that this modifier can be used to grab values that elude the HTML Link Parser.
Parameters ¶
User Parameters¶
Allows the user to specify values for User Variables specific to individual threads.
User Variables can also be specified in the Test Plan but not specific to individual threads. This panel allows you to specify a series of values for any User Variable. For each thread, the variable will be assigned one of the values from the series in sequence. If there are more threads than values, the values get re-used. For example, this can be used to assign a distinct user id to be used by each thread. User variables can be referenced in any field of any JMeter Component.
The variable is specified by clicking the Add Variable button at the bottom of the panel and filling in the variable name in the 'Name:' column. To add a new value to the series, click the 'Add User' button and fill in the desired value in the newly added column.
Values can be accessed in any test component in the same thread group, using the function syntax: ${variable}.
See also the CSV Data Set Config element, which is more suitable for large numbers of parameters
Parameters ¶
BeanShell PreProcessor¶
The BeanShell PreProcessor allows arbitrary code to be applied before taking a sample.
For full details on using BeanShell, please see the BeanShell website.
The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- bsh.args - String array containing parameters, split on white-space
Before invoking the script, some variables are set up in the BeanShell interpreter:
- log - (Logger) - can be used to write to the log file
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
- props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- prev - (SampleResult) - gives access to the previous SampleResult (if any)
- sampler - (Sampler)- gives access to the current sampler
For details of all the methods available on each of the above variables, please check the Javadoc
If the property beanshell.preprocessor.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.
JSR223 PreProcessor¶
The JSR223 PreProcessor allows JSR223 script code to be applied before taking a sample.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- args - String array containing parameters, split on white-space
The following JSR223 variables are set up for use by the script:
- log - (Logger) - can be used to write to the log file
- Label - the String Label
- FileName - the script file name (if any)
- Parameters - the parameters (as a String)
- args - the parameters as a String array (split on whitespace)
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2");
- props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- sampler - (Sampler)- gives access to the current sampler
- OUT - System.out - e.g. OUT.println("message")
For details of all the methods available on each of the above variables, please check the Javadoc
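For example, a small JSR223 PreProcessor script (Java-style syntax, also valid Groovy) can prepare a variable for the sampler that follows - here a hypothetical ts variable holding a timestamp, which the sampler can then reference as ${ts}:
// store a value for use by the upcoming sampler
vars.put("ts", String.valueOf(System.currentTimeMillis()));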
JDBC PreProcessor¶
The JDBC PreProcessor enables you to run an SQL statement just before a sample runs. This can be useful if your JDBC Sample requires some data to be in the database and you cannot compute this in a setUp Thread Group. For details, see JDBC Request.
For example, in the following test plan a "Create Price Cut-Off" JDBC PreProcessor calls a stored procedure to create a Price Cut-Off in the database, which is then used by the "Calculate Price cut off" sampler.
RegEx User Parameters¶
Allows you to specify dynamic values for HTTP parameters extracted from another HTTP Request using regular expressions. RegEx User Parameters are specific to individual threads.
This component allows you to specify the reference name of a regular expression that extracts the names and values of HTTP request parameters. Regular expression group numbers must be specified for the parameter's name and for the parameter's value. Replacement will only occur for parameters in the Sampler using this RegEx User Parameters whose names match names extracted by the regular expression.
Parameters ¶
Suppose we have a request which returns a form with 3 input parameters, and we want to extract the values of 2 of them to inject into the next request.
-
Create a Regular Expression Extractor Post-Processor for the first HTTP Request:
- refName - set the reference name of the regular expression (listParams)
- regular expression - an expression that extracts the input names and input values attributes, e.g. input name="([^"]+?)" value="([^"]+?)"
- template - leave empty
- match nr - -1 (in order to iterate through all the possible matches)
-
Create a RegEx User Parameters Pre-Processor for the second HTTP Request:
- refName - set the same reference name of the regular expression, listParams in our example
- parameter names group number - group number of the regular expression for parameter names, 1 in our example
- parameter values group number - group number of the regular expression for parameter values, 2 in our example
See also the Regular Expression Extractor element, which is used to extract the parameter names and values.
Sample Timeout¶
This Pre-Processor schedules a timer task to interrupt a sample if it takes too long to complete.
The timeout is ignored if it is zero or negative.
For this to work, the sampler must implement Interruptible.
The following samplers are known to do so:
AJP, BeanShell, FTP, HTTP, Soap, AccessLog, MailReader, JMS Subscriber, TCPSampler, TestAction, JavaSampler
The test element is intended for use where individual timeouts such as Connection Timeout or Response Timeout are insufficient, or where the Sampler does not support timeouts. The timeout should be set sufficiently long so that it is not triggered in normal tests, but short enough that it interrupts samples that are stuck.
[By default, JMeter uses a Callable to interrupt the sampler. This executes in the same thread as the timer, so if the interrupt takes a long while, it may delay the processing of subsequent timeouts. This is not expected to be a problem, but if necessary the property InterruptTimer.useRunnable can be set to true to use a separate Runnable thread instead of the Callable.]
Parameters ¶
18.8 Post-Processors¶
As the name suggests, Post-Processors are applied after samplers. Note that they are applied to all the samplers in the same scope, so to ensure that a post-processor is applied only to a particular sampler, add it as a child of the sampler.
Post-Processors are run before Assertions, so they do not have access to any Assertion Results, nor will the sample status reflect the results of any Assertions. If you require access to Assertion Results, try using a Listener instead. Also note that the variable JMeterThread.last_sample_ok is set to "true" or "false" after all Assertions have been run.
Regular Expression Extractor¶
Allows the user to extract values from a server response using a Perl-type regular expression. As a post-processor, this element will execute after each Sample request in its scope, applying the regular expression, extracting the requested values, generating the template string, and storing the result in the given variable name.
Parameters ¶
- Main sample only - only applies to the main sample
- Sub-samples only - only applies to the sub-samples
- Main sample and sub-samples - applies to both.
- JMeter Variable Name to use - extraction is to be applied to the contents of the named variable
- Body - the body of the response, e.g. the content of a web-page (excluding headers)
-
Body (unescaped) - the body of the response, with all HTML escape codes replaced. Note that HTML escapes are processed without regard to context, so some incorrect substitutions may be made. Note also that this option has a high performance cost, so use it only when absolutely necessary and be aware of its impact.
-
Body as a Document - the text extracted from various types of documents via Apache Tika (see the View Results Tree Document view section). Note that the Body as a Document option can impact performance, so ensure it is acceptable for your test.
- Request Headers - may not be present for non-HTTP samples
- Response Headers - may not be present for non-HTTP samples
- URL
- Response Code - e.g. 200
- Response Message - e.g. OK
- Use a value of zero to indicate JMeter should choose a match at random.
- A positive number N means to select the nth match.
- Negative numbers are used in conjunction with the ForEach Controller - see below.
However, if you have several test elements that set the same variable, you may wish to leave the variable unchanged if the expression does not match. In this case, remove the default value once debugging is complete.
If the match number is set to a non-negative number, and a match occurs, the variables are set as follows:
- refName - the value of the template
- refName_gn, where n=0,1,2 - the groups for the match
- refName_g - the number of groups in the Regex (excluding 0)
If no match occurs, then the refName variable is set to the default (unless this is absent). Also, the following variables are removed:
- refName_g0
- refName_g1
- refName_g
If the match number is set to a negative number, then all the possible matches in the sampler data are processed. The variables are set as follows:
- refName_matchNr - the number of matches found; could be 0
- refName_n, where n = 1, 2, 3 etc. - the strings as generated by the template
- refName_n_gm, where m=0, 1, 2 - the groups for match n
- refName - always set to the default value
- refName_gn - not set
Note that the refName variable is always set to the default value in this case, and the associated group variables are not set.
See also Response Assertion for some examples of how to specify modifiers, and for further information on JMeter regular expressions.
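A worked example with hypothetical values: with reference name token, regular expression name="csrf" value="([A-Za-z0-9]+)", template $1$ and match number 1, a response containing name="csrf" value="abc123" sets token=abc123, token_g0 to the whole matched text and token_g1=abc123.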
CSS Selector Extractor (was: CSS/JQuery Extractor ) ¶
Allows the user to extract values from a server HTML response using a CSS Selector syntax. As a post-processor, this element will execute after each Sample request in its scope, applying the CSS/JQuery expression, extracting the requested nodes, extracting the node text or attribute value, and storing the result in the given variable name.
Parameters ¶
- Main sample only - only applies to the main sample
- Sub-samples only - only applies to the sub-samples
- Main sample and sub-samples - applies to both.
- JMeter Variable Name to use - extraction is to be applied to the contents of the named variable
- E[foo] - an E element with a "foo" attribute
- ancestor child - child elements that descend from ancestor, e.g. .body p finds p elements anywhere under a block with class "body"
- :lt(n) - find elements whose sibling index (i.e. its position in the DOM tree relative to its parent) is less than n; e.g. td:lt(3)
- :contains(text) - find elements that contain the given text. The search is case-insensitive; e.g. p:contains(jsoup)
- …
If an attribute is given, this is the equivalent of JSoup's Element#attr(name) function.
If empty, this is the equivalent of JSoup's Element#text() function, i.e. when no value is set for the attribute.
- Use a value of zero to indicate JMeter should choose a match at random.
- A positive number N means to select the nth match.
- Negative numbers are used in conjunction with the ForEach Controller - see below.
However, if you have several test elements that set the same variable, you may wish to leave the variable unchanged if the expression does not match. In this case, remove the default value once debugging is complete.
If the match number is set to a non-negative number, and a match occurs, the variables are set as follows:
- refName - the value of the template
If no match occurs, then the refName variable is set to the default (unless this is absent).
If the match number is set to a negative number, then all the possible matches in the sampler data are processed. The variables are set as follows:
- refName_matchNr - the number of matches found; could be 0
- refName_n, where n = 1, 2, 3, etc. - the strings as generated by the template
- refName - always set to the default value
Note that the refName variable is always set to the default value in this case.
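A worked example with hypothetical values: with reference name link, CSS Selector expression a.result, attribute href and match number -1, a page containing three matching anchors sets link_matchNr=3 and link_1, link_2, link_3 to the three href values.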
XPath2 Extractor¶
Parameters ¶
- Main sample only - only applies to the main sample
- Sub-samples only - only applies to the sub-samples
- Main sample and sub-samples - applies to both.
- JMeter Variable Name to use - extraction is to be applied to the contents of the named variable
For example //title would return "<title>Apache JMeter</title>" rather than "Apache JMeter".
In this case, //title/text() would return "Apache JMeter".
- 0: means random (default value)
- -1 means extract all results, they will be named as <variable name>_N (where N goes from 1 to Number of results)
- X: means extract the Xth result. If this Xth is greater than number of matches, then nothing is returned. Default value will be used
To allow for use in a ForEach Controller, it works exactly the same as the above XPath Extractor
The XPath2 Extractor provides some interesting improvements, such as a cleaner syntax and many more functions than its first version.
Here are some examples:
- abs(/book/page[2])
- returns the absolute value of the 2nd page element of a book
- avg(/librarie/book/page)
- returns the average of the page values across all the books in the library
- compare(/book[1]/page[2],/book[2]/page[2])
- returns the integer 0 if the 2nd page of the first book equals the 2nd page of the second book, otherwise -1.
For more information about these functions, please check the XPath2 functions documentation.
XPath Extractor¶
Parameters ¶
- Main sample only - only applies to the main sample
- Sub-samples only - only applies to the sub-samples
- Main sample and sub-samples - applies to both.
- JMeter Variable Name to use - extraction is to be applied to the contents of the named variable
- "Use Tidy" should be checked on for HTML response. Such response is converted to valid XHTML (XML compatible HTML) using Tidy
- "Use Tidy" should be unchecked for both XHTML or XML response (for example RSS)
For example //title would return "<title>Apache JMeter</title>" rather than "Apache JMeter".
In this case, //title/text() would return "Apache JMeter".
- 0: means random
- -1 means extract all results (default value), they will be named as <variable name>_N (where N goes from 1 to Number of results)
- X: means extract the Xth result. If this Xth is greater than number of matches, then nothing is returned. Default value will be used
To allow for use in a ForEach Controller, the following variables are set on return:
- refName - set to first (or only) match; if no match, then set to default
- refName_matchNr - set to number of matches (may be 0)
- refName_n - n=1, 2, 3, etc. Set to the 1st, 2nd 3rd match etc.
XPath is a query language targeted primarily at XSLT transformations. However, it is also useful as a generic query language for structured data. See the XPath Reference or the XPath specification for more information. Here are a few examples:
- /html/head/title
- extracts title element from HTML response
- /book/page[2]
- extracts 2nd page from a book
- /book/page
- extracts all pages from a book
- //form[@name='countryForm']//select[@name='country']/option[text()='Czech Republic']/@value
- extracts the value attribute of the option element whose text matches 'Czech Republic', inside the select element with name attribute 'country', inside the form with name attribute 'countryForm'
- All elements and attribute names are converted to lowercase
- Tidy attempts to correct improperly nested elements. For example - original (incorrect) ul/font/li becomes correct ul/li/font
As a workaround for the namespace limitations of the Xalan XPath parser (the implementation on which JMeter is based) you need to:
-
provide a Properties file (if for example your file is named namespaces.properties) which contains mappings for the namespace prefixes:
prefix1=http\://foo.apache.org
prefix2=http\://toto.apache.org
…
-
reference this file in user.properties file using the property:
xpath.namespace.config=namespaces.properties
This lets you use namespace prefixes in XPath expressions, e.g. //mynamespace:tagname, instead of //*[local-name()='tagname' and namespace-uri()='uri-for-namespace'], where uri-for-namespace is the namespace URI and mynamespace the prefix declared in the properties file.
JSON JMESPath Extractor¶
Parameters ¶
- Main sample only - only applies to the main sample
- Sub-samples only - only applies to the sub-samples
- Main sample and sub-samples - applies to both.
- JMeter Variable Name to use - extraction is to be applied to the contents of the named variable
- 0: means random
- -1 means extract all results (default value), they will be named as <variable name>_N (where N goes from 1 to Number of results)
- X: means extract the Xth result. If this Xth is greater than number of matches, then nothing is returned. Default value will be used
JMESPath is a query language for JSON. It is described in an ABNF grammar with a complete specification, so the language syntax is precisely defined. See the JMESPath Reference for more information, and the JMESPath Examples for worked examples.
Result Status Action Handler¶
Parameters ¶
- Continue - ignore the error and continue with the test
- Start next thread loop - does not execute samplers following the sampler in error for the current iteration and restarts the loop on next iteration
- Stop Thread - current thread exits
- Stop Test - the entire test is stopped at the end of any current samples.
- Stop Test Now - the entire test is stopped abruptly. Any current samplers are interrupted if possible.
BeanShell PostProcessor¶
The BeanShell PostProcessor allows arbitrary code to be applied after taking a sample.
BeanShell Post-Processor no longer ignores samples with zero-length result data
For full details on using BeanShell, please see the BeanShell website.
The test element supports the ThreadListener and TestListener methods. These should be defined in the initialisation file. See the file BeanShellListeners.bshrc for example definitions.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- bsh.args - String array containing parameters, split on white-space
The following BeanShell variables are set up for use by the script:
- log - (Logger) - can be used to write to the log file
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object());
- props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- prev - (SampleResult) - gives access to the previous SampleResult
- data - (byte [])- gives access to the current sample data
For details of all the methods available on each of the above variables, please check the Javadoc
If the property beanshell.postprocessor.init is defined, this is used to load an initialisation file, which can be used to define methods etc. for use in the BeanShell script.
JSR223 PostProcessor¶
The JSR223 PostProcessor allows JSR223 script code to be applied after taking a sample.
Parameters ¶
- Parameters - string containing the parameters as a single variable
- args - String array containing parameters, split on white-space
Before invoking the script, some variables are set up. Note that these are JSR223 variables - i.e. they can be used directly in the script.
- log - (Logger) - can be used to write to the log file
- Label - the String Label
- FileName - the script file name (if any)
- Parameters - the parameters (as a String)
- args - the parameters as a String array (split on whitespace)
- ctx - (JMeterContext) - gives access to the context
-
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key); vars.put(key,val); vars.putObject("OBJ1",new Object()); vars.getObject("OBJ2");
- props - (JMeterProperties - class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
- prev - (SampleResult) - gives access to the previous SampleResult (if any)
- sampler - (Sampler)- gives access to the current sampler
- OUT - System.out - e.g. OUT.println("message")
For details of all the methods available on each of the above variables, please check the Javadoc
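For example, a small JSR223 PostProcessor script (Java-style syntax, also valid Groovy) could capture details of the sample just taken - here storing the response code in a hypothetical lastCode variable:
// keep the response code of the previous sample for later use
vars.put("lastCode", prev.getResponseCode());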
JDBC PostProcessor¶
The JDBC PostProcessor enables you to run an SQL statement just after a sample has run. This can be useful if your JDBC Sample changes some data and you want to reset the state to what it was before the JDBC sample ran.
In the example test plan described under JDBC PreProcessor, a "JDBC PostProcessor" calls a stored procedure to delete from the database the Price Cut-Off that was created by the PreProcessor.
JSON Extractor¶
The JSON PostProcessor enables you to extract data from JSON responses using JSON-PATH syntax. This post-processor is very similar to the Regular Expression Extractor. It must be placed as a child of an HTTP Sampler or any other sampler that has responses. It makes extracting text content very easy; see JSON Path syntax.
Parameters ¶
- Main sample only
- only applies to the main sample
- Sub-samples only
- only applies to the sub-samples
- Main sample and sub-samples
- applies to both.
- JMeter Variable Name to use
- extraction is to be applied to the contents of the named variable
- Match No. (0 for Random)
- 0: means random (default value)
- -1: means extract all results; they will be named <variable name>_N (where N goes from 1 to the number of results)
- X: means extract the Xth result. If X is greater than the number of matches, nothing is returned and the default value will be used
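As a worked example (hypothetical response and names), suppose the response body is {"book": {"title": "Moby Dick", "tags": ["classic", "novel"]}}. With the variable name title, the JSON Path expression $.book.title and Match No. set to 1, the variable title is set to Moby Dick. With the variable name tag, the expression $.book.tags[*] and Match No. set to -1, the variables tag_1=classic, tag_2=novel and tag_matchNr=2 are created.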
Boundary Extractor¶
Allows the user to extract values from a server response using left and right boundaries. As a post-processor, this element will execute after each Sample request in its scope, testing the boundaries, extracting the requested values, generating the template string, and storing the result in the given variable name.
Parameters ¶
- Main sample only - only applies to the main sample
- Sub-samples only - only applies to the sub-samples
- Main sample and sub-samples - applies to both.
- JMeter Variable Name to use - extraction is to be applied to the contents of the named variable
- Body - the body of the response, e.g. the content of a web-page (excluding headers)
-
Body (unescaped) - the body of the response, with all HTML escape codes replaced.
Note that HTML escapes are processed without regard to context, so some incorrect substitutions may be made.
Note that this option has a high performance impact, so use it only when absolutely necessary and be aware of its impact
-
Body as a Document - the text extracted from various types of documents via Apache Tika (see the View Results Tree Document view section).
Note that the Body as a Document option can impact performance, so ensure it is acceptable for your test
- Request Headers - may not be present for non-HTTP samples
- Response Headers - may not be present for non-HTTP samples
- URL
- Response Code - e.g. 200
- Response Message - e.g. OK
- Use a value of zero to indicate JMeter should choose a match at random.
- A positive number N means to select the Nth match.
- Negative numbers are used in conjunction with the ForEach Controller - see below.
However, if you have several test elements that set the same variable, you may wish to leave the variable unchanged if the expression does not match. In this case, remove the default value once debugging is complete.
If the match number is set to a non-negative number, and a match occurs, the variables are set as follows:
- refName - the value of the extraction
If no match occurs, then the refName variable is set to the default (unless this is absent).
If the match number is set to a negative number, then all the possible matches in the sampler data are processed. The variables are set as follows:
- refName_matchNr - the number of matches found; could be 0
- refName_n, where n = 1, 2, 3 etc. - the strings as generated by the template
- refName_n_gm, where m=0, 1, 2 - the groups for match n
- refName - always set to the default value
Note that the refName variable is always set to the default value in this case, and the associated group variables are not set.
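As a worked example (hypothetical values), suppose the response body contains title=Moby Dick; and the extractor is configured with left boundary title=, right boundary ; and reference name title. With Match No. set to 1, the variable title is set to Moby Dick. With Match No. set to -1, the variables title_matchNr=1 and title_1=Moby Dick are created, and title itself keeps the default value.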
18.9 Miscellaneous Features¶
Test Plan¶
The Test Plan is where the overall settings for a test are specified.
Static variables can be defined for values that are repeated throughout a test, such as server names. For example the variable SERVER could be defined as www.example.com, and the rest of the test plan could refer to it as ${SERVER}. This simplifies changing the name later.
If the same variable name is reused in one or more User Defined Variables Configuration elements, the value is set to the last definition in the test plan (reading from top to bottom). Such variables should be used for items that may change between test runs, but which remain the same during a test run.
Selecting Functional Testing instructs JMeter to save the additional sample information - Response Data and Sampler Data - to all result files. This increases the resources needed to run a test, and may adversely impact JMeter performance. If more data is required for a particular sampler only, then add a Listener to it, and configure the fields as required.
Also, an option exists here to instruct JMeter to run the Thread Group serially rather than in parallel.
Run tearDown Thread Groups after shutdown of main threads: if selected, the tearDown groups (if any) will be run after graceful shutdown of the main threads. The tearDown threads won't be run if the test is forcibly stopped.
Test plan now provides an easy way to add classpath setting to a specific test plan. The feature is additive, meaning that you can add jar files or directories, but removing an entry requires restarting JMeter.
The JMeter properties file also provides entries for loading additional classpaths. In jmeter.properties, edit "user.classpath" or "plugin_dependency_paths" to include additional libraries. See JMeter's Classpath and Configuring JMeter for details.
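For example (illustrative paths), the following entry in user.properties adds a directory and a jar file to the classpath:
user.classpath=../classes;../extra-libs/myutils.jar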
Open Model Thread Group¶
Open Model Thread Group defines a pool of users that will execute a particular test case against the server. The users are generated according to the schedule.
The load profile consists of a sequence of constant, increasing or decreasing load. The basic configuration is rate(1/sec) random_arrivals(2 min) rate(3/sec), which means the load will increase linearly from one request per second to three requests per second over a period of two minutes. If you omit the rate at the end, it will be set to the same value as the one at the start. For example, rate(1/sec) random_arrivals(2 min) is exactly the same as rate(1/sec) random_arrivals(2 min) rate(1/sec). That is why rate(1/sec) random_arrivals(2 min) random_arrivals(3 min) rate(4/sec) is exactly the same as rate(1/sec) random_arrivals(2 min) rate(1/sec) random_arrivals(3 min) rate(4/sec): the load is one request per second during the first two minutes, after which it increases linearly from one request per second to four requests per second during three minutes.
Here are examples for using the schedule:
- rate(10/sec) random_arrivals(1 min) rate(10/sec)
- constant load rate of ten requests per second during one minute
- rate(0) random_arrivals(1 min) rate(10/sec)
- linearly increase the load from zero requests per second to ten requests per second during one minute
- rate(0) random_arrivals(1 min) rate(10/sec) random_arrivals(1 min) rate(10/sec) random_arrivals(1 min) rate(0)
- linearly increase the load from zero requests per second to ten requests per second during one minute, then hold the load during one minute, then linearly decrease the load from ten requests per second to zero during one minute
- rate(10) random_arrivals(1 min) rate(10/sec) random_arrivals(1 min) rate(10/sec) random_arrivals(1 min) rate(0)
- hold a constant load of ten requests per second for two minutes, then linearly decrease the load from ten requests per second to zero requests per second during one minute
- rate(10) random_arrivals(1 min) pause(2 sec) random_arrivals(1 min)
- run with a constant load of ten requests per second for one minute, then pause for two seconds, then resume the load of ten requests per second for one minute
The following commands are available:
- rate(<number>/sec)
- configures the target load rate. The following time units are supported: ms, sec, min, hour, day. You can omit the time unit when the rate is 0: rate(0)
- random_arrivals(<number> sec)
-
configures random arrivals schedule with the given duration.
The starting load rate is configured before random_arrivals, and the finish load rate is configured after random_arrivals.
For example, a 10 minute test from five requests per second at the beginning to fifteen requests per second at the end could be configured as rate(5/sec) random_arrivals(10 min) rate(15/sec).
The implicit rate at the beginning of the test is 0. If the finish rate is not provided (or if several random_arrivals steps go one after another), then the load is constant. For instance, rate(3/sec) random_arrivals(1 min) random_arrivals(2 min) rate(6/sec) configures a constant rate of three requests per second for the first minute, and then the load increases from three requests per second to six requests per second during the next two minutes. The time units are the same as in rate.
- even_arrivals(<number> sec)
- configures even arrivals (TODO: not implemented yet). For instance, if the desired load is one request per second, then even_arrivals would launch samples at exactly one-second intervals.
- pause(<number> sec)
-
configures a pause for the given duration.
The rate is restored after the pause, so rate(2/sec) random_arrivals(5 sec) pause(5 sec) random_arrivals(5 sec)
generates random arrivals with two requests per second rate, then a pause for five seconds (no new arrivals), then five more seconds with two requests per second rate.
Note: pause duration is always honoured, even if all the scenarios are complete, and no new ones will be scheduled. For instance, if you use rate(1/sec) random_arrivals(1 min) pause(1 hour), the thread group would always last sixty-one minutes no matter how long the individual scenarios take.
- /* Comments */
- can be used to clarify the schedule or temporarily disable some steps. Comments cannot be nested.
- // line comments
- can be used to clarify the schedule or temporarily disable some steps. A line comment lasts until the end of the line.
The thread group terminates threads as soon as the schedule ends. In other words, the threads are interrupted after all arrivals and pause intervals. If you want to let the threads complete safely, consider adding pause(5 min) at the end of the schedule. That gives the threads five extra minutes to finish.
There are no special functions for generating the load profile in a loop; however, the default JMeter templating functions can be helpful for generating the schedule.
For example, the following pattern would generate a sequence of 10 steps where each step lasts 10 seconds: 10/sec, 20/sec, 30/sec, ... ${__groovy((1..10).collect { "rate(" + it*10 + "/sec) random_arrivals(10 sec) pause(1 sec)" }.join(" "))} You can get variables from properties as follows: rate(${__P(beginRate,40)}) random_arrivals(${__P(testDuration, 10)} sec) rate(${__P(endRate,40)})
Currently, the load profile is evaluated at the beginning of the test only, so if you use dynamic functions, then only the first result will be used.
Parameters ¶
Thread Group¶
A Thread Group defines a pool of users that will execute a particular test case against your server. In the Thread Group GUI, you can control the number of users simulated (number of threads), the ramp up time (how long it takes to start all the threads), the number of times to perform the test, and optionally, a start and stop time for the test.
See also tearDown Thread Group and setUp Thread Group.
When using the scheduler, JMeter runs the thread group until either the number of loops is reached or the duration/end-time is reached - whichever occurs first. Note that the condition is only checked between samples; when the end condition is reached, that thread will stop. JMeter does not interrupt samplers which are waiting for a response, so the end time may be delayed arbitrarily.
Since JMeter 3.0, you can run a selection of Thread Groups by selecting them and right-clicking. A popup menu will appear:
Notice you have three options to run the selection of Thread Groups:
- Start
- Start the selected thread groups only
- Start no pauses
- Start the selected thread groups only but without running the timers
- Validate
- Start the selected thread groups only, using validation mode. By default this runs the Thread Group in validation mode (see below)
This mode enables rapid validation of a Thread Group by running it with one thread, one iteration, no timers, and the startup delay set to 0. The behaviour can be modified with the following properties set in user.properties:
- testplan_validation.nb_threads_per_thread_group
- Number of threads to use to validate a Thread Group, by default 1
- testplan_validation.ignore_timers
- Ignore timers when validating the thread group of the plan, by default 1
- testplan_validation.number_iterations
- Number of iterations to use to validate a Thread Group
- testplan_validation.tpc_force_100_pct
- Whether to force Throughput Controllers in percentage mode to run as if the percentage were 100%. Defaults to false
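For example (illustrative values), adding the following lines to user.properties makes validation mode run each Thread Group with two threads and two iterations:
testplan_validation.nb_threads_per_thread_group=2
testplan_validation.number_iterations=2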
Parameters ¶
- Continue - ignore the error and continue with the test
- Start Next Thread Loop - ignore the error, start next loop and continue with the test
- Stop Thread - current thread exits
- Stop Test - the entire test is stopped at the end of any current samples.
- Stop Test Now - the entire test is stopped abruptly. Any current samplers are interrupted if possible.
If not selected, cookie and cache data from the first sampler response are not used in subsequent requests.
If not selected, all threads are created when the test starts (they then pause for the appropriate proportion of the ramp-up time). This is the original default, and is appropriate for tests where threads are active throughout most of the test.
SSL Manager¶
The SSL Manager is a way to select a client certificate so that you can test applications that use Public Key Infrastructure (PKI). It is only needed if you have not set up the appropriate System properties.
You may either use a Java Key Store (JKS) format key store, or a Public Key Certificate Standard #12 (PKCS12) file for your client certificates. There is a feature of the JSSE libraries that requires you to have at least a six character password on your key (at least for the keytool utility that comes with your JDK).
To select the client certificate, choose the SSL Manager entry from the menu bar. You will be presented with a file finder that looks for PKCS12 files by default. Your PKCS12 file must have the extension '.p12' for SSL Manager to recognize it as a PKCS12 file. Any other file will be treated like an average JKS key store. If JSSE is correctly installed, you will be prompted for the password. The text box does not hide the characters you type at this point -- so make sure no one is looking over your shoulder. The current implementation assumes that the password for the keystore is also the password for the private key of the client you want to authenticate as.
Or you can set the appropriate System properties - see the system.properties file.
The next time you run your test, the SSL Manager will examine your key store to see if it has at least one key available to it. If there is only one key, SSL Manager will select it for you. If there is more than one key, it currently selects the first key. There is currently no way to select other entries in the keystore, so the desired key must be the first.
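As noted above, the keystore can instead be configured through the standard JSSE System properties in system.properties; for example (illustrative path and password):
javax.net.ssl.keyStore=/path/to/client-certs.p12
javax.net.ssl.keyStoreType=pkcs12
javax.net.ssl.keyStorePassword=changeit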
Things to Look Out For
You must have your Certificate Authority (CA) certificate installed properly if it is not signed by one of the five CA certificates that ship with your JDK. One method to install it is to import your CA certificate into a JKS file, and name the JKS file "jssecacerts". Place the file in your JRE's lib/security folder. This file will be read before the "cacerts" file in the same directory. Keep in mind that as long as the "jssecacerts" file exists, the certificates installed in "cacerts" will not be used. This may cause problems for you. If you don't mind importing your CA certificate into the "cacerts" file, then you can authenticate against all of the CA certificates installed.
HTTP(S) Test Script Recorder (was: HTTP Proxy Server ) ¶
The HTTP(S) Test Script Recorder allows JMeter to intercept and record your actions while you browse your web application
with your normal browser. JMeter will create test sample objects and store them
directly into your test plan as you go (so you can view samples interactively while you make them).
Ensure you read this wiki page to set up JMeter correctly.
To use the recorder, add the HTTP(S) Test Script Recorder element: right-click on the Test Plan element to get the Add menu. The recorder is implemented as an HTTP(S) proxy server. You need to set up your browser to use the proxy for all HTTP and HTTPS requests.
Ideally use private browsing mode when recording the session. This should ensure that the browser starts with no stored cookies, and prevents certain changes from being saved. For example, Firefox does not allow certificate overrides to be saved permanently.
HTTPS recording and certificates
HTTPS connections use certificates to authenticate the connection between the browser and the web server. When connecting via HTTPS, the server presents the certificate to the browser. To authenticate the certificate, the browser checks that the server certificate is signed by a Certificate Authority (CA) that is linked to one of its in-built root CAs.
JMeter needs to use its own certificate to enable it to intercept the HTTPS connection from the browser. Effectively JMeter has to pretend to be the target server.
JMeter will generate its own certificate(s). These are generated with a validity period defined by the property proxy.cert.validity, default 7 days, and random passwords. If JMeter detects that it is running under Java 8 or later, it will generate certificates for each target server as necessary (dynamic mode) unless the following property is defined: proxy.cert.dynamic_keys=false. When using dynamic mode, the certificate will be for the correct host name, and will be signed by a JMeter-generated CA certificate. By default, this CA certificate won't be trusted by the browser, however it can be installed as a trusted certificate. Once this is done, the generated server certificates will be accepted by the browser. This has the advantage that even embedded HTTPS resources can be intercepted, and there is no need to override the browser checks for each new server.
Unless a keystore is provided (and you define the property proxy.cert.alias), JMeter needs to use the keytool application to create the keystore entries. JMeter includes code to check that keytool is available by looking in various standard places. If JMeter is unable to find the keytool application, it will report an error. If necessary, the system property keytool.directory can be used to tell JMeter where to find keytool. This should be defined in the file system.properties.
The JMeter certificates are generated (if necessary) when the Start button is pressed.
If necessary, you can force JMeter to regenerate the keystore (and the exported certificates - ApacheJMeterTemporaryRootCA[.usr|.crt]) by deleting the keystore file proxyserver.jks from the JMeter directory.
This certificate is not one of the certificates that browsers normally trust, and will not be for the
correct host.
As a consequence:
-
The browser should display a dialogue asking if you want to accept the certificate or not. For example:
1) The server's name "www.example.com" does not match the certificate's name "_ JMeter Root CA for recording (INSTALL ONLY IF IT S YOURS)". Somebody may be trying to eavesdrop on you. 2) The certificate for "_ JMeter Root CA for recording (INSTALL ONLY IF IT S YOURS)" is signed by the unknown Certificate Authority "_ JMeter Root CA for recording (INSTALL ONLY IF IT S YOURS)". It is not possible to verify that this is a valid certificate.
You will need to accept the certificate in order to allow the JMeter Proxy to intercept the SSL traffic and record it. However, do not accept this certificate permanently; it should only be accepted temporarily. Browsers only prompt this dialogue for the certificate of the main URL, not for the resources loaded in the page, such as images, CSS or JavaScript files hosted on a secured external CDN. If you have such resources (gmail has some, for example), you'll have to first browse manually to these other domains in order to accept JMeter's certificate for them. Check jmeter.log for the secure domains that you need to register the certificate for.
- If the browser has already registered a validated certificate for this domain, the browser will detect JMeter as a security breach and will refuse to load the page. If so, you have to remove the trusted certificate from your browser's keystore.
Versions of JMeter from 2.10 onwards still support this method, and will continue to do so if you define the following property: proxy.cert.alias. The following properties can be used to change the certificate that is used:
- proxy.cert.directory - the directory in which to find the certificate (default = JMeter bin/)
- proxy.cert.file - name of the keystore file (default "proxyserver.jks")
- proxy.cert.keystorepass - keystore password (default "password") [Ignored if using JMeter certificate]
- proxy.cert.keypassword - certificate key password (default "password") [Ignored if using JMeter certificate]
- proxy.cert.type - the certificate type (default "JKS") [Ignored if using JMeter certificate]
- proxy.cert.factory - the factory (default "SunX509") [Ignored if using JMeter certificate]
- proxy.cert.alias - the alias for the key to be used. If this is defined, JMeter does not attempt to generate its own certificate(s).
- proxy.ssl.protocol - the protocol to be used (default "SSLv3")
Installing the JMeter CA certificate for HTTPS recording
As mentioned above, when run under Java 8, JMeter can generate certificates for each server. For this to work smoothly, the root CA signing certificate used by JMeter needs to be trusted by the browser. The first time that the recorder is started, it will generate the certificates if necessary. The root CA certificate is exported into a file with the name ApacheJMeterTemporaryRootCA in the current launch directory. When the certificates have been set up, JMeter will show a dialog with the current certificate details. At this point, the certificate can be imported into the browser, as per the instructions below.
Note that once the root CA certificate has been installed as a trusted CA, the browser will trust any certificates signed by it. Until such time as the certificate expires or the certificate is removed from the browser, it will not warn the user that the certificate is being relied upon. So anyone that can get hold of the keystore and password can use the certificate to generate certificates which will be accepted by any browsers that trust the JMeter root CA certificate. For this reason, the password for the keystore and private keys are randomly generated and a short validity period used. The passwords are stored in the local preferences area. Please ensure that only trusted users have access to the host with the keystore.
Installing the certificate in Firefox
Choose the following options:
- Tools / Options
- Advanced / Certificates
- View Certificates
- Authorities
- Import …
- Browse to the JMeter launch directory, and click on the file ApacheJMeterTemporaryRootCA.crt, press Open
- Click View and check that the certificate details agree with the ones displayed by the JMeter Test Script Recorder
- If OK, select "Trust this CA to identify web sites", and press OK
- Close dialogs by pressing OK as necessary
Installing the certificate in Chrome or Internet Explorer
Both Chrome and Internet Explorer use the same trust store for certificates.
- Browse to the JMeter launch directory, and click on the file ApacheJMeterTemporaryRootCA.crt, and open it
- Click on the "Details" tab and check that the certificate details agree with the ones displayed by the JMeter Test Script Recorder
- If OK, go back to the "General" tab, and click on "Install Certificate …" and follow the Wizard prompts
Installing the certificate in Opera
- Tools / Preferences / Advanced / Security
- Manage Certificates …
- Select "Intermediate" tab, click "Import …"
- Browse to the JMeter launch directory, and click on the file ApacheJMeterTemporaryRootCA.usr, and open it
Parameters ¶
For example, *.example.com,*.subdomain.example.com
Note that wildcard domains only apply to one level, i.e. abc.subdomain.example.com matches *.subdomain.example.com but not *.example.com
- Do not group samplers - store all recorded samplers sequentially, without any grouping.
- Add separators between groups - add a controller named "--------------" to create a visual separation between the groups. Otherwise the samplers are all stored sequentially.
- Put each group in a new controller - create a new Simple Controller for each group, and store all samplers for that group in it.
- Store 1st sampler of each group only - only the first request in each group will be recorded. The "Follow Redirects" and "Retrieve All Embedded Resources …" flags will be turned on in those samplers.
- Put each group in a new transaction controller - create a new Transaction Controller for each group, and store all samplers for that group in it.
Recording and redirects
During recording, the browser will follow a redirect response and generate an additional request. The Proxy will record both the original request and the redirected request (subject to whatever exclusions are configured). The generated samples have "Follow Redirects" selected by default, because that is generally better.
Now if JMeter is set to follow the redirect during replay, it will issue the original request, and then replay the redirect request that was recorded. To avoid this duplicate replay, JMeter tries to detect when a sample is the result of a previous redirect. If the current response is a redirect, JMeter will save the redirect URL. When the next request is received, it is compared with the saved redirect URL and if there is a match, JMeter will disable the generated sample. It also adds comments to the redirect chain. This assumes that all the requests in a redirect chain will follow each other without any intervening requests. To disable the redirect detection, set the property proxy.redirect.disabling=false
Includes and Excludes
The include and exclude patterns are treated as regular expressions (using Jakarta ORO).
They will be matched against the host name, port (actual or implied), path and query (if any) of each browser request.
If the URL you are browsing is
"http://localhost/jmeter/index.html?username=xxxx",
then the regular expression will be tested against the string:
"localhost:80/jmeter/index.html?username=xxxx".
Thus, if you want to include all .html files, your regular expression might look like:
".*\.html(\?.*)?" - or ".*\.html
if you know that there is no query string or you only want html pages without query strings.
If there are any include patterns, then the URL must match at least one of the patterns , otherwise it will not be recorded. If there are any exclude patterns, then the URL must not match any of the patterns , otherwise it will not be recorded. Using a combination of includes and excludes, you should be able to record what you are interested in and skip what you are not.
Thus "\.html" will not match localhost:80/index.html
Capturing binary POST data
JMeter is able to capture binary POST data. To configure which content-types are treated as binary, update the JMeter property proxy.binary.types. The default settings are as follows:
# These content-types will be handled by saving the request in a file:
proxy.binary.types=application/x-amf,application/x-java-serialized-object
# The files will be saved in this directory:
proxy.binary.directory=user.dir
# The files will be created with this file suffix:
proxy.binary.filesuffix=.binary
Adding timers
It is also possible to have the proxy add timers to the recorded script. To do this, create a timer directly within the HTTP(S) Test Script Recorder component. The proxy will place a copy of this timer into each sample it records, or into the first sample of each group if you're using grouping. This copy will then be scanned for occurrences of variable ${T} in its properties, and any such occurrences will be replaced by the time gap from the previous sampler recorded (in milliseconds).
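For example (a sketch), you could add a Constant Timer as a child of the HTTP(S) Test Script Recorder with its Thread Delay field set to ${T}; each recorded sampler then receives a copy of that timer, with ${T} replaced by the recorded gap from the previous sampler in milliseconds.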
When you are ready to begin, hit "start".
Where Do Samples Get Recorded?
JMeter places the recorded samples in the Target Controller you choose. If you choose the default option "Use Recording Controller", they will be stored in the first Recording Controller found in the test object tree (so be sure to add a Recording Controller before you start recording).
If the Proxy does not seem to record any samples, this could be because the browser is not actually using the proxy. To check if this is the case, try stopping the proxy. If the browser still downloads pages, then it was not sending requests via the proxy. Double-check the browser options. If you are trying to record from a server running on the same host, then check that the browser is not set to "Bypass proxy server for local addresses" (this example is from IE7, but there will be similar options for other browsers). If JMeter does not record browser URLs such as http://localhost/ or http://127.0.0.1/, try using the non-loopback hostname or IP address, e.g. http://myhost/ or http://192.168.0.2/.
Handling of HTTP Request Defaults
If the HTTP(S) Test Script Recorder finds enabled HTTP Request Defaults directly within the controller where samples are being stored, or directly within any of its parent controllers, the recorded samples will have empty fields for the default values you specified. You may further control this behaviour by placing an HTTP Request Defaults element directly within the HTTP(S) Test Script Recorder, whose non-blank values will override those in the other HTTP Request Defaults. See Best Practices with the HTTP(S) Test Script Recorder for more info.
User Defined Variable replacement
Similarly, if the HTTP(S) Test Script Recorder finds User Defined Variables (UDV) directly within the controller where samples are being stored, or directly within any of its parent controllers, the recorded samples will have any occurrences of the values of those variables replaced by the corresponding variable. Again, you can place User Defined Variables directly within the HTTP(S) Test Script Recorder to override the values to be replaced. See Best Practices with the Test Script Recorder for more info.
Replacement by Variables: by default, the Proxy server looks for all occurrences of UDV values. If you define the variable WEB with the value www, for example, the string www will be replaced by ${WEB} wherever it is found. To avoid this happening everywhere, set the "Regex Matching" check-box. This tells the proxy server to treat values as Regexes (using the perl5 compatible regex matchers provided by ORO).
If "Regex Matching" is selected every variable will be compiled into a perl compatible regex enclosed in \b( and )\b. That way each match will start and end at a word boundary.
If you don't want your regex to be enclosed with those boundary matchers, you have to enclose your regex within parentheses, e.g. ('.*?') to match 'name' out of You can call me 'name'.
If you want to match a whole string only, enclose it in (^ and $), e.g. (^thus$). The parentheses are necessary, since the normally added boundary characters would prevent ^ and $ from matching.
If you want to match /images at the start of a string only, use the value (^/images). Jakarta ORO also supports zero-width look-ahead, so one can match /images/… but retain the trailing / in the output by using (^/images(?=/)).
Look out for overlapping matchers. For example the value .* as a regex in a variable named regex will partly match a previous replaced variable, which will result in something like ${{regex}, which is most probably not the desired result.
If there are any problems interpreting any variables as patterns, these are reported in jmeter.log, so be sure to check this if UDVs are not working as expected.
When you are done recording your test samples, stop the proxy server (hit the "stop" button). Remember to reset your browser's proxy settings. Now, you may want to sort and re-order the test script, add timers, listeners, a cookie manager, etc.
How can I record the server's responses too?
Just place a View Results Tree listener as a child of the HTTP(S) Test Script Recorder and the responses will be displayed. You can also add a Save Responses to a file Post-Processor which will save the responses to files.
Associating requests with responses
If you define the property proxy.number.requests=true JMeter will add a number to each sampler and each response. Note that there may be more responses than samplers if excludes or includes have been used. Responses that have been excluded will have labels enclosed in [ and ], for example [23 /favicon.ico]
Cookie Manager
If the server you are testing against uses cookies, remember to add an HTTP Cookie Manager to the test plan when you have finished recording it. During recording, the browser handles any cookies, but JMeter needs a Cookie Manager to do the cookie handling during a test run. The JMeter Proxy server passes on all cookies sent by the browser during recording, but does not save them to the test plan because they are likely to change between runs.
Authorization Manager
The HTTP(S) Test Script Recorder grabs the "Authorization" header and tries to compute the Auth Policy. If an Authorization Manager was added to the target controller manually, the HTTP(S) Test Script Recorder will find it and add the recorded authorization to it (matching entries will be removed). Otherwise an Authorization Manager will be added to the target controller with the authorization object. You may have to fix automatically computed values after recording.
Uploading files
Some browsers (e.g. Firefox and Opera) don't include the full name of a file when uploading files. This can cause the JMeter proxy server to fail. One solution is to ensure that any files to be uploaded are in the JMeter working directory, either by copying the files there or by starting JMeter in the directory containing the files.
Recording HTTP Based Non Textual Protocols not natively available in JMeter
You may have to record an HTTP protocol that is not handled by default by JMeter (custom binary protocols, Adobe Flex, Microsoft Silverlight, …). Although JMeter does not provide a native proxy implementation to record these protocols, you can record them by implementing a custom SamplerCreator. This Sampler Creator will translate the binary format into an HTTPSamplerBase subclass that can be added to the JMeter Test Case. For more details see "Extending JMeter".
HTTP Mirror Server¶
The HTTP Mirror Server is a very simple HTTP server - it simply mirrors the data sent to it. This is useful for checking the content of HTTP requests.
It uses default port 8081.
Parameters ¶
Parameters ¶
headerA: valueA|headerB: valueB would set headerA to valueA and headerB to valueB.
You can also use the following query parameters:
Parameters ¶
Property Display¶
The Property Display shows the values of System or JMeter properties. Values can be changed by entering new text in the Value column.
Parameters ¶
Debug Sampler¶
The Debug Sampler generates a sample containing the values of all JMeter variables and/or properties.
The values can be seen in the View Results Tree Listener Response Data pane.
Parameters ¶
Debug PostProcessor¶
The Debug PostProcessor creates a subSample with the details of the previous Sampler properties, JMeter variables, properties and/or System Properties.
The values can be seen in the View Results Tree Listener Response Data pane.
Parameters ¶
Test Fragment¶
The Test Fragment is used in conjunction with the Include Controller and Module Controller.
Parameters ¶
setUp Thread Group¶
A special type of ThreadGroup that can be used to perform Pre-Test Actions. These threads behave exactly like a normal Thread Group element. The difference is that they execute before the test proceeds to the execution of the regular Thread Groups.
tearDown Thread Group¶
A special type of ThreadGroup that can be used to perform Post-Test Actions. These threads behave exactly like a normal Thread Group element. The difference is that they execute after the test has finished executing its regular Thread Groups.