On-line logging of Apache JMeter results to an Oracle database


    This article is aimed primarily at technical specialists in the field of software quality and performance assurance. It describes in detail a high-performance solution for logging Apache JMeter results to an Oracle database. The tool has been used in combat conditions on load-testing projects carried out by the employees of Performance Lab LLC.
    The presented solution provides centralized storage of the results of load tests that use JMeter and significantly reduces the time needed to process and analyze test results.
    The engineer also gains access to the full data-processing functionality of the Oracle DBMS.


    The developed set of tools for Apache JMeter logs test results (transactions, response times, server responses, operation times) to an Oracle database in real time.

    What is it for

    The described approach automates the collection and processing of data when load tests with Apache JMeter are run frequently.
    When the time available for processing test results is very limited and the log volume is large, this tool lets you quickly generate reports on the results of operations (counts, response times, etc.) as tables or graphs. Processing the logs of a test several hours long takes only seconds: the time to execute an SQL query.
    To process the results, you can use the full functionality of the Oracle DBMS and thus obtain reports of almost any structure.


    Advantages:
    - High speed of processing test logs;
    - The data for all performed tests are kept in one place: the database;
    - Convenient report preparation: set the range for which you want statistics and execute an SQL query;
    - Flexible log processing: the possibilities are limited only by the functionality of Oracle and the imagination of the tester;
    - High logging speed, thanks to the absence of locks on database access even with a large number of VUs, since writing is performed by a single thread (by default).


    Disadvantages:
    - Requires deploying an Oracle DBMS; the server capacity needed depends on the planned load profile;
    - Developing the initial SQL query templates takes some time;
    - A BeanShell Listener must be added to the JMeter test plan, and the level of transaction nesting has to be watched (for details, see the section "Recording in the general queue");
    - Processing results while the test is running can increase the time needed to write logs to the database (depending on database capacity and the presence of indexes).

    Logging system operation scheme

    The logic is simple:
    - At the start of the test, a thread is launched that creates one connection (by default) to the database;
    - Logs are written to a general queue;
    - A synchronization thread periodically flushes the contents of the queue to the database.

    For logging, the test plan must contain two additional thread groups (Creating Connections and Synchronizing Logs). For the result of a sample to get into the general queue, a BeanShell Listener must be added to the worker thread group ("Workgroup"); its contents are discussed below.
    When the test starts, a single connection to the database is made, after which the synchronization thread checks the queue every second (the interval is configurable) and writes any queued data to the database. Writing to the database runs in parallel with the main load and has no significant effect on it.
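    The queue-plus-flusher design described above can be sketched in plain Java. This is a simplified illustration, not the source of ru.perflab.jmeter.OracleDBLogger: the class name QueueLoggerSketch and the writeBatch stand-in are invented for the sketch. A lock-free ConcurrentLinkedQueue lets many VU threads enqueue rows without blocking, while a single synchronization thread drains everything into one batch:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentLinkedQueue;

// Simplified sketch of the logger's design: samplers enqueue log rows
// without blocking each other; a single synchronization thread
// periodically drains the queue and writes the whole batch at once.
public class QueueLoggerSketch {
    // Lock-free queue: many VU threads put, one thread drains.
    private final ConcurrentLinkedQueue<Map<String, Object>> queue =
            new ConcurrentLinkedQueue<>();

    // Called from the BeanShell Listener for every sample.
    public void put(Map<String, Object> row) {
        queue.add(row);
    }

    // Called by the synchronization thread group once per interval;
    // drains the queue into one batch and returns the number of rows written.
    public int insert() {
        List<Map<String, Object>> batch = new ArrayList<>();
        Map<String, Object> row;
        while ((row = queue.poll()) != null) {
            batch.add(row);
        }
        writeBatch(batch); // stand-in for the JDBC batch insert
        return batch.size();
    }

    protected void writeBatch(List<Map<String, Object>> batch) {
        // In the real logger, a JDBC PreparedStatement with
        // addBatch()/executeBatch() against Oracle goes here.
    }
}
```

    In the real logger, insert() is what the Synchronizing Logs thread group calls once per second, and its return value is the "count =" number shown in the listener.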

    The test plan template from the appendix (you can download it here) is recommended as a standard configuration (file "db_logger_plan.jmx").

    DB connection

    To connect to the database, you must specify: host, port, SID, login, password, number of threads (it is better to use 1) and the name of the table in the database.
    For example, the lines look like this:
    import ru.perflab.jmeter.OracleDBLogger;
    OracleDBLogger.INSTANCE.connect("", 1521, "xe", "sa", "gfhjkm", 1, "jm_log_table");


    Log synchronization

    The synchronization call writes the data from the queue to the database (PreparedStatement and addBatch are used). To invoke it, just specify the lines:
    import ru.perflab.jmeter.OracleDBLogger;
    ResponseCode = new String("count =" + OracleDBLogger.INSTANCE.insert());
    The optimal launch interval is 1 second.

    Recording in the general queue

    The queue consists of HashMaps; to add an element to it, add a BeanShell Listener with the following contents to the test plan:

    import java.util.HashMap;
    import ru.perflab.jmeter.OracleDBLogger;

    HashMap p = new HashMap();
    p.put("Timestamp", new java.sql.Timestamp(prev.getTimeStamp()));
    p.put("CurrentTimestamp", new java.sql.Timestamp(System.currentTimeMillis())); // local time of the load station when the log is queued
    p.put("Time", prev.getTime());
    p.put("Latency", prev.getLatency());
    p.put("IdleTime", prev.getIdleTime());
    p.put("Bytes", prev.getBytes());
    p.put("SampleCount", prev.getSampleCount());
    p.put("isSuccessful", prev.isSuccessful());
    p.put("SampleLabel", prev.getSampleLabel());
    p.put("Hostname", sampleEvent.getHostname());
    p.put("ThreadName", prev.getThreadName());
    p.put("AllThreads", prev.getAllThreads());
    p.put("UrlAsString", prev.getUrlAsString());
    p.put("Request", new String(prev.getDataEncodingWithDefault()));
    p.put("ResponseData", new String(prev.getResponseData()));
    p.put("DataType", prev.getDataType());

    OracleDBLogger.INSTANCE.put (p);

    See the test plan in the appendix (file "db_logger_plan.jmx").
    The entire list of parameters is mandatory; if there is nothing to pass for a field, specify an empty value, but it is better not to change anything.

    IMPORTANT! Sometimes you need to record in the database the indicators not of the sample itself but of its parent; in that case "prev.getParent()" is added, for example prev.getParent().getDataType(). This may be necessary with nested requests, for example HTTP subrequests. To group the subqueries, you can try using a Transaction Controller or Simple Controller.

    DB deployment

    Both Oracle XE and Oracle Enterprise are suitable (the latter is better). The required server capacity depends on the intensity of the emulated load (the number of samples per second). Testing was carried out on a project in "combat conditions", where the load was about 450 transactions (samples) per second. The database server had the following characteristics: 4 × Power7 (64-bit) SMT-4, 12 GB RAM; CPU utilization was ~15%, and there was enough memory.

    Script to create a table

    We create a table in the database with the necessary trigger and sequence:

    CREATE TABLE jm_log_table (
        "T_ID"          NUMBER(38,0),       -- Primary key, filled by the trigger below
        "TIME_STAMP"    TIMESTAMP(7),       -- Parameter prev.getTimeStamp() (sample start time)
        "JM_DATE"       TIMESTAMP(7),       -- Local time of the load station when the log was written to the queue
        "DB_TIME_STAMP" TIMESTAMP(7),       -- DB time when the log was written to the table
        "ELAPSED_TIME"  NUMBER(38,0),       -- Response time (ms)
        "LATENCY"       NUMBER(38,0),
        "IDLE_TIME"     NUMBER(38,0),
        "BYTE_COUNT"    NUMBER(38,0),
        "SAMPLE_COUNT"  NUMBER(10,0),
        "URL"           VARCHAR2(2048 BYTE),
        "REQUEST_MSG"   BLOB,               -- Intended for the request sent to the server; there was no way to get this parameter in JMeter, but the listener can always be fixed
        "RESPONSE_MSG"  BLOB,               -- Filled only if there was an error
        "DATA_TYPE"     VARCHAR2(200 BYTE)  -- Server response type (of little use)
    );

    create sequence jm_log_table_t_id_seq start with 1 increment by 1;

    CREATE OR REPLACE TRIGGER jm_log_table_trg
    BEFORE INSERT ON jm_log_table FOR EACH ROW
    BEGIN
        SELECT jm_log_table_t_id_seq.nextval, sysdate
          INTO :new.t_id, :new.db_time_stamp FROM dual;
    END;
    /

    -- Indexes are added as needed, so that there are no problems with index maintenance under a large stream of logs.


    To collect statistics for a test, you run an SQL query, which must be prepared in advance; you can, for example, do this while the test is in progress. A big plus of this approach is that new queries need to be written quite rarely, since the format of the data needed in the report hardly changes over the course of a project. Examples of queries are in the appendix to the article (folder "SQL_Requests_for_DB_Logger").
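    As an illustration of what such report queries can look like (a hypothetical sketch, not one of the queries from the "SQL_Requests_for_DB_Logger" folder; substitute your own time range), here is a per-minute sample count and average response time over the table defined above:

```sql
-- Hypothetical example: samples per minute and average response time
-- for a chosen test interval (replace the dates with your test's range).
SELECT TRUNC(jm_date, 'MI')           AS minute_start,
       COUNT(*)                       AS sample_count,
       ROUND(AVG(elapsed_time), 1)    AS avg_response_ms
  FROM jm_log_table
 WHERE jm_date BETWEEN TO_DATE('2013-01-01 12:00', 'YYYY-MM-DD HH24:MI')
                   AND TO_DATE('2013-01-01 14:00', 'YYYY-MM-DD HH24:MI')
 GROUP BY TRUNC(jm_date, 'MI')
 ORDER BY minute_start;
```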

    Output Data Format

    As a result, executing queries against the database produces a table similar to the following:

    From such a table, you can then build a graph like this:


    Actual system performance will depend on the size of the test plan, the hardware of the load station, the hardware of the database server, and the communication channel.
    It has not yet been possible to find the maximum, since there was not enough memory on the load-station laptop (only a 2 GB machine was at hand). On a real project there was also no way to determine the ceiling; it was only recorded that 1000 samples per second were logged without problems (more was not needed for the tests).
    Conditions. The "logger" test was run on two notebooks:
    Load station - Lenovo v360 (Pentium 2 × P6100, 2000 MHz, 2 GB RAM); 512 MB allocated to JMeter 2.8; Win 7.
    Oracle database (XE 11g) - Lenovo Y550P (Core i7, 8 × 1.66 GHz, 4 GB RAM); Win XP. Only one index (jm_date) was built for the table.
    Network - 1 Gigabit over a crossover cable.
    The maximum that could be squeezed out of the load station was 1800 samples per second, after which JMeter threw an Out of memory error and "stopped".
    The test plan contained a Dummy Sampler with a delay of 500 ms and 1000 threads.

    The figure above shows a graph of transactions (samples) per second. It can be seen that performance problems on the load station (not enough laptop power) appeared at an intensity of 1800 samples per second, which is already very good.
    Loading system resources on the database:
    • CPU 10-15%,
    • free memory 1 GB (25%),
    • disk utilization 5-10%,
    • network <1%.
    At the loading station:
    • CPU 30%,
    • memory ran out ("eaten" by JMeter and the OS).
    To test the reliability of the tool, a test with 700 threads lasting 2 hours was conducted; throughput was 1400 samples per second.

    The figure shows a graph of transactions per second (1-second aggregation); as you can see, the load ran stably throughout the test, and the logging functionality did not affect JMeter's main requests.

    The stability of the tool is confirmed by the graph above, which shows that the time to log entries to the database (1400 entries per second) did not exceed 600 ms (pink line).
    The operating time of the load emulating samplers (Dummy Sampler) corresponded to the 500 ms indicated in the test plan, with the exception of rare deviations.
    In this test, the laptop power was enough for the base and JMeter to work.
    Loading system resources on the database:
    • CPU 10%,
    • 1 GB free memory,
    • 5-7% disk utilization,
    • network <1%.
    At the loading station:
    • CPU 30%,
    • free memory ~ 25%.


    This article has presented one option for centralized storage of the results of numerous load tests using the Apache JMeter load tool and the Oracle DBMS.
    Countless files (CSV, XML) with JMeter logs from tests can be replaced with a single database, where the results are stored securely and are accessible to all participants of a load-testing project.
    The proposed solution is easily added to an existing test plan (two additional thread groups and a listener), does not require significant computing power for the database, and allows flexible and convenient processing and analysis of test results.
    By introducing this tool into your project, you can spend the time saved on analyzing results on more interesting things: talking with friends, tea or coffee, or whatever you like...

    In the next article

    A tool based on MS Excel will be described which, by connecting to the database through the operating system's ODBC DataSource, lets you create tables and graphs with test results.
    The Excel file contains a VBA macro that connects to the database, executes an SQL query, and loads the selected test results into an Excel table, from which the necessary graphs are built automatically.
    This tool turned out to be an excellent addition to the "logger" described in this article, because processing the results of each new test takes only a few seconds.
