View Test Results
If you have previously gone through the steps to Execute a Test, it is easy to pull up the results again. First log in, then click the test case name on the left. Clicking the 'Last Executed' time on any test opens the most recent result.
To see historical results, click the test configuration name and then select the run of interest from the page that loads.
Each section of the results page is broken down here.
The overview on the results page gives you the following information:
- Dashboard: Which dashboard is being displayed. Default Dashboard is the default. Read the customize dashboard guide for more details.
- Status: Status of this execution is one of Pending, Running, Processing Results, Completed, Cancelled by User, and Limit Breached.
- Load Parameters: Which test configuration was used and the associated load parameters. Click on the configuration name for more information like the execution history.
- Scenario: Which scenario was executed on each iteration of the test. Click through to view the scenario details.
- Action Menu: Various actions to perform related to this test like Repeat, Stop, Edit Configuration, Export to CSV, Share, and dashboard customization options.
While the test is executing, this section shows you some "active" metrics. Active means executed during the most recent interval.
See a quick high-level summary of how the results of this test execution compare with other recent runs of the same test configuration.
The console shows all logging captured during execution, both system and [scenario generated](/documentation/scripts/write-script.html#logging).
Results can be viewed aggregated across all regions where the test executed or for one specific region.
A chart that gives you an overview of performance during the test: see the median latency, 99th percentile latency, success rate, and concurrent users all in one chart. The set of metrics shown is customizable as well.
The metrics are aggregated across all results in that region, or across all regions on the "ALL" tab. Which metrics are shown is customizable. See how the results of this test compare with up to 10 recent test executions, shown as both a percentage change and a sparkline to spot trends.
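As a rough sketch of what these summary numbers mean (the sample data, the nearest-rank percentile helper, and the previous-run median below are illustrative assumptions, not the platform's actual aggregation code):

```python
import statistics

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    # nearest-rank: ceil(pct/100 * n), converted to a 0-based index
    rank = max(0, -(-pct * len(ordered) // 100) - 1)
    return ordered[int(rank)]

# Hypothetical raw samples from one run: (latency_ms, succeeded)
samples = [(120, True), (95, True), (310, False), (101, True), (87, True)]

latencies = [ms for ms, _ in samples]
median_ms = statistics.median(latencies)                    # 101
p99_ms = percentile(latencies, 99)                          # 310
success_rate = sum(ok for _, ok in samples) / len(samples)  # 0.8

# Percent change vs. a previous run, as shown in the summary comparison
previous_median = 110.0  # assumed value from an earlier execution
pct_change = (median_ms - previous_median) / previous_median * 100
```

The sparkline in the summary is simply these per-run values plotted side by side so a trend is visible at a glance.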
The results are broken down by resource. A resource usually corresponds to a network resource like `GET https://www.google.com`, a JMeter Label, or a Gatling scenario name + request label, but it can also be anything captured with a custom metric. The columns in this grid are customizable.
If you select a resource in the grid, all charts and traces become specific to that resource only.
Traces allow you to view all the details of a particular connection made during test execution including metrics, data sent, and data received.
If your scenario captures output files (e.g. screenshots), a sampling of them will be available on the Files tab. To capture all output files from your test, enable the setting under the Edit Configuration action => Advanced Options => Capture All Output. The method for capturing output files is specific to the scenario type; see the documentation for your scenario type for more details (e.g. Selenium, PhantomJS, etc.).
Any metric can be charted. Data points are captured every 10 seconds. Like everything else charts are customizable.
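As a sketch of how 10-second capture intervals turn raw samples into chart points (the timestamps and bucketing logic here are hypothetical, not the product's internals), each data point can be grouped by the interval it falls in, and the "active" metrics mentioned earlier correspond to the most recent interval:

```python
from collections import defaultdict

INTERVAL_SECS = 10

# Hypothetical (timestamp_secs, latency_ms) data points from a run
points = [(3, 120), (7, 95), (12, 101), (18, 87), (21, 310)]

buckets = defaultdict(list)
for ts, latency in points:
    # Key each sample by the start of its 10-second interval
    buckets[ts // INTERVAL_SECS * INTERVAL_SECS].append(latency)

# Each bucket becomes one charted data point; here, average latency per interval
chart = {start: sum(v) / len(v) for start, v in sorted(buckets.items())}

# "Active" metrics reflect only the most recent interval
active = chart[max(chart)]
```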
View a chart of memory and CPU utilization across the test runners. Values are captured every 10 seconds.
The locations map shows where the test runners that executed your test are located around the globe. Hover over any region to see the peak memory and CPU utilization across all the test runners within that region.
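The per-region peak shown on hover is just the maximum over every runner's samples in that region. A minimal sketch, with made-up region names and utilization figures:

```python
# Hypothetical per-runner utilization samples: (region, cpu_pct, mem_pct)
samples = [
    ("us-east-1", 42.0, 55.0),
    ("us-east-1", 61.5, 48.0),
    ("eu-west-1", 37.0, 70.5),
    ("eu-west-1", 58.0, 66.0),
]

peaks = {}
for region, cpu, mem in samples:
    prev_cpu, prev_mem = peaks.get(region, (0.0, 0.0))
    # Peak = maximum observed value across all runners in the region
    peaks[region] = (max(prev_cpu, cpu), max(prev_mem, mem))
```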