Azure test runners, new browser performance metrics, and more

April 15, 2019

Testable now fully supports the Microsoft Azure cloud platform. Test runners can be spun up on-demand in either our Azure account or your own. Read on for more details as well as other new features including new browser performance metrics.

Azure Test Runners

When configuring a test you can now choose one or more of the Azure global regions in which to run your test. Testable can automatically spin up test runners in either our Azure account or your own as part of a test.

We automatically recommend a VM type and number of instances, but you are also free to override these settings yourself.

Your Azure Account

Add your Azure account as a test runner source and Testable can spin up the VM scale sets for load generation within one of your Azure virtual networks. See our self-hosted Azure guide for more details on how to set it up.

When you run a test with Azure locations you can see results broken down by location, as well as details of all the VMs that were spun up.

New Browser Performance Metrics

If you run a test with Webdriver.io we now capture several new metrics that can help you assess your application's performance in the browser. These metrics have also been added to the default Selenium results view. A short sketch after the list shows how a few of them map to standard browser APIs.


  • Page Load Time: Time in milliseconds to completely load a page and become interactive

  • Page Requests: Number of resource requests required to load a page

  • Page Weight: Total amount of data, in bytes, transferred over the network to completely load a page

  • Speed Index: Average time at which visible parts of the page are displayed, as defined by Google

  • Time to First Byte: Time until the first byte is received by the browser when loading a page

  • Time to First Paint: Time until the first pixel is drawn on the screen by the browser when loading a page

  • Time to First Contentful Paint: Time until the browser renders the first DOM element on the screen when loading a page

  • Time to Interactive: Time until the page is both visually rendered and capable of reliably responding to user input
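
For reference, several of these timings come straight from the W3C Performance APIs that modern browsers expose. Below is a minimal, hypothetical sketch of reading a few of them from a Webdriver.io test (it assumes Mocha-style describe/it blocks, Webdriver.io's global browser object, and a placeholder URL). Testable captures all of these metrics automatically, so this is only to show where the numbers come from:

```typescript
// Hypothetical sketch: reading a few browser performance timings in a
// Webdriver.io test. Not Testable's implementation.
describe('browser performance metrics', () => {
  it('reads navigation and paint timings', async () => {
    await browser.url('https://example.com'); // placeholder page under test

    // Navigation Timing: page load time, time to first byte, request count.
    const nav = await browser.execute(() => {
      const t = performance.timing; // legacy Navigation Timing API
      return {
        pageLoadTime: t.loadEventEnd - t.navigationStart,     // ms to fully load
        timeToFirstByte: t.responseStart - t.navigationStart, // ms to first byte
        // resource entries + 1 for the document itself
        pageRequests: performance.getEntriesByType('resource').length + 1,
      };
    });

    // Paint Timing: first paint and first contentful paint,
    // in ms from navigation start.
    const paints = await browser.execute(() =>
      performance
        .getEntriesByType('paint')
        .map(p => ({ name: p.name, startTime: p.startTime }))
    );

    console.log(nav, paints);
  });
});
```

Note that Speed Index and Time to Interactive are derived metrics that cannot be read from a single browser API call, which is part of the value of having them computed for you.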

Counter Metrics: Peak and Per Second

For all counter metrics, both system-generated ones (e.g. bandwidth, bytes sent, bytes received) and custom ones (e.g. messages processed, logins attempted), we now calculate several useful new aggregators (illustrated in the sketch after this list):


  • Peak In Interval: The peak count captured in any 10 second interval of your test.

  • Peak Count/Sec: The rate per second during the peak interval.

  • Average Count/Sec: The average count per second across the entire test. For example, bandwidth average count per second is the same thing as throughput.
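
As a concrete illustration of how these three aggregators relate, here is a small hypothetical sketch (not Testable's internal code; the function name and sample numbers are made up) that computes them from a series of per-interval counter totals, using the 10 second interval described above:

```typescript
// Illustrative sketch of the three counter aggregators. Not Testable's internals.
// Input: the counter total captured in each 10-second interval of a test.
const INTERVAL_SECONDS = 10;

function aggregateCounter(intervalTotals: number[]) {
  const peakInInterval = Math.max(...intervalTotals);    // Peak In Interval
  const peakPerSec = peakInInterval / INTERVAL_SECONDS;  // Peak Count/Sec
  const total = intervalTotals.reduce((sum, n) => sum + n, 0);
  // Average Count/Sec: overall total divided by the full test duration.
  const avgPerSec = total / (intervalTotals.length * INTERVAL_SECONDS);
  return { peakInInterval, peakPerSec, avgPerSec };
}

// Example: bytes received per 10s interval; avgPerSec is then the throughput.
console.log(aggregateCounter([12000, 45000, 52000, 48000, 9000]));
// => { peakInInterval: 52000, peakPerSec: 5200, avgPerSec: 3320 }
```

Note how Peak Count/Sec is simply Peak In Interval divided by the interval length, while Average Count/Sec divides the overall total by the full test duration.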

These aggregators can be added to a results view or used to set breaking points and success criteria.
