MySQL Load Testing in 4 Easy Steps

This post continues our series on how to test various types of technology with Testable. Today we will be load testing a MySQL DB, capturing some useful custom metrics, and analyzing the results.

The test will include creating and dropping a DB schema and then timing insert latency under load.

Step 1: Create a Test Case

Make sure you sign up for a Testable account first. After logging in, click the New Test Case button, give it a name, and specify the URL of your DB (e.g. db.myserver.com:3306).

Step 2: Write Test Script

Testable scripts are simply JavaScript that executes in a sandboxed Node.js environment. Once you finish Step 1, click Next and select Write Script as the scenario type. We will use the mysql NPM module for communicating with our DB.

Init Code

The Init code will only execute once globally at the start of our test. Let’s use the following code for Init:

var mysql      = require('mysql');
var connection = mysql.createConnection({
    host              : 'myserver.com',
    user              : 'myuser',
    password          : 'mypassword',
    multipleStatements: true
});

dataFile.get('https://s3.amazonaws.com/testable.samples/script.sql', function (script) {
    connection.connect();
    connection.query(script.toString(), function (err) {
        // Log any schema setup failure so it shows up in the Run Console
        if (err) console.error('Schema setup failed: ' + err);
    });
    connection.end();
});

This code connects to the DB (update the host, user, and password for your use case), executes the script found at https://s3.amazonaws.com/testable.samples/script.sql, and closes the connection. The multipleStatements: true option is required because the script contains several statements. script.sql has the following contents:

CREATE DATABASE loadtest;
USE loadtest;

CREATE TABLE person (
    id bigint auto_increment not null,
    email varchar(255) not null,
    name varchar(255) not null,
    address varchar(255) not null,
    PRIMARY KEY (id)
);

This file could also have been uploaded in the test case's Data section and referenced in our code as script.sql. By using a remote URL, however, we can maintain the DB schema for our test independently of the test case.
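Incidentally, if multipleStatements cannot be enabled in your environment, one workaround is to split the script and run its statements one at a time. A rough sketch follows; note the naive split on ';' breaks if a statement body itself contains a semicolon (e.g. inside a string literal or stored procedure):

```javascript
// Naive split of a multi-statement SQL script into individual statements.
// Only a sketch: it breaks if a statement body contains ';'.
function splitStatements(script) {
    return script
        .split(';')
        .map(function (s) { return s.trim(); })
        .filter(function (s) { return s.length > 0; });
}

var script = 'CREATE DATABASE loadtest;\nUSE loadtest;';
var statements = splitStatements(script);
console.log(statements.length + ' statements'); // 2 statements
```

Each entry could then be passed to connection.query() individually.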

Teardown Code

At the end of the test we want to drop the database we just tested. Use the following code for Teardown:

var mysql      = require('mysql');
var connection = mysql.createConnection({
    host     : 'myserver.com',
    user     : 'myuser',
    password : 'mypassword'
});

connection.connect();
connection.query('DROP DATABASE loadtest');
connection.end();

Pretty self-explanatory: connect, drop the loadtest database, and close the connection. Using DROP DATABASE IF EXISTS would additionally avoid an error in case the init code never got as far as creating the schema.

Main Code

Now that we have the init/teardown setup, let’s define the actual test scenario:

var util       = require('util');
var mysql      = require('mysql');
var connection = mysql.createConnection({
    host              : 'myserver.com',
    user              : 'myuser',
    password          : 'mypassword',
    multipleStatements: true
});

connection.connect();
// The person table declares email, name, and address as NOT NULL, so supply
// all three. info.currentId() is unique per test iteration.
var query = util.format("USE loadtest; INSERT INTO person (email, name, address) VALUES ('%s@example.com', '%s', 'some address');", info.currentId(), info.currentId());
timing(function(done) {
    connection.query(query, function(err) {
        if (!err) {
            results().counter('inserts', 1);
        } else {
            results().counter('errors', 1);
        }
        // Complete the timer on both the success and error paths
        done();
    });
    connection.end();
}, 'insertLatency');

This code does the following:

  1. Connect to our DB
  2. Time our INSERT query. The INSERT uses info.currentId() (i.e. a unique identifier for the test iteration) as the name and a dummy value for the address
  3. Capture the timer as the insertLatency custom metric.
  4. Count the number of inserts executed as the inserts counter and the number of errors as the errors counter.

This code now defines the Scenario to execute at scale.
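For intuition, the timing() helper behaves roughly like the sketch below (my own approximation, not Testable's implementation): it starts a clock when the wrapped function runs and records the elapsed milliseconds once done() is called. This is also why done() should fire on the error path as well, or the timer never completes.

```javascript
// Rough sketch of a timing() helper (an approximation, not Testable's
// actual implementation): measure elapsed ms until done() is invoked.
function timing(fn, name, record) {
    var start = Date.now();
    fn(function done() {
        record(name, Date.now() - start);
    });
}

// Usage with a synchronous stand-in for the timed query:
var recorded = null;
timing(function (done) {
    // ...work being measured goes here...
    done(); // without this call the metric is never recorded
}, 'insertLatency', function (name, ms) {
    recorded = { name: name, ms: ms };
});
console.log(recorded.name + ': ' + recorded.ms + 'ms');
```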

Test out your script by pressing the Run button in the upper right. This executes it one time on a shared Testable agent, including init/teardown. Any captured metrics and logging will appear in the Run Console tab.

Example Run Console Output

02/29/2016 12:09:29.879 PM Result Received for tcp://someserver.com:3306
    dnsLookupMs [Timing]: 8
    connectionOpenMs [Timing]: 10
    success [Counter]: +1
    dataPacketsReceived [Counter]: +4
    firstReceivedMs [Timing]: 16
    dataPacketsSent [Counter]: +3
    connectionCloseMs [Timing]: 28
    bytesSent [Counter]: +174
    bytesReceived [Counter]: +115
    bandwidth [Counter]: +289
02/29/2016 12:09:29.903 PM Result Received
    success [Counter]: +1
    inserts [Counter]: +1
    insertLatency [Timing]: 19

Notice that Testable captures a bunch of low level metrics automatically in addition to the custom metric we added in our script.

Step 3: Configure a Load Test

Click Next to move on to the Configuration step. We now define exactly how to execute the scenario we defined in Step 2.

  • 10 concurrent clients in each region (30 in total across the three regions). Each concurrent client acts like a user of your service and runs on a Testable agent in parallel.
  • 1 minute duration.
  • Three regions (AWS N Virginia, AWS Oregon, AWS Singapore). Note that if your DB is behind a firewall you can also run the test on your own hardware.

Click the Start Test button and your test is off and running! Congratulations, you have officially created and run a load test. Now let’s look at analyzing the results.

Step 4: View the Results

By now you should see results flowing in as the test executes. The default dashboard will show a summary, results grid, and graphs of the system-captured metrics.

Let’s chart our insertLatency metric and add it to the Summary and Results Grid.

Add an insertLatency chart

In the upper right click the Dashboard... button and select Add Chart.

Give it a name like Insert Latency. Select a Line chart and let’s chart insertLatency -> mean, insertLatency -> p95 (95th percentile), and insertLatency -> p99 (99th percentile).

Add the chart and it will appear on your dashboard.
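As a refresher, p95 and p99 are the latencies below which 95% and 99% of samples fall, which makes them far more useful than the mean for spotting tail latency. Testable computes them for you; for reference, a minimal nearest-rank calculation looks like this:

```javascript
// Nearest-rank percentile: the smallest sample such that at least p% of
// all samples are <= it. A reference sketch; Testable computes these.
function percentile(samples, p) {
    var sorted = samples.slice().sort(function (a, b) { return a - b; });
    var rank = Math.ceil((p / 100) * sorted.length);
    return sorted[Math.max(rank - 1, 0)];
}

var latencies = [12, 15, 19, 22, 30, 41, 55, 60, 75, 120]; // example ms
console.log('p50: ' + percentile(latencies, 50) + 'ms'); // p50: 30ms
console.log('p95: ' + percentile(latencies, 95) + 'ms'); // p95: 120ms
```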

Add insertLatency to summary

Scroll down to the summary and press the configure icon. Press Add Column, select the insertLatency -> mean metric, and press Update Summary. You can reorder the columns by dragging the new one up or down in the configuration window.

After updating the configuration, the summary will include the new insertLatency column.

You can save this new dashboard for future use with the Dashboard menu in the upper right. And that’s it! We’ve set up a DB test scenario, captured custom metrics, run it at scale, and analyzed the results.