SQL Server Benchmarking with DVDStore

Dell developed a synthetic benchmark tool that can run benchmark tests against SQL Server, Oracle, MySQL, and PostgreSQL installations. It is open source and freely available at https://github.com/dvdstore/ds3/. It generates a benchmark value in the form of ‘orders placed’ per test cycle, and tests the performance of both the instance configuration and the infrastructure underneath it. You create a synthetic workload dataset with an included tool, load it into the instance via a new test database (DS3), and then run an included load testing tool to simulate orders of DVDs from a web store.


Heraflux uses this tool quite frequently to compare the relative performance of two SQL Server instances: load the same test data into both instances, run the same test on each server, and compare the results. Because the workload is identical, any differences in raw server performance can be isolated to the configuration or infrastructure being tested.

Pre-Requisites

The base bundle ships with support for all of the database engines, and the current release is located on the main GitHub page. You can download the entire package by clicking the ‘Clone or download’ button.


Extract this bundle to the drive of your choice, and then move the inner folder ds3-master/ds3 to the root of the drive to keep things simple.
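If you prefer the command line, something like the following handles both steps. The paths here are illustrative, and this assumes Windows 10 (1803) or later, where the tar utility ships in the box; Explorer’s Extract All works just as well.

:: extract the downloaded bundle, then relocate the inner ds3 folder
tar -xf ds3-master.zip -C F:\
move F:\ds3-master\ds3 F:\DS3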

You will end up with a file structure similar to the following.

(screenshot: the extracted DS3 folder structure)

The creation script will prompt you to specify the size of the target database, and it creates CSV files on the local file system containing roughly that much data. Make sure the drive where you place them has enough free space to hold the workload files.

Next, you need to install a Perl distribution for Windows so the workload files can be generated successfully. A commonly used option is ActiveState’s ActivePerl community edition, available at http://www.activestate.com/activeperl/downloads. Download the version that matches your Windows platform and install it.
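Once it is installed, confirm that Perl resolves from a command prompt:

perl -v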

Setup

Next, create the workload files. Do this on the database server itself so you do not have to copy the workload files across the network after you create them. How large should you make the workload dataset? Target the average size of the database you are looking to migrate between platforms, or a size that represents typical data volumes in your environment. Again, remember that you need this amount of space on the drive where you are building the workload files.

For this example, generate a 25GB workload dataset. Open an elevated command prompt, change directory into the new DS3 directory that you created, and run Install_DVDStore.pl to begin the process.

Enter the database size (an integer value), then the unit of measurement (usually GB). Enter the database platform (MSSQL) and the operating system (WIN).


Next, enter the location where the new database objects will be placed on the SQL Server’s file system; the build script embeds these paths into the create database script, and they can be changed later if necessary. Remember to place a trailing backslash at the end of the path, because the build script does not add it for you. This example uses the F:\Data\ directory.
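For reference, the full set of answers for this example looks something like the following; the exact prompt wording varies a bit between releases.

perl Install_DVDStore.pl
:: answers used in this example:
::   database size ........ 25
::   size unit ............ GB
::   database platform .... MSSQL
::   operating system ..... WIN
::   database file path ... F:\Data\   (note the trailing backslash)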

Once you hit enter, it will begin to generate the CSV workload dataset files.


A short while later, the files are ready for the next step. If you navigate to the sqlserverds3 directory, you will now see a script called sqlserverds3_create_all_25GB.sql. If multiple scripts are present, use the one whose name contains the same size specification you just entered into the data build script.


Do not execute this script yet. DVDStore for SQL Server has a small bug in it, so open the create script in your favorite editor and review it before running it. SQL Server Management Studio will work quite well.

At this time, verify that full text search is enabled on your SQL Server instance.
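A quick way to verify is to run the following from a query window:

SELECT FULLTEXTSERVICEPROPERTY('IsFullTextInstalled') AS [FullTextInstalled];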


If you see a 1 as the result, Full Text Search is installed and enabled. If not, work with your DBA to install Full Text Search into this SQL Server instance.

Now, review the script. Some DBAs wish to keep the database transaction log files on a different drive (good practice in most circumstances). The portions at the top of the script specify the locations of the various database components. Review these locations and make any changes as necessary.
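As a rough sketch, the top of the generated script resembles the following; the logical names, sizes, and paths here are illustrative, and your generated script will contain its own values. The LOG ON entry is where you would point the transaction log at a different drive.

CREATE DATABASE DS3 ON PRIMARY
    (NAME = DS3_dat, FILENAME = 'F:\Data\DS3_dat.mdf', SIZE = 100MB, FILEGROWTH = 100MB)
LOG ON
    (NAME = DS3_log, FILENAME = 'L:\Logs\DS3_log.ldf', SIZE = 100MB, FILEGROWTH = 100MB);
GO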


Data Load

Now, load and execute the script while logged into the instance as a sysadmin. It creates the new ‘DS3’ database and a new login called ‘ds3user’. Because of the way the workload files were coded, this login’s password cannot be set or changed, so for security purposes disable the login when your tests are complete.
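Disabling it takes one line when you are done; re-enable it the same way with ENABLE if you plan to run further tests later.

ALTER LOGIN ds3user DISABLE;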

The script then loads the CSV workload files with BULK INSERT commands. This can take a while, so be patient. You can monitor the progress in the query’s Messages window.
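The load statements follow the standard BULK INSERT pattern, along these lines; the table, path, and options here are illustrative rather than copied from the generated script.

BULK INSERT CUSTOMERS
FROM 'F:\DS3\data_files\cust\US_cust.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');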


When the load completes, run whatever index maintenance job you have available against this new database. I recommend the Ola Hallengren database maintenance solution, freely available at http://ola.hallengren.com. Ensure that the statistics update covers all statistics, not just those associated with indexes.
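With the Ola Hallengren scripts installed, a call along these lines covers both the index maintenance and a full statistics update, leaving the other parameters at their defaults:

EXECUTE dbo.IndexOptimize
    @Databases = 'DS3',
    @UpdateStatistics = 'ALL';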

Finally, take a backup of this database so that you can restore it on this or other instances and replay the test to measure relative performance differences, if necessary.
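A simple baseline backup works well here (the path is illustrative):

BACKUP DATABASE DS3
    TO DISK = 'F:\Backup\DS3_baseline.bak'
    WITH INIT, COMPRESSION;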

Test Execution

Now, it’s time for the actual test. Create a new batch file in the \DS3 directory; a sample, easily identifiable name could be ‘runstresstest.bat’. In the batch file, add the following command:

.\sqlserverds3\ds3sqlserverdriver.exe --target=localhost --n_threads=8 --run_time=5 ^
--ramp_rate=10 --db_size=25GB --warmup_time=0 --think_time=0.085 --n_searches=3 ^
--search_batch_size=5 --n_line_items=5 --windows_perf_host=agdb01a --detailed_view=Y ^
--out_filename=results.txt

In this command, change the target if you intend to run the test across a network or against a named SQL Server instance. Set the database size according to how you built the workload files, and adjust the number of threads to set the expected load. If you need more than 64 threads, I suggest running a second copy of the batch file and splitting the required total thread count across the copies. Set the run time to the number of minutes you wish the test to run; at a minimum, I suggest at least five minutes, and probably longer if you are performing an infrastructure load test.

The think_time setting is the amount of time a simulated user would ‘think’ before clicking again. I have it set very low so the test stress tests the instance. The output is stored in a ‘results.txt’ file so I can review the results later.

Now, schedule a time that works best for your test and execute it. If this is a production instance with background production tasks, this test will saturate the CPUs and memory, so be careful not to disrupt production work. Make sure your monitoring tool or Perfmon is running so you can capture the performance metrics – i.e. CPU usage, memory PLE, I/O activity, etc. – so you know how hard this machine is being pushed.

You may need to execute this command as administrator since it requires registry access to query the target server with WMI to capture CPU load.

While running, the test may not look like it is doing much in the command window; it shows only periodic updates with orders per minute (OPM) values.


You get an entry in the output file every ten seconds, with an orders per minute (OPM) value recorded.

Finally, the test will end, and you should have a final orders placed value for the test.


This number is the relative score for the instance’s performance on this particular test.

Open the results.txt output file when the test completes.


The two primary benchmark values are total orders placed per test cycle (second column in the output file) and orders per minute (third column in the output file).

Now, you are free to re-run this test in any permutation that fits your purposes. Restore your backup before each test so you are working from the exact same dataset. Document what you change if you are testing infrastructure or instance changes, and try to make only one change at a time.
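Assuming the baseline backup taken earlier (with its illustrative path), resetting the dataset between runs looks like this:

USE master;
GO
ALTER DATABASE DS3 SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE DS3 FROM DISK = 'F:\Backup\DS3_baseline.bak' WITH REPLACE;
GO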

You can now test infrastructure, instance, or database configuration changes and see the raw performance differences in one simple-to-read number.