
This article shows you how to develop applications with better performance out of the gate, even if you don’t have a DBA.

Editorial Note

This article is in the Product Showcase section for our sponsors at CodeProject. These reviews are intended to provide you with information on products and services that we consider useful and of value to developers.

Business Need

Indexing is a beneficial yet little-understood process. Indexes solve many problems, including locking/blocking and bottlenecks. Proper indexes let SQL Server avoid table scans and minimize physical disk reads, which has a direct impact on the performance users experience. Despite these benefits, there is confusion about how to create indexes, which fields to create them on, how they work, and exactly how they impact performance.

Fortunately, Microsoft created an Index Tuning Wizard so the appropriate indexes can be created automatically. This often results in significant performance increases. You don’t have to be an expert to use the wizard and realize the benefits. Several tools come together to make the wizard work.

This article is aimed at the broadest audience, with most of the technical details removed. There is no need for a deep understanding of SQL Server to follow this step-by-step index article.

Process Overview

In Step 1, we will time the performance of a database with no indexes other than the indexes used for the Primary Key. To do this, we’ll use a combination of an un-customized Iron Speed Designer-generated application and a stopwatch.

In Step 2, we’ll use SQL Server 2000 Profiler to start a Trace and save a Trace file. A Trace file contains all the activity on the database server during the period we are recording.

In Step 3, we will let the Index Tuning Wizard analyze the Trace file for us. The wizard will recommend indexes based on the recorded user activity and create the indexes.

Last, in Step 4, we’ll time the actions from Step 1 (above) one more time and compare the values.

Before getting started, let’s take a quick peek at the database itself.

Database

The database has two tables: SalesOrder and Employee. The SalesOrder table has approximately 1.7 million rows, while the Employee table has nine. These are copies of the Northwind tables, modified to add primary keys (PK). I replicated the existing data to increase the number of rows.
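Based on the columns the article references (EmployeeID, CustomerID, Freight and ShipVia), the two tables might be sketched roughly as follows. This is an assumption for illustration only; the remaining column names and types are not taken from the actual sample database:

```sql
-- Hypothetical sketch of the two tables. Only EmployeeID, CustomerID,
-- Freight and ShipVia are named in the article; everything else is assumed.
CREATE TABLE Employee (
    EmployeeID INT NOT NULL CONSTRAINT PK_Employee PRIMARY KEY,
    LastName   NVARCHAR(20) NOT NULL,
    FirstName  NVARCHAR(10) NOT NULL
);

CREATE TABLE SalesOrder (
    SalesOrderID INT IDENTITY(1,1) NOT NULL
        CONSTRAINT PK_SalesOrder PRIMARY KEY,
    EmployeeID   INT NOT NULL
        CONSTRAINT FK_SalesOrder_Employee REFERENCES Employee (EmployeeID),
    CustomerID   NCHAR(5) NOT NULL,
    Freight      MONEY NULL,
    ShipVia      INT NULL
);
```

Note that the primary keys give each table a clustered index automatically, which is why the article describes the starting state as "no indexes other than the indexes used for the Primary Key."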

Fig. 1 - Database Schema. The SalesOrder table has about 1.7 million records. Each record uses the EmployeeID to "link" to the Employee table, which contains only nine rows.

Step 1 – Timing Application Performance

Fig. 2 – ShowTable screen with Filters. Here is a typical Iron Speed Designer-generated application. This is the Sales Order ShowTable page. Notice the filters for Employee, Customer and Freight ranges. We will test the performance of the application when no indexes exist on the tables. Later we’ll run the test after indexes are created.

Even simple SQL statements run slowly on an improperly indexed database. When the Sales Order Freight menu item is clicked, we get the following error (Fig. 3).

Fig. 3 – Application Timeout. Uh-Oh. This error message says the query took too long and timed out, preventing the rows from returning.

What SQL statement is causing this error? See Fig. 4.

Fig. 4 – Database View vw_SalesOrder_Freight. This view selects the top 1,000 rows from the SalesOrder table where the Freight and ShipVia match some simple criteria. The results are sorted by the CustomerID in ascending order.
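From that description, the view likely resembles the following sketch. The literal filter values below are placeholders, not the actual criteria from the sample database:

```sql
-- Sketch of vw_SalesOrder_Freight as described in Fig. 4.
-- The threshold and carrier values are assumed for illustration.
CREATE VIEW vw_SalesOrder_Freight
AS
SELECT TOP 1000
       SalesOrderID, CustomerID, EmployeeID, Freight, ShipVia
FROM   SalesOrder
WHERE  Freight > 100      -- assumed freight threshold
  AND  ShipVia = 2        -- assumed shipper
ORDER  BY CustomerID ASC; -- TOP permits ORDER BY inside a view
```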

That’s it?!? There appears to be nothing to this statement. The view is only accessing a single table. It is not performing any joins. It seems strange it would time out. However, the query exposes the problem: a lack of indexes causes SQL Server to work hard to figure out which rows should be returned.
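One way to see that work for yourself is to ask SQL Server for the query plan in Query Analyzer. On a table with no supporting indexes, the plan for this query reports a Table Scan on SalesOrder, meaning every one of the 1.7 million rows is examined:

```sql
-- Display the estimated execution plan instead of running the query.
SET SHOWPLAN_TEXT ON;
GO
SELECT * FROM vw_SalesOrder_Freight;
GO
SET SHOWPLAN_TEXT OFF;
GO
```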

The Test Results table (see below) shows each action taken and how long, in seconds, it took to execute. For example, the first action was clicking the SalesOrder menu, which loaded the ShowTable page with the first page of 10 rows from the Sales Order database table. It took 4 seconds from the time I clicked the menu item until the page refreshed in my browser. I performed each action several times to ensure caching and compilation were not affecting the values.

There are several behind-the-scenes steps involved in rendering a browser page. When delays occur, they are often the result of the database fetching records inefficiently due to improper indexing. We will use Microsoft SQL Server 2000 Profiler to record data for further analysis.

Step 2 – Microsoft SQL Server 2000 Profiler

Profiler is used to "trace" (record) the activity on the SQL Server. The activity is saved in a log file for analysis. We will use Profiler to record and save all the SQL statements while we click around.

Start Profiler by clicking on Start, Programs, Microsoft SQL Server, Profiler. Once the tool loads, click on File, New Trace. The following dialog appears:

Fig. 5 – SQL 2000 Profiler: New Trace. Fill in the Trace name with whatever you like. For TemplateName, select SQLProfilerTuning. Profiler can record all sorts of information. By selecting SQLProfilerTuning, we will record only the proper activities, and that data allows performance to be tuned. Select "Save to file" and specify a location and filename. You can look in Help to get details on the other options on this screen. For now, leave the default settings for the max file size at 5MB and "Enable file rollover" checked. Finally, click the Run button to finish setting the options and begin recording SQL activity.

Fig. 6 – SQL 2000 Profiler: Recorded SQL Activity. Each SQL statement or stored procedure call sent to the SQL server is recorded. So is the execution duration, measured in elapsed milliseconds. We can see the full SQL statement in the pane at the bottom by clicking one of the rows in the top grid. This is useful to copy/paste any long-running statements into Query Analyzer for further analysis.

In Fig. 6, the SQL statement that previously timed out is highlighted. Notice the duration is 40,126 milliseconds, or about 40 seconds, until the timeout.
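When you copy a long-running statement into Query Analyzer for further analysis, you can also have the server report the same timings, plus the I/O cost, directly:

```sql
-- Report elapsed/CPU time and logical/physical reads per statement,
-- so Profiler's duration figures can be cross-checked interactively.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

SELECT * FROM vw_SalesOrder_Freight;
```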

After we are done working in our application, we can switch back to Profiler. Click the red Stop button on the top toolbar to save the file. Now we have our trace file saved, and we are ready to load the Index Tuning Wizard. Run the tool from within Profiler by clicking Tools and then selecting Index Tuning Wizard.

Step 3 – Index Tuning Wizard

The Index Tuning Wizard analyzes a trace file and inspects the SQL statements we previously recorded. The wizard will be able to decide when an index should or should not be created. If the wizard determines an index should be created, the details are handled automatically. This tutorial will not discuss what index type to use in a custom situation.

I often uncheck the "Keep all existing indexes" option. This is because the wizard may determine certain indexes should be dropped if they are not being used.
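If you would rather drop an index the wizard flags as unused by hand, SQL Server 2000 uses the table.index form. The index name here is hypothetical:

```sql
-- Remove an index that analysis showed is never used by the workload.
DROP INDEX SalesOrder.IX_SalesOrder_Unused;
```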

Also, when the Tuning mode is set to “Thorough,” the analysis takes longer. If you are dealing with a very large trace file on a real system, it’s a good idea to run this process and analysis after-hours.

Fig. 8 – Index Tuning Wizard: Specify Workload. A workload file is the trace file containing all the recorded SQL Statements and details. For the purposes of this tutorial, we’ll ignore the settings in the Advanced Options screen.

Fig. 9 – Index Tuning Wizard: Select Tables. Select the tables you want to tune. In most cases, use the Select All Tables button to get index recommendations on all the tables in the system.

Fig. 10 – Index Tuning Wizard: Processing. A series of messages will pop up and notify you of processing progress. On a small trace file, the processing should be quick. However, if you are working with a real trace file with many entries, processing can take hours. It might overtax the system resources if performed on a production server during business hours. Because of this, we recommend that you run the Index Tuning Wizard after-hours when only you are on the system.

In our example, the workload file is so small that it processes in about 7 seconds.

Fig. 11 – Index Tuning Wizard: Index Recommendations. Here are the recommendations. Based on our application usage, the wizard recommends we create four or more new indexes (indicated by the icons with the yellow star). If we allow the wizard to create these indexes for us, we should realize a 67% performance improvement! These are estimates, but tend to be pretty accurate.

Fig. 12 – Index Tuning Wizard: Apply or Schedule. It is now time to actually create the indexes. We are working on a development database with no other users at the moment, so we will simply select “Apply changes” and “Execute recommendations now.” This creates the indexes for us.

Creating indexes can take a while depending on how many rows are in each table. More importantly, creating indexes applies locks on the database. This can cause problems with other users attempting to use the system.
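For reference, the wizard's recommendations boil down to ordinary CREATE INDEX statements. A hand-written equivalent covering the columns this application filters and sorts on might look like the following; the index name and column order are assumptions, and the wizard's actual choices may differ:

```sql
-- Nonclustered index supporting the Freight/ShipVia filter and the
-- CustomerID sort used by vw_SalesOrder_Freight.
CREATE NONCLUSTERED INDEX IX_SalesOrder_Freight_ShipVia
    ON SalesOrder (Freight, ShipVia, CustomerID);
```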

Production database note: we recommend that you always back up your database before applying any changes to it. Also, to avoid production conflict, process and create indexes only after-hours. However, you can create a trace file in Profiler during production hours in order to get an accurate record of real users' activity on the system.
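A minimal full backup before applying the wizard's changes can be as simple as the following; the database and file names are placeholders:

```sql
-- Full backup taken before letting the wizard modify the schema.
BACKUP DATABASE NorthwindCopy
    TO DISK = 'C:\Backups\NorthwindCopy_PreIndexTuning.bak'
    WITH INIT, NAME = 'Pre-index-tuning full backup';
```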

Post-Index Creation Application Timing

Running the same tests as in Step 1 (above) yields better results.

We cut our response time in half in most instances! And the view named vw_SalesOrder_Freight, which previously timed out after 42 seconds, now works perfectly. It runs the query and paints the screen in under a second!

Conclusion

This article addressed a common problem among application programmers. Indexing is often performed by a full-time Database Administrator (DBA), which many teams don’t have. This article shows you how to develop applications with better performance out of the gate, even if you don’t have a DBA. Indexing provides quick benefit without a lot of technical knowledge about indexing. Index tuning should be an ongoing process.

The sample project can be downloaded here. This ZIP file contains a non-customized application from which the screen shots were taken. It also contains the populated database and the Trace files for the examples in this article.


About the Author

Jim Murphy is the Owner and Chief Application Designer and Developer for River City Software Development LLC. Jim has been programming professionally for over 16 years in fields that include Health Care, Insurance, Banking/Lending, Real Estate, Oil & Gas, Manufacturing, and Government.

Jim was an early adopter of Iron Speed Designer and worked closely with Iron Speed Technical Engineers while the product was maturing. Jim’s major strengths are in Microsoft SQL Server, Microsoft Access, C#, Visual Basic .NET, XML, and Reporting.

Comments and Discussions

Thanks for this simple walkthrough. However, many other tuning efforts apart from indexing are required to create a high-performing production database. One performance winner for me is the elimination of "bad SQL" from my code, e.g. operators like LIKE, <>, IS NULL, etc. I would like to read more tips on improving SQL Server performance from your replies.

Yeah. Not bad. To be honest, I was hoping for more technical details. I also don't quite understand why it is based on SQL 2000. Despite that, I'm glad you did this walkthrough. I think it is a good starter article for newcomers to database design. I've been surprised at how some seasoned developers don't know this stuff already, too. I think this article makes for a good jump start for them as well.