SQLServerCentral Article

A Case Study in Performance Tuning


The client for whom I am working recently released version 1.0 of an application to be used in their centers throughout the US and Canada. This application had undergone extensive testing and was rolled out to select beta sites. Once in the field, a performance issue was seen in a few centers. Initial analysis pointed to a specific stored procedure. However, the problem could not be replicated in the development environments.

The procedure was edited and reviewed by several developers. Queries were rewritten and query hints applied, yet the performance problems persisted. Finally, the issue was presented to me for further analysis.

The first step in my analysis was to create some benchmark timing runs of the stored procedure. In my local test database, the stored procedure averaged a run time of approximately 240 seconds. While this was within the bounds of the application’s timeout, it was running much longer in the problem centers. Unfortunately, the server platforms and hardware configurations in the centers vary significantly, so it was no surprise that run times differed from those in the development environments.
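A benchmark of this kind can be as simple as bracketing the procedure call with timestamps. A minimal sketch (the procedure name here is a placeholder, not the actual procedure from the application):

```sql
-- Simple timing harness; dbo.usp_ProblemProcedure is a placeholder name.
DECLARE @start DATETIME, @elapsed INT

SET @start = GETDATE()
EXEC dbo.usp_ProblemProcedure        -- the procedure under investigation
SET @elapsed = DATEDIFF(second, @start, GETDATE())

PRINT 'Elapsed seconds: ' + CAST(@elapsed AS VARCHAR(10))
```

Running this several times and averaging the results gives a repeatable baseline to compare against after each change.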

The next step was to determine the execution plan and perform a server trace of the stored procedure. This process highlighted two update statements that were the source of the bottleneck. Reviewing the exact steps did not lead me to an immediate solution. The queries, although complex, were written as efficiently as possible.
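One way to capture both the executed plan and per-statement timings from a query window, rather than a full server trace, is with the session-level STATISTICS options (again, the procedure name is a placeholder):

```sql
-- Returns the actual execution plan rows and CPU/elapsed times
-- interleaved with the procedure's output.
SET STATISTICS PROFILE ON
SET STATISTICS TIME ON

EXEC dbo.usp_ProblemProcedure        -- placeholder name

SET STATISTICS TIME OFF
SET STATISTICS PROFILE OFF
```

Output like this is what made the two update statements stand out: their elapsed times dwarfed everything else in the procedure.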

The final step was to use the Index Tuning Wizard to see whether any additional indices were indicated. The run through the Wizard suggested several improvements: a new non-clustered index on one of the base tables used in the procedure; a schema-bound view with a clustered index and a non-clustered index; and several CREATE STATISTICS commands.
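For readers unfamiliar with schema-bound (indexed) views, the Wizard's suggestion takes roughly this shape. The view and column names below are illustrative only; the actual schema is not shown in this article:

```sql
-- Illustrative example of an indexed view; names are hypothetical.
CREATE VIEW dbo.vw_OrderTotals
WITH SCHEMABINDING                    -- required before a view can be indexed
AS
SELECT o.CustomerID,
       COUNT_BIG(*)       AS OrderCount,   -- COUNT_BIG is required with GROUP BY
       SUM(o.TotalAmount) AS TotalSpend
FROM dbo.Orders AS o
GROUP BY o.CustomerID
GO

-- The unique clustered index is what materializes the view's result set.
CREATE UNIQUE CLUSTERED INDEX IX_vw_OrderTotals
    ON dbo.vw_OrderTotals (CustomerID)
```

Once the clustered index exists, the optimizer can answer matching queries from the materialized view instead of re-aggregating the base tables.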

At last it appeared that an answer was at hand. Applying the suggested changes improved runtime from approximately 240 seconds down to 7-8 seconds. This was exactly the kind of performance gain we were looking for. Viewing the new execution plans showed that the bottleneck queries in the stored procedure were now using the schema-bound view and their execution time dropped considerably.

Unfortunately, applying the changes and testing from the application resulted in an error: “INSERT failed because the following SET options have incorrect settings: 'ARITHABORT'.” A more thorough reading of BOL indicated that any batch which updates the underlying tables of a schema-bound view that has a clustered index must first set ARITHABORT ON. The MSDN knowledge base (KB305333) added more to the story: OLE DB and ODBC connections do not specify an ARITHABORT setting, so they fall back to the server default, which is ARITHABORT OFF. The suggested resolution is to execute “SET ARITHABORT ON” from the command object before executing any queries.
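In batch form, the workaround described in the knowledge base article amounts to this (procedure name again a placeholder):

```sql
-- Issue on the connection before any statement that modifies
-- the base tables of an indexed view.
SET ARITHABORT ON
EXEC dbo.usp_ProblemProcedure        -- placeholder name
```

The catch, of course, is that from an OLE DB or ODBC client this must be sent on every connection the application opens, which is why it touched so much of the application code.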

While the suggested resolution works, it would have necessitated extensive regression testing for our application and would have significantly delayed deploying a fix. So, I had to return to the drawing board. I decided to break down the Index Tuning Wizard suggested script and apply each step individually, with a timing run of the stored procedure in between each step. The results were very illuminating.

The generated script applied the changes in the following order:

  1. Create a new non-clustered index on the table that is the target of the updates
  2. Create the schema-bound view
  3. Create the clustered index and non-clustered index on the new view
  4. Create statistics for several of the tables used in the procedure

I altered the generated script, inserting a timing run of the stored procedure between each step. Additionally, I re-ordered the steps, moving the CREATE STATISTICS commands to the second step. What I found was that applying the new non-clustered index had the greatest impact on performance; the remaining steps added little or nothing. With this additional analysis of the suggested script, we went from database changes that would also have required executable changes and extensive testing to a single index creation that required minimal testing.
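The altered script followed this pattern, with all object names below standing in for the real ones:

```sql
-- Step 1: the new non-clustered index (hypothetical table/column names),
-- followed immediately by a timing run of the procedure.
CREATE NONCLUSTERED INDEX IX_TargetTable_KeyCol
    ON dbo.TargetTable (KeyCol)

EXEC dbo.usp_TimeRun    -- hypothetical wrapper that times the procedure

-- Step 2 (re-ordered): statistics, then another timing run.
CREATE STATISTICS ST_Table1_Col1 ON dbo.Table1 (Col1)

EXEC dbo.usp_TimeRun

-- Steps 3 and 4: the view and its indexes, each followed by a timing run.
```

Interleaving the measurements this way is what isolated the non-clustered index as the single change responsible for the improvement.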


I learned several valuable lessons from this process. The additional knowledge of the nuances of schema-bound views was certainly worth the work. But the most important lesson was the importance of thoroughly analyzing the suggestions of the Index Tuning Wizard. The Wizard is a great tool for suggesting improvements. But they are just that: suggestions. They need to be further analyzed to determine which suggestions should be carried out and which should be ignored.

