Decrease Stored Proc Execution Time By Limiting the Table Size


Hi Guys,

Hope everyone is doing well. Last week, I was working on one of the most exciting tasks a database developer could ask for: performance tuning of a stored procedure.

Let me explain the code this stored procedure consisted of. The basic task was to truncate the data from an existing table and refill it with new data based on some fairly complex logic and calculations. Various temp tables were used across the subsequent steps of the stored procedure.

Whenever a stored procedure consists of several sub-steps, my approach is to measure the execution time of each sub-step (by running it as an ad-hoc query) and then look for ways to reduce the time taken by the slowest ones. You can also inspect the Actual/Estimated execution plans to check whether the code follows best practices.
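One simple way to time a step when you run it as an ad-hoc query is to wrap it with timestamps (or just use SET STATISTICS TIME ON). A minimal sketch; the table and column names here are hypothetical, not from the actual procedure:

```sql
SET STATISTICS TIME ON;  -- optional: reports parse/compile and execution times per statement

DECLARE @start datetime2 = SYSDATETIME();

-- The sub-step under test (hypothetical example step)
SELECT o.OrderId, o.Amount
INTO #step1_result
FROM dbo.Orders AS o
WHERE o.OrderDate >= '2020-01-01';

-- Elapsed wall-clock time for this step, in milliseconds
SELECT DATEDIFF(MILLISECOND, @start, SYSDATETIME()) AS elapsed_ms;
```

Running each sub-step this way gives you a rough per-step profile, which is enough to find where most of the procedure's time goes.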

So, after measuring the execution time of all the steps, I began to focus on the step that was consuming more than 70% of the entire stored procedure's execution time. It looked something like this:

1. Creation of a temp table, say #temptb
2. Creation of another temp table, #temptb2, which uses the previously created temp table and a master table (consisting of 40 million rows). It involved a left join with the master table, something like

#temptb left join master_table on #temptb.key = master_table.key
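Written out as a full query, that step looked roughly like the sketch below. Only the join shape comes from the original; the column names are my own placeholders:

```sql
-- Hypothetical reconstruction of the slow step:
-- every row of #temptb probes the 40-million-row master table.
SELECT t.[key],
       m.some_column           -- placeholder column
INTO #temptb2
FROM #temptb AS t
LEFT JOIN dbo.master_table AS m   -- ~40 million rows
    ON t.[key] = m.[key];
```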

What I did was shrink this master_table into a smaller version by creating another temp table at the start of the stored proc (#short_master_table, filtered with where key in (select key from #temptb)).
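A sketch of that change, again with placeholder column names (only the filtering idea is from the original):

```sql
-- Early in the proc: keep only the master rows whose key appears in #temptb.
SELECT m.*
INTO #short_master_table
FROM dbo.master_table AS m
WHERE m.[key] IN (SELECT t.[key] FROM #temptb AS t);

-- The expensive step now joins against the much smaller temp table instead:
SELECT t.[key],
       m.some_column           -- placeholder column
INTO #temptb2
FROM #temptb AS t
LEFT JOIN #short_master_table AS m
    ON t.[key] = m.[key];
```

The one-time cost of building #short_master_table is paid back because the later join scans roughly 1 million rows instead of 40 million.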

So instead of scanning all 40 million rows, the query now scanned around 1 million rows, which reduced the execution time of the stored proc by approximately 6 minutes.

Hope this makes sense. Thanks for reading, and please do share your views on how you go about fine-tuning a particular stored procedure or SQL query!