Large Data Set Stored Procedure

  • This is an optimization question. I have a query with 20 inner joins, and its result set is 237,000 rows of data. I have tried breaking the joins down into temporary tables which are later joined together (a stripped-down sketch of that pattern is at the end of this post), but it still takes a good amount of time to return the result set (3 minutes to 4 minutes 30 seconds). This may be a vague question to start with, but I am happy to provide as much additional detail as needed to make it clear.

    Put crisply, here are the questions I have:

    1) Does having too many temporary tables and indexes also hurt performance?

    2) Does inserting the final result set into another table and then selecting from that table make things any better?

    What would be a good resource for understanding how joins and temporary tables impact performance?
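
    To illustrate, here is a minimal sketch of the temp-table pattern I am using; the table, column, and index names below are made up for the example, and the real procedure has far more joins:

    -- Stage only the columns the later joins actually need; a narrow
    -- temp table keeps I/O down (names here are hypothetical).
    SELECT o.OrderID, o.CustomerID, d.ProductID, d.Quantity
    INTO #OrderDetailStage
    FROM dbo.Orders AS o
    INNER JOIN dbo.OrderDetails AS d
        ON d.OrderID = o.OrderID;

    -- Index the temp table on the column used in the next join so the
    -- optimizer can seek rather than scan.
    CREATE CLUSTERED INDEX IX_OrderDetailStage_ProductID
        ON #OrderDetailStage (ProductID);

    -- Join the staged rows to the remaining tables and return the result.
    SELECT s.OrderID, s.CustomerID, p.ProductName, s.Quantity
    FROM #OrderDetailStage AS s
    INNER JOIN dbo.Products AS p
        ON p.ProductID = s.ProductID;

    DROP TABLE #OrderDetailStage;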

  • Wow, 20 joins seems like quite a lot. I wonder if perhaps you are trying to achieve too much in a single procedure. Are you returning the entire database? 🙂

    Seriously though, it's difficult to provide focused guidance without further information.

    1. Are you able to provide the source code that you are describing?

    2. What is the business context, i.e. what are you trying to achieve with this procedure?

  • You should enable the Execution Plan before executing the query. The Execution Plan will show you whether any table needs an additional index, and it also gives you the actual query execution time.

    The execution time you measured depends on how busy the SQL Server is; if the server is under load, the same query may take longer to run, so the Execution Plan is a better way to evaluate query performance.
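
    For example, in SSMS you can turn on "Include Actual Execution Plan" (Ctrl+M) before running the procedure, or capture timing and I/O figures with the SET options below; the procedure name here is just a placeholder for yours:

    SET STATISTICS TIME ON;   -- reports CPU and elapsed time per statement
    SET STATISTICS IO ON;     -- reports logical/physical reads per table

    EXEC dbo.usp_LargeDataSetReport;   -- placeholder for your procedure

    SET STATISTICS TIME OFF;
    SET STATISTICS IO OFF;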

    Thanks
