DBCC Update statistics

  • I am having performance issues in the following scenario:

    I have a table named T1 – a staging table. (The process described below is a regular production run that executes every month.)

    I upload or insert data into T1 using 15 to 20 independent jobs that call the same stored procedure with different input parameters. (Each job uploads 35,000 to 70,000 records into T1.)

    Note: Before starting the first job I truncate table T1. (T1 is truncated only at this initial stage, not before every job.)

    In between the insert jobs I also read data from table T1 – and here is the problem I am facing: it takes a very long time to fetch the data from T1.

    Checklist:

    I checked whether any process was blocking reads on that table – no process was holding a lock on it.

    No other SPID was blocking the current SPID.

    Temporary solution we are using (sketched in T-SQL at the end of this post):

    Kill the current process (SPID)

    Update statistics for table T1.

    Restart the same read-data process/job.

    After that it works fine.

    Our concern: how do we handle this situation without having to run UPDATE STATISTICS manually?

    Note: The inserts into T1 and the reads from T1 do not coincide; only one process runs at a time.

    Please suggest a suitable solution for this.
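
    For reference, a rough T-SQL sketch of the checks and the temporary workaround described above (the SPID value and the dbo schema name are only examples):

    -- Check whether any other session is blocking the slow read session
    -- (sys.dm_exec_requests exposes blocking_session_id)
    SELECT session_id, blocking_session_id, wait_type, wait_time
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;

    -- Kill the stuck read session (57 is only an example SPID)
    KILL 57;

    -- Refresh statistics on the staging table so the optimizer
    -- sees the newly inserted rows
    UPDATE STATISTICS dbo.T1;

    -- The read job is then restarted from the scheduler (e.g. SQL Server Agent)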

  • You should update the statistics on your tables on a weekly basis (during an off-peak hour); this improves the performance of your queries. A minimal example is sketched after the links below.

    Refer to these links:

    http://msdn.microsoft.com/en-us/library/ms187348.aspx

    http://www.sqlskills.com/blogs/Kimberly/post/Auto-update-statistics-and-auto-create-statistics-should-you-leave-them-on-andor-turn-them-on.aspx
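
    A minimal sketch of that kind of maintenance step, assuming the staging table from the question is dbo.T1 (in practice this would be wrapped in a SQL Server Agent job scheduled for an off-peak hour):

    -- Refresh statistics for every table in the current database;
    -- sp_updatestats only updates statistics that have had row modifications
    EXEC sp_updatestats;

    -- Or, after the monthly loads finish, target the staging table directly
    UPDATE STATISTICS dbo.T1 WITH FULLSCAN;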



    Pradeep Singh
