• Boris Pazin (8/17/2013)


    The biggest table has about 76 million rows. So I guess deleting and re-creating 76M rows could be harmful for all queries that use that table?

    No, doing that is fine. A data change of that volume would invalidate all plans that reference the table, so they'd be recompiled the next time they run.
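
    A minimal sketch of that case, assuming a hypothetical table dbo.BigTable: after a bulk delete and reload you can refresh the statistics yourself rather than waiting for the auto-update threshold to be crossed by the next query.

    -- Refresh statistics on the reloaded table so new plans compile against
    -- accurate row counts (table name is hypothetical).
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;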

    Where you can have problems is with smaller data changes, around 10%, if they change the results of queries dramatically (e.g. http://sqlinthewild.co.za/index.php/2011/03/22/statistics-row-estimations-and-the-ascending-date-column/), or when you run identical queries/procedures with different parameter values that return radically different data volumes.
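
    A minimal sketch of that second case, with hypothetical procedure, table, and column names: a procedure whose parameter values can return wildly different row counts, where OPTION (RECOMPILE) forces a fresh plan per execution so a plan built for a value returning a handful of rows isn't reused for one returning millions.

    CREATE PROCEDURE dbo.GetOrdersByCustomer
        @CustomerID int
    AS
    BEGIN
        SELECT OrderID, OrderDate, TotalDue
        FROM dbo.Orders
        WHERE CustomerID = @CustomerID
        -- Recompile on every execution so the plan always matches the
        -- data volume for the supplied parameter value.
        OPTION (RECOMPILE);
    END;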

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass