• mayurkb (6/25/2013)


    That's what I was thinking. I do have a prototype ready with this kind of approach. The design is a stored proc called from a .NET application hosted in Windows Task Scheduler. (Cannot use a SQL Agent job here because it must also run on SQL Express.)

    In the test run it is timing out. I have about 20,000 records in each of the tables, and before blindly increasing the timeout I wanted to explore other design options.

    Meaning you have 20,000 rows in each of 30 tables (around 600,000 rows total)? That's not what I would consider a large amount of data, but it is a pretty good-sized delete to run every day. There are a few things to consider performance-wise when deleting rows: indexing, whether the row count is high enough to warrant deleting in batches, etc. Given that amount of data removal, you should probably update your statistics after you remove the data.
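    A minimal sketch of the batched-delete idea, assuming a hypothetical dbo.EventLog table with a CreatedDate column and a 30-day retention window (the table, column, batch size, and retention period are all placeholders, not anything from your schema):

```sql
-- Delete old rows in batches to keep each transaction short
-- and reduce the chance of lock escalation and log growth.
DECLARE @BatchSize int = 5000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize)
    FROM dbo.EventLog
    WHERE CreatedDate < DATEADD(DAY, -30, GETDATE());

    -- Stop once a batch deletes fewer rows than the batch size.
    IF @@ROWCOUNT < @BatchSize
        BREAK;
END

-- After a large delete, refresh statistics so the optimizer
-- has accurate row counts for subsequent queries.
UPDATE STATISTICS dbo.EventLog;
```

    Run per table from your stored proc; because each batch commits separately, the .NET command is far less likely to hit its timeout than a single 600,000-row delete would be.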

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/