RE: Breaking up a data load to prevent log file growth

  • Vedran Kesegic (12/4/2012)


    #tempSample is never cleared. So you throw out the same rows over and over in cross apply. Is that intentional?

    You also don't need the uix_tmpSampleIndex index.

    #tmpSample is truncated right before the WAITFOR, so the rows aren't reprocessed. As far as the index on #tempSample goes, I found it speeds things up a little in my testing. To answer your second question, which I missed when I answered from my phone: the log that balloons is the log on our staging database, where this is taking place.

    The question is not so much about speed: each iteration of the loop takes 3 seconds to produce 500,000 expanded records, and that includes the WAITFOR. Speed is not the primary issue, as this is only the final step of a data import (I have 250,000 records to expand into 250,000,000, so it takes some time by itself, along with the other data). My question was about the method and whether there is anything better to minimize load on the system. The thing that bothers me is the loop, but I think the nature of this beast will require it.

    I do thank you for your improvements and will look at implementing them after some testing. I do like the filtered index on isProcessed, which will speed up finding the top 500. I'm also thinking of adding MAXDOP 1 to the queries to force them not to go parallel, taking advantage of the same thing that makes the quirky update work.
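    To make the discussion concrete, here is a minimal sketch of the kind of batched loop being described, combining the filtered index on isProcessed, a TOP (n) batch, MAXDOP 1, and the WAITFOR pause. All object names (#tmpSample, stg.ExpandedData, dbo.fn_ExpandRow) and the batch size are illustrative assumptions, not the actual schema from this thread:

    ```sql
    -- Filtered index keeps the "find next unprocessed batch" seek cheap
    CREATE NONCLUSTERED INDEX ix_tmpSample_unprocessed
        ON #tmpSample (SampleId)
        WHERE isProcessed = 0;

    DECLARE @batchSize int = 500;   -- source rows per pass; each expands ~1000x

    WHILE 1 = 1
    BEGIN
        -- Expand the next batch of unprocessed rows into the staging table
        INSERT INTO stg.ExpandedData (SampleId, ExpandedValue)
        SELECT t.SampleId, x.ExpandedValue
        FROM (SELECT TOP (@batchSize) SampleId
              FROM #tmpSample
              WHERE isProcessed = 0
              ORDER BY SampleId) AS t
        CROSS APPLY dbo.fn_ExpandRow(t.SampleId) AS x   -- hypothetical expansion TVF
        OPTION (MAXDOP 1);                              -- force a serial plan

        IF @@ROWCOUNT = 0 BREAK;    -- nothing left to expand

        -- Mark that same batch done so it is not picked up again
        UPDATE t SET isProcessed = 1
        FROM #tmpSample AS t
        WHERE t.SampleId IN (SELECT TOP (@batchSize) SampleId
                             FROM #tmpSample
                             WHERE isProcessed = 0
                             ORDER BY SampleId);

        CHECKPOINT;                 -- under SIMPLE recovery, lets the log reuse space between batches
        WAITFOR DELAY '00:00:03';   -- breathing room for other workloads
    END
    ```

    Keeping each batch in its own implicit transaction is what caps log growth: under the SIMPLE recovery model the checkpoint between iterations allows the log to truncate, so the log only needs to hold one batch's worth of work rather than all 250 million rows at once.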
