• Super_Grover (10/23/2016)


    Hi all,

    I have to transfer a big table (231k rows, which isn't that many, but 735GB in size because of a blob column) to another location using INSERT INTO ... SELECT FROM ...

    It appears that the whole source table is read first, before any record is actually inserted into the destination table, and there isn't enough disk space because tempdb just grows and grows.

    To work around this, I thought that if I loop through the table in batches of, let's say, 1000 or 5000 records, maybe tempdb wouldn't grow that much. Or do I need the space (double the table size?) anyway?

    So, how do I set a counter from record 1 to 1000, select those records and insert them, then advance the counter to records 1001 to 2000, and so on?
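
    In rough T-SQL, this is the kind of loop I'm picturing (just a sketch; dbo.SourceTable, dbo.DestTable, Id and BlobCol are placeholder names, and OFFSET/FETCH needs SQL Server 2012 or later):

        DECLARE @BatchSize int = 1000,
                @Offset    int = 0,
                @Rows      int = 1;

        WHILE @Rows > 0
        BEGIN
            -- Copy the next window of rows in key order
            INSERT INTO dbo.DestTable (Id, BlobCol)
            SELECT Id, BlobCol
            FROM dbo.SourceTable
            ORDER BY Id
            OFFSET @Offset ROWS FETCH NEXT @BatchSize ROWS ONLY;

            SET @Rows = @@ROWCOUNT;    -- captured right after the INSERT
            SET @Offset = @Offset + @BatchSize;
        END;

    But would that actually keep tempdb from growing, or does each batch still need the sort space?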

    Thanks in advance!

    Do you have a unique key on the table? If so, it's easy to use that key to define the batches.
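
    For example, assuming an integer unique key named Id (all object names here are placeholders), a sketch like this walks the table one key range at a time, so each INSERT commits on its own and only one batch is in flight:

        DECLARE @BatchSize int = 5000,
                @LastId    int = 0,    -- start below the smallest key value
                @NextId    int;

        WHILE 1 = 1
        BEGIN
            -- Find the upper key bound of the next batch
            SELECT @NextId = MAX(Id)
            FROM (SELECT TOP (@BatchSize) Id
                  FROM dbo.SourceTable
                  WHERE Id > @LastId
                  ORDER BY Id) AS nextBatch;

            IF @NextId IS NULL BREAK;  -- nothing left to copy

            -- Copy just that key range
            INSERT INTO dbo.DestTable (Id, BlobCol)
            SELECT Id, BlobCol
            FROM dbo.SourceTable
            WHERE Id > @LastId AND Id <= @NextId;

            SET @LastId = @NextId;
        END;

    Because the WHERE clause seeks on the key, each batch reads only its own rows instead of rescanning the table from the start every time.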

    😎

    231K rows totaling 735GB works out to roughly 3.25MB per row; what kind of data are you storing there?