• Koen Verbeeck (12/27/2012)


    There are several options:

    * recreating the table as you did before, but without SELECT INTO, and using the simple recovery model.

    * updating the table in batches: run the UPDATE statement for, say, 10,000 rows at a time.

    * maybe you can temporarily disable/drop the indexes and recreate them after the update statement.

    Thanks for writing.
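
    On your third suggestion, if I understand it correctly, it would look something along these lines, with only the nonclustered indexes disabled (the table and index names below are just placeholders for my real ones):

        -- Disable only nonclustered indexes; disabling the clustered index
        -- would make the table inaccessible.
        ALTER INDEX IX_BigTable_StatusCol ON dbo.BigTable DISABLE;

        -- ... run the large UPDATE here ...

        -- REBUILD re-enables a disabled index.
        ALTER INDEX IX_BigTable_StatusCol ON dbo.BigTable REBUILD;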

    - If I create the table in advance with the primary key and identity column, as I tried earlier, how do I load all the data into it while still taking advantage of minimally logged operations? SELECT INTO gave me that flexibility, but without it, how should I proceed?
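
    Is something like the sketch below what you have in mind, i.e. creating the table first and then loading it with INSERT ... SELECT and a TABLOCK hint? My understanding is that on SQL Server 2008 and later this can be minimally logged when the target is empty and the database is in the simple or bulk-logged recovery model, but I am not sure of the exact conditions. (Table and column names are placeholders.)

        -- Pre-create the target with its identity column and primary key.
        CREATE TABLE dbo.BigTable_New
        (
            Id        int IDENTITY(1,1) NOT NULL,
            StatusCol int               NOT NULL,
            Payload   varchar(100)      NULL,
            CONSTRAINT PK_BigTable_New PRIMARY KEY CLUSTERED (Id)
        );

        -- Load it under a table lock; preserving the existing Id values would
        -- additionally need SET IDENTITY_INSERT dbo.BigTable_New ON and an
        -- explicit Id in the column list.
        INSERT INTO dbo.BigTable_New WITH (TABLOCK) (StatusCol, Payload)
        SELECT StatusCol, Payload
        FROM   dbo.BigTable;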

    - I tried updating the table in batches using the SET ROWCOUNT 10000 option. However, in my UPDATE statement the condition is such that the column may change from 1 to 2, and the next time the update runs, the rows that already hold the value '2' qualify and get updated again; my join condition forces it. So it looks like I will end up modifying a large number of rows many times over, and an UPDATE statement is fully logged, as I understand it.
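
    For reference, the pattern I was trying looks roughly like the sketch below, except that I have swapped SET ROWCOUNT for UPDATE TOP (the recommended way to limit the rows affected by DML) and added a filter on the old value so that already-converted rows are skipped and each batch makes progress. I am not sure that filter is compatible with my real join condition, though. (Names are placeholders.)

        DECLARE @batch int;
        DECLARE @rows  int;
        SET @batch = 10000;
        SET @rows  = 1;

        WHILE @rows > 0
        BEGIN
            UPDATE TOP (@batch) t
            SET    t.StatusCol = 2
            FROM   dbo.BigTable AS t
            INNER JOIN dbo.OtherTable AS o
                    ON o.KeyCol = t.KeyCol
            WHERE  t.StatusCol = 1;   -- skip rows already set to 2 so the loop terminates

            SET @rows = @@ROWCOUNT;

            -- In the simple recovery model the log space used by each committed
            -- batch becomes reusable after a checkpoint.
        END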