• I could, but selecting all that data out twice is very slow. The table is 12GB, so it would probably take at least a few hours this way. That is why I was thinking about using the same concept but with BCP, as it is built to deal with large datasets. I did a test run to copy the data out, and even with BCP it took 1h 40m to write everything to a file. That doesn't include truncating the table and then reinserting the data.
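    For reference, this is roughly the round trip I tested. It's a sketch only: the server name, database/table name, and file path are placeholders, and your authentication options may differ. The flags shown (`-n` native format, `-T` trusted connection, `-b` batch size, `-h "TABLOCK"`) are standard `bcp` options:

    ```shell
    # Export in native format (-n), which avoids character conversion and is
    # the fastest option for a SQL-Server-to-SQL-Server round trip.
    # ServerName, MyDb.dbo.BigTable, and the file path are placeholders.
    bcp MyDb.dbo.BigTable out /data/bigtable.dat -n -S ServerName -T

    # ...truncate the table and disable/drop nonclustered indexes here...

    # Reload in batches (-b) so the transaction log doesn't balloon, with a
    # table lock hint (-h "TABLOCK") so the insert can be minimally logged.
    bcp MyDb.dbo.BigTable in /data/bigtable.dat -n -S ServerName -T -b 100000 -h "TABLOCK"
    ```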

    The issue is that we don't really have many maintenance windows to run things like this, so I'm looking for the fastest way possible. Are there any options I can add to an index rebuild command, or something along those lines?

    Do you have any other ideas?