updating 180 million records

  • I have a database I'm updating with an internal unique identifier, which I've created a function for; it will process about 10,000 records a second. The problem is I need to do this to 180 million records. Right now I'm looping through the table 10,000 rows at a time, with an index on the id column, taking the top 10,000 that are null and updating them, but even this is taking way too long... any suggestions would be great.
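
    For reference, a minimal T-SQL sketch of that batching pattern. The table, column, and function names (dbo.MyTable, InternalId, dbo.fn_MakeId) are hypothetical stand-ins, since the poster's actual code isn't shown:

        SET ROWCOUNT 10000                  -- cap each UPDATE at one batch (classic SQL 2000 style)
        DECLARE @rows INT
        SET @rows = 1

        WHILE @rows > 0
        BEGIN
            -- relies on the index on the id column to find unprocessed rows quickly
            UPDATE dbo.MyTable
            SET InternalId = dbo.fn_MakeId()    -- the poster's id function; signature assumed
            WHERE InternalId IS NULL

            SET @rows = @@ROWCOUNT          -- drops to 0 once no NULL rows remain
        END

        SET ROWCOUNT 0                      -- remove the batch cap

    Under the default autocommit mode each pass commits on its own, so locks are held only briefly and the transaction log stays manageable.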

  • It depends on the indexes you have in place and whether the unique identifier is the primary key and a clustered index. If it isn't clustered, you will get a performance increase by making it so, though with that many rows the change itself will take a while. You can also speed things up by dropping any other indexes and recreating them afterward; the reason is that each change you make updates every index whose data is affected. Other than that, can you post your code and the results of SET SHOWPLAN_TEXT ON (the execution plan) so we can see what it is doing and offer suggestions related to it directly? A sketch of both ideas follows below.

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)

  • If my math is right (180,000,000 rows at 10,000 per second is 18,000 seconds), that works out to about 5 hours. Doesn't seem totally unreasonable, especially since you're running it in batches small enough that it won't lock the entire table.

    Andy

    http://www.sqlservercentral.com/columnists/awarren/
