We need to change a column's data type from INT to BIGINT in an existing database. In total there are about 10 billion rows distributed across 90 tables that will need to be updated. Unfortunately we can't incur downtime of more than 5-10 hours. Our DBAs have tested, and the maximum throughput they can get using the standard BCP technique is around 5 million rows converted per minute, which blows right past the allowed downtime window.
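For context, here's the back-of-the-envelope math using the figures above (10 billion rows at 5 million rows/minute), which shows just how far outside the window we are:

```python
# Quick estimate of total conversion time at the measured BCP throughput.
total_rows = 10_000_000_000   # ~10 billion rows across 90 tables
rows_per_minute = 5_000_000   # max throughput the DBAs measured

minutes = total_rows / rows_per_minute   # 2000 minutes
hours = minutes / 60                     # ~33.3 hours

print(f"{hours:.1f} hours needed vs. a 5-10 hour window")
```

So even at peak measured throughput we'd need roughly 33 hours, three to six times the allowed outage.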
I was wondering whether anybody has run up against a similar problem, and whether there's a way to use a backup plus replication or log shipping as a solution (i.e., update all the records in an offline backup while production keeps running, bring the backup up to date using replication or log shipping, then re-point production at the backup), or any hardware solutions that might work.
Thanks in advance for any help you can provide.