
Restructure 100 Million Row (or more) Tables in Seconds. SRSLY!
Posted Monday, April 25, 2011 9:34 PM


Old Hand


Group: General Forum Members
Last Login: Thursday, April 17, 2014 3:01 PM
Points: 342, Visits: 1,781
Comments posted to this topic are about the item Restructure 100 Million Row (or more) Tables in Seconds. SRSLY!




SQL# - http://www.SQLsharp.com/
Post #1098390
Posted Tuesday, April 26, 2011 6:43 AM
SSC Veteran


Group: General Forum Members
Last Login: Monday, April 21, 2014 6:43 AM
Points: 239, Visits: 999
Solomon, this is a great post, and thanks for sharing! Just last week I took a similar approach with 4 large tables that had multiple columns being changed from varchar to nvarchar to support our localization effort. Many of the techniques you mention here were used, with the exception of triggers. Since we were on SQL Server 2008, I was able to use Change Tracking in place of triggers, which resulted in zero modifications to the existing tables. I'd be interested to hear how other folks have used Change Tracking or Change Data Capture. Thanks again for sharing!
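For anyone curious how the Change Tracking route looks in practice, here is a minimal sketch of the setup (the database, table, and key names are hypothetical; the syntax requires SQL Server 2008 or later):

```sql
-- Enable Change Tracking at the database level (hypothetical names throughout)
ALTER DATABASE MyAppDb
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Enable it on the table being restructured
ALTER TABLE dbo.BigTable
    ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = OFF);

-- Capture a baseline version BEFORE the bulk copy starts
DECLARE @baseline BIGINT = CHANGE_TRACKING_CURRENT_VERSION();

-- ... bulk-copy rows into the new table here ...

-- Then pull only the rows that changed during the copy
SELECT ct.SYS_CHANGE_OPERATION, ct.PK_ID, t.Col1, t.Col2
FROM CHANGETABLE(CHANGES dbo.BigTable, @baseline) AS ct
LEFT JOIN dbo.BigTable AS t
       ON t.PK_ID = ct.PK_ID;  -- t.* columns come back NULL for deleted rows
```

The delta query can be re-run with successively newer baseline versions until the remaining change set is small enough to apply during the final cutover.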

Luke C
MCSE: Data Platform, MCP, MCTS, MCITP - Database Administrator & Database Developer
Post #1098583
Posted Tuesday, April 26, 2011 7:21 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, August 03, 2012 6:18 AM
Points: 1, Visits: 70
Suppose I have 1 billion records in the table. What would the time complexity be? Could you please explain?
Post #1098611
Posted Tuesday, April 26, 2011 7:36 AM
SSC Rookie


Group: General Forum Members
Last Login: Monday, January 13, 2014 12:36 PM
Points: 33, Visits: 34
This is a very good article. My client here makes enhancements to the application frequently, and I end up adding columns to a table with over 3 million rows. Any suggestions on how I would handle this scenario?
Post #1098623
Posted Tuesday, April 26, 2011 11:55 AM
SSCertifiable


Group: General Forum Members
Last Login: Tuesday, April 01, 2014 8:03 PM
Points: 6,266, Visits: 2,027
The article was well explained and straight to the point.
The title, on the other hand, is *VERY* misleading (seconds?). Apparently you don't count the time spent setting it up AND processing it all ...



* Noel
Post #1098881
Posted Tuesday, April 26, 2011 12:11 PM


Old Hand


Group: General Forum Members
Last Login: Thursday, April 17, 2014 3:01 PM
Points: 342, Visits: 1,781
Luke C (4/26/2011)
Solomon, this is a great post, and thanks for sharing! Just last week I took a similar approach with 4 large tables that had multiple columns being changed from varchar to nvarchar to support our localization effort. Many of the techniques you mention here were used, with the exception of triggers. Since we were on SQL Server 2008, I was able to use Change Tracking in place of triggers, which resulted in zero modifications to the existing tables. I'd be interested to hear how other folks have used Change Tracking or Change Data Capture.


Hey Luke. Thanks! CDC is an interesting option that I had not thought of, since we are on SQL Server 2005. But if I get the chance (someday), I will revise this article with that option.
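For reference, a rough sketch of what the CDC variant might look like on SQL Server 2008 or later (the schema and table names are hypothetical; CDC also requires SQL Server Agent to be running for the capture job):

```sql
-- Enable CDC on the database, then on the table (hypothetical names)
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'BigTable',
    @role_name     = NULL;          -- NULL = no gating role on the change data

-- After the bulk copy, read every change captured in the interim
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn(N'dbo_BigTable'),
        @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_BigTable(@from_lsn, @to_lsn, N'all');
```

Unlike triggers, CDC reads changes asynchronously from the transaction log, so the source table itself is never modified and writers pay no synchronous cost.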

Take care,
Solomon...





SQL# - http://www.SQLsharp.com/
Post #1098894
Posted Tuesday, April 26, 2011 12:14 PM


Old Hand


Group: General Forum Members
Last Login: Thursday, April 17, 2014 3:01 PM
Points: 342, Visits: 1,781
sathishsathish81 (4/26/2011)
Suppose I have 1 billion records in the table. What would the time complexity be? Could you please explain?


Hello. The amount of time it takes to move the data over varies based on several factors:

1) How wide is each row
2) How much activity/contention is there on the table
3) How fast is the underlying disk subsystem
4) etc.

So it really takes some testing on each system to get a decent idea.
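As a starting point for that testing, one common pattern is to copy in small keyed batches so each transaction stays short and the source table stays responsive. A rough sketch (the table, key, and column names are hypothetical):

```sql
-- Copy rows in keyed batches; short transactions limit blocking and log growth
DECLARE @batch  INT = 10000,
        @lastId INT = 0,
        @rows   INT = 1;

WHILE @rows > 0
BEGIN
    INSERT INTO dbo.BigTable_New (PK_ID, Col1, Col2)
    SELECT TOP (@batch) s.PK_ID, s.Col1, s.Col2
    FROM dbo.BigTable AS s
    WHERE s.PK_ID > @lastId          -- resume from the last key copied
    ORDER BY s.PK_ID;

    SET @rows = @@ROWCOUNT;

    SELECT @lastId = MAX(PK_ID) FROM dbo.BigTable_New;

    WAITFOR DELAY '00:00:01';        -- brief pause to yield to other activity
END;
```

Timing a few of these batches on production-like hardware gives a reasonable projection: total time is roughly (row count / batch size) multiplied by the observed per-batch duration, plus whatever delay is inserted between batches.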

Take care,
Solomon...





SQL# - http://www.SQLsharp.com/
Post #1098897
Posted Tuesday, April 26, 2011 12:15 PM


Old Hand


Group: General Forum Members
Last Login: Thursday, April 17, 2014 3:01 PM
Points: 342, Visits: 1,781
jvrakesh-858370 (4/26/2011)
This is a very good article. My client here makes enhancements to the application frequently, and I end up adding columns to a table with over 3 million rows. Any suggestions on how I would handle this scenario?


Hello. I don't really understand your question. Can you please explain in more detail? Thanks.

Take care,
Solomon...





SQL# - http://www.SQLsharp.com/
Post #1098898
Posted Tuesday, April 26, 2011 12:22 PM


Old Hand


Group: General Forum Members
Last Login: Thursday, April 17, 2014 3:01 PM
Points: 342, Visits: 1,781
noeld (4/26/2011)
The article was well explained and straight to the point.
The title on the other hand is *VERY* misleading (seconds?) Apparently you don't count the time you spent setting AND processing it all ...


Hi Noel. I am sorry you feel it was misleading, but I did address this at the beginning of the Overview section of the article. The main intent of making the changes "quickly" was to minimize customer / system impact. The end result is that from the outside (and from the perspective of the application), the restructuring does indeed take only seconds. I also stated in the article summary that this is a way to make large-scale changes that do not require downtime or a maintenance window.
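To illustrate why only the final step is visible to the application, here is a sketch of how such a swap is often done once the old and new tables are fully in sync (the table names are hypothetical):

```sql
-- The only moment the application can notice: two metadata-only renames
-- inside one short transaction (needs a brief schema-modification lock)
BEGIN TRANSACTION;

EXEC sp_rename 'dbo.BigTable',     'BigTable_Old';
EXEC sp_rename 'dbo.BigTable_New', 'BigTable';

COMMIT TRANSACTION;
```

All of the preparation (creating the new structure, trickle-copying the rows, syncing the stragglers) happens off the critical path; only these renames block concurrent access, which is where the "seconds" framing comes from.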

Take care,
Solomon...





SQL# - http://www.SQLsharp.com/
Post #1098909
Posted Tuesday, April 26, 2011 1:30 PM
SSC Eights!


Group: General Forum Members
Last Login: Monday, October 21, 2013 11:43 PM
Points: 945, Visits: 1,234
Excellent stuff and very detailed. Thanks for sharing. I have done something similar in the past but never thought of using triggers; instead I quickly compared the data between the old and new tables using Red Gate's data compare tool.
But I guess using the Change Tracking feature makes it even more convenient.

Thanks!


Amol Naik
Post #1098958