
'Select Into' logs
Posted Wednesday, November 14, 2012 2:38 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, September 18, 2013 9:05 AM
Points: 8, Visits: 67
Hi all,
I need to write a stored procedure that moves large amounts of old data on a nightly basis to another DB where it will be backed up and archived.
The move is going to be just under 1 million rows per evening, and I was thinking of using the following:

USE Weblogs_Backup
GO
SELECT *
INTO dbo.IISLog
FROM WebLogs.dbo.IISLog
-- cast strips the time portion of GETDATE() so the date comparison can match
WHERE [DATE] = CAST(GETDATE() - 31 AS date)


However, I just need to double-check: if I change the Recovery Model to Bulk-Logged, will that prevent the log files from growing substantially during the process, or would I be better off scripting an export via an SSIS package?

Cheers in advance.
T
Post #1384486
Posted Wednesday, November 14, 2012 2:44 AM
SSC Eights!


Group: General Forum Members
Last Login: Today @ 7:24 AM
Points: 884, Visits: 2,414
If logging isn't a priority on the archive DB, then set its recovery model to Simple.

Also, does IISLog get deleted/renamed every day as part of the archive process?

A SELECT ... INTO will try to recreate the IISLog table every time, so you might be better off with an INSERT INTO.
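
Something along these lines, as a rough sketch (it assumes the archive table Weblogs_Backup.dbo.IISLog has already been created with the same column layout as the source; the names are taken from your post):

-- Assumes dbo.IISLog already exists in Weblogs_Backup with a matching structure
INSERT INTO Weblogs_Backup.dbo.IISLog
SELECT *
FROM WebLogs.dbo.IISLog
WHERE [DATE] = CAST(GETDATE() - 31 AS date)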


_________________________________________________________________________
SSC Guide to Posting and Best Practices
Post #1384487
Posted Wednesday, November 14, 2012 2:59 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, September 18, 2013 9:05 AM
Points: 8, Visits: 67
Jason-299789 (11/14/2012)
If logging isn't a priority on the archive DB, then set its recovery model to Simple.


Even with it set to Simple, though, when I try to run an INSERT INTO the log file just grows and grows whilst the transaction is in process and then clears down again when it is complete... I was wondering if this would still be the case if it was set to Bulk-Logged?

Jason-299789 (11/14/2012)
Also, does IISLog get deleted/renamed every day as part of the archive process?

A SELECT ... INTO will try to recreate the IISLog table every time, so you might be better off with an INSERT INTO.


Good point well made!!
Post #1384494
Posted Wednesday, November 14, 2012 3:13 AM
SSC Eights!


Group: General Forum Members
Last Login: Today @ 7:24 AM
Points: 884, Visits: 2,414
In Simple mode it will be minimally logged; as you say, the log will still grow but it empties on completion. Bulk-logged, as far as I'm aware (please correct me if I'm wrong), operates in a similar way to Full, except that the logging is minimal and that, to clear the log down on completion, it must be backed up.

So Simple is probably what you are looking for, and after the initial growth the transaction log shouldn't grow much unless data volumes increase drastically.
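
If the archive database isn't already in Simple, switching it is a one-liner (just a sketch; it assumes you don't need to preserve a log backup chain on Weblogs_Backup):

-- Switching to SIMPLE breaks the log backup chain for this database,
-- so only do it if point-in-time recovery of the archive DB isn't required
ALTER DATABASE Weblogs_Backup SET RECOVERY SIMPLE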


_________________________________________________________________________
SSC Guide to Posting and Best Practices
Post #1384500
Posted Wednesday, November 14, 2012 3:19 AM


SSCertifiable


Group: General Forum Members
Last Login: Yesterday @ 6:19 AM
Points: 5,216, Visits: 5,109
Using SELECT ... INTO in a Simple or Bulk-Logged recovery model uses minimal logging, so depending on how much data you're moving you shouldn't see that much of a transaction log jump.

The amount of logging for SELECT...INTO depends on the recovery model in effect for the database. Under the simple recovery model or bulk-logged recovery model, bulk operations are minimally logged. With minimal logging, using the SELECT...INTO statement can be more efficient than creating a table and then populating the table with an INSERT statement.
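
A quick way to see the effect for yourself on a test copy is to check log space used before and after the load, for example:

-- Reports log size and percent used for every database;
-- run it before and after the SELECT ... INTO and compare
DBCC SQLPERF(LOGSPACE)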


How much data are we talking about moving?




Want an answer fast? Try here
How to post data/code for the best help - Jeff Moden
Need a string splitter, try this - Jeff Moden
How to post performance problems - Gail Shaw
CrossTabs-Part1 & Part2 - Jeff Moden
SQL Server Backup, Integrity Check, and Index and Statistics Maintenance - Ola Hallengren
Managing Transaction Logs - Gail Shaw
Troubleshooting SQL Server: A Guide for the Accidental DBA - Jonathan Kehayias and Ted Krueger

Post #1384504
Posted Wednesday, November 14, 2012 3:39 AM
SSC Eights!


Group: General Forum Members
Last Login: Today @ 7:24 AM
Points: 884, Visits: 2,414
Anthony,

Can't you also get minimal logging from an INSERT INTO ... SELECT statement if using SQL 2008 R2 or later, under certain circumstances?


_________________________________________________________________________
SSC Guide to Posting and Best Practices
Post #1384509
Posted Wednesday, November 14, 2012 3:53 AM


SSCertifiable


Group: General Forum Members
Last Login: Yesterday @ 6:19 AM
Points: 5,216, Visits: 5,109
Yes, that is correct. You have to meet certain prerequisites first and ensure that you add in the necessary hints etc.

This link describes it in more detail:

http://msdn.microsoft.com/en-us/library/ms191244%28v=sql.105%29.aspx
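
As a rough sketch (assuming the archive table dbo.IISLog already exists in Weblogs_Backup, is empty or a heap, and the other prerequisites on that page are met), the hint goes on the target table:

-- TABLOCK on the target is one of the prerequisites for minimal logging of
-- INSERT ... SELECT; the database must also be in SIMPLE or BULK_LOGGED recovery
INSERT INTO Weblogs_Backup.dbo.IISLog WITH (TABLOCK)
SELECT *
FROM WebLogs.dbo.IISLog
WHERE [DATE] = CAST(GETDATE() - 31 AS date)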




Want an answer fast? Try here
How to post data/code for the best help - Jeff Moden
Need a string splitter, try this - Jeff Moden
How to post performance problems - Gail Shaw
CrossTabs-Part1 & Part2 - Jeff Moden
SQL Server Backup, Integrity Check, and Index and Statistics Maintenance - Ola Hallengren
Managing Transaction Logs - Gail Shaw
Troubleshooting SQL Server: A Guide for the Accidental DBA - Jonathan Kehayias and Ted Krueger

Post #1384518
Posted Wednesday, November 14, 2012 3:54 AM
SSCrazy


Group: General Forum Members
Last Login: Today @ 10:13 AM
Points: 2,857, Visits: 5,129

...if I change the Recovery Model to Bulk-Logged, will that prevent the log files from growing substantially during the process, or would I be better off scripting an export via an SSIS package?
...


Why do you expect that using SSIS would avoid the database log file growing?

You will be better off with SELECT INTO and minimal logging using Simple or Bulk-Logged mode (depending on the log policies around your place).




_____________________________________________
"The only true wisdom is in knowing you know nothing"
"O skol'ko nam otkrytiy chudnyh prevnosit microsofta duh!"
(So many miracle inventions provided by MS to us...)

How to post your question to get the best and quick help
Post #1384521
Posted Wednesday, November 14, 2012 3:55 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 12:52 PM
Points: 39,965, Visits: 36,321
trowsell (11/14/2012)
Jason-299789 (11/14/2012)
If logging isn't a priority on the archive DB, then set its recovery model to Simple.


Even with it set to Simple, though, when I try to run an INSERT INTO the log file just grows and grows whilst the transaction is in process and then clears down again when it is complete... I was wondering if this would still be the case if it was set to Bulk-Logged?


In bulk-logged the log will fill while the transaction is in process but will not be cleared afterwards until a log backup is run.

Hence you'll often have more log usage, not less.

Bulk-logged is for when you want minimal logging but also want to be able to restore the DB using log backups, so you're not limited to restoring to the last full backup if there's a failure.
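
In other words, under Bulk-Logged the space only becomes reusable once something like the following runs (the backup path here is purely illustrative):

-- Until a log backup like this completes, the log space used by the
-- minimally logged load cannot be reused (the path is a placeholder)
BACKUP LOG Weblogs_Backup TO DISK = N'X:\Backups\Weblogs_Backup_log.trn'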



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #1384523
Posted Wednesday, November 14, 2012 4:10 AM
SSC Eights!


Group: General Forum Members
Last Login: Today @ 7:24 AM
Points: 884, Visits: 2,414
anthony.green (11/14/2012)
Yes, that is correct. You have to meet certain prerequisites first and ensure that you add in the necessary hints etc.

This link describes it in more detail:

http://msdn.microsoft.com/en-us/library/ms191244%28v=sql.105%29.aspx


Thanks Anthony, especially for the link.


_________________________________________________________________________
SSC Guide to Posting and Best Practices
Post #1384529