'Select Into' logs

  • Hi all,

    I need to write a stored procedure that moves large amounts of old data on a nightly basis to another DB where it will be backed up and archived.

    The move is going to be just under 1 million rows per evening and was thinking of using the following:

    USE Weblogs_Backup;
    GO

    SELECT *
    INTO IISLog
    FROM WebLogs.dbo.IISLog
    WHERE [DATE] = CAST(DATEADD(DAY, -31, GETDATE()) AS date); -- GETDATE() - 31 carries a time component, so cast to date for an exact match

    However, I just need to double-check: if I change the recovery model to Bulk-Logged, will that prevent the log file from growing substantially during the process, or would I be better off scripting an export via an SSIS package?

    Cheers in advance.

    T


  • If logging isn't a priority on the archive DB, then set the recovery model to Simple.

    Also, does IISLog get deleted/renamed every day as part of the archive process?

    The SELECT INTO will try to recreate the IISLog table every time, so you might be better off with an INSERT INTO, along these lines:
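    As a rough sketch (table and database names carried over from the original post; the archive table is created once up front, then appended to nightly):

    -- One-off: create an empty copy of the table in the archive database.
    SELECT *
    INTO Weblogs_Backup.dbo.IISLog
    FROM WebLogs.dbo.IISLog
    WHERE 1 = 0;

    -- Nightly: append the day being archived.
    INSERT INTO Weblogs_Backup.dbo.IISLog
    SELECT *
    FROM WebLogs.dbo.IISLog
    WHERE [DATE] = CAST(DATEADD(DAY, -31, GETDATE()) AS date);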

    _________________________________________________________________________
    SSC Guide to Posting and Best Practices

  • Jason-299789 (11/14/2012)


    If logging isn't a priority on the archive DB, then set the recovery model to Simple.

    Even with it set to Simple, though, when I run an INSERT INTO the log file just grows and grows whilst the transaction is in progress, and then clears down again when it is complete... I was wondering if this would still be the case if it was set to Bulk-Logged?

    Jason-299789 (11/14/2012)


    Also, does IISLog get deleted/renamed every day as part of the archive process?

    The SELECT INTO will try to recreate the IISLog table every time, so you might be better off with an INSERT INTO.

    Good point well made!!


  • In Simple mode it will be minimally logged; as you say, the log will still grow, but it empties on completion. Bulk-Logged, as far as I'm aware (please correct me if I'm wrong), operates in a similar way to Full, except that the logging is minimal and, on completion, the log must be backed up before it clears down.

    So Simple is probably what you are looking for, and after the initial growth the transaction log shouldn't grow much unless data volumes increase drastically.
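    If you do switch, it's a one-liner; a minimal sketch, assuming the archive database is named Weblogs_Backup as in the original post:

    -- Set the archive database to the Simple recovery model.
    ALTER DATABASE Weblogs_Backup SET RECOVERY SIMPLE;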

    _________________________________________________________________________
    SSC Guide to Posting and Best Practices

  • Using SELECT INTO in a simple or bulk-logged recovery model uses minimal logging, so depending on how much data you're moving you shouldn't see much of a transaction log jump.

    The amount of logging for SELECT...INTO depends on the recovery model in effect for the database. Under the simple recovery model or bulk-logged recovery model, bulk operations are minimally logged. With minimal logging, using the SELECT...INTO statement can be more efficient than creating a table and then populating the table with an INSERT statement. For more information...

    How much data are we talking about moving?

  • Anthony,

    Can't you also get minimal logging from an INSERT INTO...SELECT statement if using SQL 2008 R2 or later, under certain circumstances?

    _________________________________________________________________________
    SSC Guide to Posting and Best Practices

  • Yes, that is correct; you have to meet certain pre-requisites first and ensure that you add in the necessary hints etc.

    This link describes it in more detail:

    http://msdn.microsoft.com/en-us/library/ms191244%28v=sql.105%29.aspx
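    Roughly, the pattern from that link looks like this (a sketch only; the pre-requisites described there still apply, and the names are carried over from the earlier posts):

    -- INSERT...SELECT can be minimally logged under simple/bulk-logged recovery
    -- when the target table is locked with the TABLOCK hint (plus the pre-reqs above).
    INSERT INTO Weblogs_Backup.dbo.IISLog WITH (TABLOCK)
    SELECT *
    FROM WebLogs.dbo.IISLog
    WHERE [DATE] = CAST(DATEADD(DAY, -31, GETDATE()) AS date);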

  • ...if I change the recovery model to Bulk-Logged, will that prevent the log file from growing substantially during the process, or would I be better off scripting an export via an SSIS package?

    ...

    Why do you expect that using SSIS will avoid the database log file growing?

    You will be better off with SELECT INTO and minimal logging using Simple or Bulk-Logged mode (depending on log policies around your place).

    _____________________________________________
    "The only true wisdom is in knowing you know nothing"
    "O skol'ko nam otkrytiy chudnyh prevnosit microsofta duh!":-D
    (So many miracle inventions provided by MS to us...)

    How to post your question to get the best and quick help

  • trowsell (11/14/2012)


    Jason-299789 (11/14/2012)


    If Logging isnt a priority on the Archive DB then set the recovery model to Simple.

    Even with it set to Simple, though, when I run an INSERT INTO the log file just grows and grows whilst the transaction is in progress, and then clears down again when it is complete... I was wondering if this would still be the case if it was set to Bulk-Logged?

    In bulk-logged the log will fill while the transaction is in progress, but will not be cleared afterwards until a log backup is run.

    Hence you'll often have more log usage, not less.

    Bulk-logged is for when you want minimal logging but also want to be able to restore the DB using log backups, so not just restore to the last full backup if there's a failure.
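    In other words, under bulk-logged the log space only becomes reusable after something like this runs (the backup path here is purely illustrative):

    -- A log backup is required to clear the log under bulk-logged (or full) recovery.
    BACKUP LOG Weblogs_Backup
    TO DISK = N'X:\Backups\Weblogs_Backup_log.trn';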

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • anthony.green (11/14/2012)


    Yes, that is correct; you have to meet certain pre-requisites first and ensure that you add in the necessary hints etc.

    This link describes it in more detail:

    http://msdn.microsoft.com/en-us/library/ms191244%28v=sql.105%29.aspx

    Thanks Anthony, especially for the link.

    _________________________________________________________________________
    SSC Guide to Posting and Best Practices

  • Thanks all...

    I've just done a count, and the actual number of rows transferring will be about 3.5-4 million rows a day, at about 4 GB.

    Why do you expect that using SSIS will avoid the database log file growing?

    I was thinking along the lines that when you do an export (i.e. right-click on the DB, go to Tasks, Export Data) it doesn't fill the log files, so when turned into an SSIS package it wouldn't either... or is that a wrong assumption?

    You will be better off with SELECT INTO and minimal logging using Simple or Bulk-Logged mode

    That was why I thought about doing a SELECT INTO in the first instance, although I have to say I'm now rethinking the entire process by which the IIS log files get imported into the DB in the first place; there might be a better way to manage the backups than having to transfer the data to a different database just to back it up.

    Any thoughts on how to back up one day's worth of data from a database daily, rather than the whole lot every day?


  • trowsell (11/14/2012)


    Thanks all...

    I've just done a count, and the actual number of rows transferring will be about 3.5-4 million rows a day, at about 4 GB.

    Why do you expect that using SSIS will avoid the database log file growing?

    I was thinking along the lines that when you do an export (i.e. right-click on the DB, go to Tasks, Export Data) it doesn't fill the log files, so when turned into an SSIS package it wouldn't either... or is that a wrong assumption?

    Exports (reading data) don't log. Imports (writing data) do; it doesn't matter how the data is written, it has to be logged.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass

