Push Those Logs Away!

  • Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/sjones/pushthoselogsaway.asp

  • Hi

    "duplexing" log backups (and fulls!) is common place for oracle dba's, but typically not to another server. Where possible, I always code it in with a simple custom SP with a xcopy over a share. The backup commands in SS2k are very limited compared to oracle, and simple extensions such as duplexing are key items that are sorely missed.

    Good article.

    Cheers

    Ck


    Chris Kempster
    www.chriskempster.com
    Author of "SQL Server Backup, Recovery & Troubleshooting"
    Author of "SQL Server 2k for the Oracle DBA"

  • I utilise database maintenance plans in SQL2K to manage all of this. Am I missing something? My backup strategy backs the transaction logs up to a network file server and automatically deletes old files, all from within the maintenance plan. Are you suggesting that this isn't the best way to handle this?

  • Steve,

    Not bad code! My only complaint (had to be one, right?) is that I think DTS is overkill. For small amounts of code I'd throw it into a job step. Don't know that one is better than the other; I just like the visibility of the code in the step(s).

    Andy

    http://www.sqlservercentral.com/columnists/awarren/

  • Thanks for highlighting this.

    Worthy of going into a best practices category! I suspect quite a few of us overlook the 24-hour exposure of our backups until the nightly network backup runs.

    Or until we have a hardware meltdown.

    Andrew


    What's the business problem you're trying to solve?

  • We have a dedicated file server for all full, differential, and tran log backups for the DBA group to use. Had something to do with me being a DBA pushing for it before I transitioned over to the server group!

    Typically we script the file move as a job step immediately following the completion of the backup job. In cases where we have to keep multiple copies, we either use a T-SQL job to run xp_cmdshell and a file copy, or we use an ActiveX script and the FileSystemObject to copy with a rename based on date, hour, or what have you. (A sketch of the date-stamped copy follows this post.)

    We have had to go back to that separate file server on occasion, and it's saved our necks every time.

    K. Brian Kelley

    http://www.truthsolutions.com/

    Author: Start to Finish Guide to SQL Server Performance Monitoring

    http://www.netimpress.com/shop/product.asp?ProductID=NI-SQL1

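    A rough sketch of the date-stamped copy Brian mentions, done here in T-SQL with xp_cmdshell rather than an ActiveX script; the file names and share are hypothetical:

        -- Copy the latest log backup to the file server with a date/hour suffix
        DECLARE @stamp varchar(20), @cmd varchar(500)

        -- e.g. '20031104_13' for 1 PM on November 4, 2003
        SET @stamp = CONVERT(varchar(8), GETDATE(), 112) + '_'
                   + CAST(DATEPART(hh, GETDATE()) AS varchar(2))

        SET @cmd = 'copy D:\MSSQL\BACKUP\MyDB_log.trn '
                 + '\\fileserver\dbabackups\MyDB_log_' + @stamp + '.trn'
        EXEC master.dbo.xp_cmdshell @cmd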

  • Thanks, and glad people liked this one.

    I back up locally to prevent any issues if I'm having network problems with another server; I'd hate to lose a backup because another server is down. That's the one downfall of backing up across the network in a maintenance plan. The nice thing about this approach is that we have two copies of all backups, so we can handle a local server explosion as well as data corruption. The likelihood of two servers going down at once is low, but the chance that your backup server goes down and then you hit issues a day or two later is greater. If you don't move your backup files off the local server ASAP, you might have issues. I'm not sure this is more reliable than network backups; it just works for me.

    I like DTS because I build my package completely from the local server: the local connection, server name, and backup locations are all loaded from the Dynamic Properties task (which I'll demo in an upcoming article). That way I can move the DTS package to any server and it just runs, with nothing hardcoded.

    Steve Jones

    sjones@sqlservercentral.com

    http://www.sqlservercentral.com/columnists/sjones

  • The only problem with this strategy is that you are relying on the backup to tape actually working before you copy the files to another server and then delete the originals.

    I don't know your OS backup strategy, but if the OS backup is an incremental, you may want to check the archive bit before you delete the original files, like so in your 'Remove Log Backups' script:

    ' Skip files whose archive bit (32) is still set, i.e. not yet backed up
    If f1.DateCreated < BaseDate And UCase(Right(f1.name, 3)) = "TRN" And Not (f1.attributes And 32) Then ...

    If this bit is set, then one of two things has happened: either a differential backup is being done (differentials don't clear the archive bit) or the file has not been backed up yet.

  • We use a Marathon Endurance server so that everything is mirrored, and we back up locally and later to tape. Does anyone else use this, and if so, any issues?

    Nigel Moore
    ======================

  • Actually, we do not rely on the tape. We keep two days' worth of full backups. Since the tape only runs at night, we copy the local disk T-log backup to a remote server immediately after the backup occurs (within 5 minutes). Then when the tape runs at night, there are two copies of the transaction log on two different servers. We do not back these up to tape, since by the time the tape runs, the T-logs have been superseded by the fulls. (A sketch of the copy step follows this post.)

    Steve Jones

    sjones@sqlservercentral.com

    http://www.sqlservercentral.com/columnists/sjones
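
    A minimal sketch of the copy step Steve describes, run as a job step right after the log backup completes; the paths are hypothetical:

        -- Push any new .TRN files to the remote server.
        -- /D copies only files newer than their counterparts at the destination.
        EXEC master.dbo.xp_cmdshell
            'xcopy D:\MSSQL\BACKUP\*.TRN \\remotesrv\logbackups\ /D /Y'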

  • An excellent idea; I use this for backups of all OLTP systems in my company. I have one idea: an add-on you may want to include in your package. I perform similar feats, but all in a stored procedure. One additional step I take is to write all the pertinent backup information (database, most recent parent full database backup, backup type, filesystem backup name) to a table. This gives you everything you need to write a script that will automate your database restores. (A sketch of such a table follows this post.)

    While it's definitely not necessary to go to this length, the time it takes you to perform the actions necessary to start or continue a restore just adds to your overall downtime. 'Reducing potential downtime' is a phrase that will make your boss smile.


    -Ken
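
    A minimal sketch of the bookkeeping table Ken describes; the table and column names are hypothetical, and the INSERT would run immediately after each backup statement in the stored procedure:

        -- One row per backup taken, so a restore script can find the right files
        CREATE TABLE dbo.BackupHistory (
            DatabaseName sysname      NOT NULL,
            BackupType   char(4)      NOT NULL,  -- FULL / DIFF / LOG
            ParentFull   varchar(255) NULL,      -- full backup this one depends on
            BackupFile   varchar(255) NOT NULL,
            BackupDate   datetime     NOT NULL DEFAULT GETDATE()
        )

        -- Example: record a log backup right after it completes
        INSERT dbo.BackupHistory (DatabaseName, BackupType, ParentFull, BackupFile)
        VALUES ('MyDB', 'LOG', 'MyDB_full_20031104.bak', 'MyDB_log_20031104_13.trn')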

  • Great article, and an easy step to miss if you write backups locally. We have always shipped our logs off to another server as the last step of the transaction log backups. I guess I'm wondering why you are not able to determine when your transaction log dumps have completed? Our backups run as a SQL Agent job; we just added a step after each database backup to copy the backup to another server. This way the backup and the copy are all part of a single SQL Agent job. (A sketch of adding such a step follows this post.)

    Gregory Larsen, DBA

    If you're looking for SQL Server examples, check out my website at http://www.geocities.com/sqlserverexamples

    Gregory A. Larsen, MVP
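
    Wiring the copy into the job as Greg describes might look like the following, assuming an existing Agent job; the job name, step name, and share are hypothetical:

        -- Append a CmdExec step to the backup job to push the file off-box
        EXEC msdb.dbo.sp_add_jobstep
            @job_name  = 'MyDB Log Backup',
            @step_name = 'Copy log backup to remote server',
            @subsystem = 'CmdExec',
            @command   = 'copy D:\MSSQL\BACKUP\MyDB_log.trn \\remotesrv\logbackups\ /Y'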

  • I could, but we have been using Maintenance Plans and I dislike tinkering with the jobs they set up.

    This will change, as we are looking at going to diffs all week, so we will have our own jobs then.

    Steve Jones

    sjones@sqlservercentral.com

    http://www.sqlservercentral.com/columnists/sjones

  • All good points. I'm a big T-SQL fan and therefore wrote something similar to ship logs to multiple other servers, all in T-SQL. In addition, you have the option of restoring each log onto a standby server for "warm" backup purposes. (The core restore statement is sketched after this post.)

    Why did I do this instead of using log shipping? My client didn't want to pay for more than Standard Edition. My implementation allows for multiple destinations and flexible full and log backup schedules. It's all very simple, really.

    I have the whole thing documented. Would anyone like a copy of the lot?
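
    The "warm" standby piece of what's described above hinges on RESTORE ... WITH STANDBY, which applies the shipped log while leaving the database readable between restores; the paths here are hypothetical:

        -- On the standby server: apply the shipped log, keeping the DB read-only
        RESTORE LOG MyDB
            FROM DISK = 'D:\ShippedLogs\MyDB_log.trn'
            WITH STANDBY = 'D:\MSSQL\BACKUP\MyDB_undo.dat'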

  • Nice article!

    In our company we take full backups at the weekend and diff backups on the remaining days of the week. We have developed an automated backup system using SPs. If I'm not wrong, your article is useful only in the case of large amounts of data (terabytes).

    Thanks for writing such good articles.

    madhusudannaidugundapaneni


    Madhu
