Adding new files to a log-shipped database (SQL Spackle)

  • Comments posted to this topic are about the item Adding new files to a log-shipped database (SQL Spackle)

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • All so much easier if you just spend an hour or 2 writing your own logshipping routines.

  • RichB (6/6/2013)


    All so much easier if you just spend an hour or 2 writing your own logshipping routines.

    Agreed - but this technique still makes it easy to add a new file to a database that has been log shipping for a while, without starting over from a full restore.
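
    For anyone who hasn't read the article yet, the core of the technique looks something like this. All database, file, and path names below are made up for illustration - adjust to your own layout:

    ```sql
    -- On the primary: add the new file; the ADD FILE operation is
    -- recorded in the transaction log
    ALTER DATABASE MyDB
        ADD FILE (NAME = MyDB_Data2, FILENAME = N'D:\Data\MyDB_Data2.ndf')
        TO FILEGROUP [PRIMARY];

    -- On the secondary: the next log restore will try to create that file
    -- at the primary's path, so relocate it with MOVE if the drive layout
    -- differs between the two servers
    RESTORE LOG MyDB
        FROM DISK = N'E:\LogShip\MyDB_backup.trn'
        WITH MOVE N'MyDB_Data2' TO N'E:\Data\MyDB_Data2.ndf',
             STANDBY = N'E:\LogShip\MyDB_standby.tuf';
    ```

    Once that one restore with MOVE succeeds, the regular restore job carries on as before.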

  • I wrote my own log shipping routines, right down to duplicating the monitor report. I thought it was slick and it all worked very well, but all I did was reinvent the wheel.

    Cheers

  • Mimicking the primary volumes on the secondary isn't always possible. Sometimes you have to work with what you are given. I log ship about 500 primary databases from 40 or so remote servers to a single local secondary server that I run tape backups off of. That secondary server has only one enormous data drive - E:. Most of the primary servers are default instances with data on the D: drive, and on top of that many of them have the same database and file names.

    I ended up doing some custom scripting to implement this, but I'm using the shrink-wrapped log shipping and monitor routines. I just automated the creation of the 500+ log shipping configurations.

  • Nice article Wayne. But more importantly, CONGRATS ON THE MCM CERTIFICATION!! :w00t:

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

  • TheSQLGuru (6/6/2013)


    Nice article Wayne. But more importantly, CONGRATS ON THE MCM CERTIFICATION!! :w00t:

    Thanks (for both) Kevin.


  • Yes, congrats on that. It is a very big deal indeed!

    Cheers

  • Wayne, good write up and super congrats on the MCM!

  • Thanks Andy


  • I've never done log shipping, but with mirroring all you have to do is leave the database in a restoring state and then start mirroring. Can you do the same with log shipping?

  • Yes, it's very similar. The difference is that with log shipping you need three agent jobs. On the primary there's one to run log backups. On the secondary there's one to copy the log backup from the primary, and another to restore the log backup (with NORECOVERY or STANDBY). The latter, IMO, is the big advantage of log shipping: you can leave the DB in standby mode and run queries against it. There's even a bit of a hack that will let you run backups of it.

    You can optionally set up a log shipping monitor too to alert you if the log shipping ever breaks or falls seriously behind.
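
    In other words, the restore job runs something like one of the following (paths and names are illustrative):

    ```sql
    -- STANDBY: the log is restored, then the database is left read-only
    -- between restores, so you can query it
    RESTORE LOG MyDB
        FROM DISK = N'E:\LogShip\MyDB_backup.trn'
        WITH STANDBY = N'E:\LogShip\MyDB_standby.tuf';

    -- NORECOVERY: the database stays in a restoring state with no read
    -- access, just like a mirror partner
    RESTORE LOG MyDB
        FROM DISK = N'E:\LogShip\MyDB_backup.trn'
        WITH NORECOVERY;
    ```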

  • Careful with those alerts! If you have a slower connection to your DR site, or if maintenance causes a larger than normal T-log backup file, you will get alerts - plenty of them if the check frequency is short. I set up the alert's schedule so it doesn't run during those times. To me, one of the best features of log shipping, besides the obvious, is how forgiving it is. There have been times I needed to take down either the source or destination, and I've never had issues with things coming back online. I know the way it works is why this is possible, but I appreciate the simplicity. We use log shipping and replication, and generally things run smoothly.

    Cheers

  • Couldn't agree more. By default I think the alert threshold is 45 minutes. I set it to 3 hours, and like you, I schedule the alert job not to run at times when I know it's going to fall behind more than that due to circumstances beyond my control.

    And yes log shipping is VERY forgiving.
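
    For anyone wanting to do the same: the restore threshold lives on the secondary, and can be changed with something like this (run on the secondary server; the database name is illustrative):

    ```sql
    -- Raise the out-of-sync alert threshold to 3 hours
    EXEC sp_change_log_shipping_secondary_database
        @secondary_database = N'MyDB',
        @restore_threshold  = 180;  -- minutes without a restore before alerting
    ```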

  • Another major plus: if someone makes a mistake and alters or deletes something, it may be possible to scrape out what I need from the DR site (assuming they say something fast enough). That totally beats restoring the DB and rolling the logs forward to a point in time on another server, since all that's required is stopping the restore job.

    Cheers
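
    A rough sketch of that rescue, assuming the default LSRestore job naming convention and an illustrative table:

    ```sql
    -- Disable the restore job so the standby copy is frozen at its
    -- current point in time
    EXEC msdb.dbo.sp_update_job
        @job_name = N'LSRestore_PrimaryServer_MyDB',
        @enabled  = 0;

    -- Pull the accidentally deleted rows out of the read-only standby copy
    SELECT * FROM MyDB.dbo.Orders WHERE OrderDate >= '20130601';

    -- Re-enable the job when done; log shipping catches back up on its own
    EXEC msdb.dbo.sp_update_job
        @job_name = N'LSRestore_PrimaryServer_MyDB',
        @enabled  = 1;
    ```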
