Custom Log Shipping

  • ckempste

    SSCoach

    Points: 17983

    Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/ckempster/customlogshipping.asp


    Chris Kempster
    www.chriskempster.com
    Author of "SQL Server Backup, Recovery & Troubleshooting"
    Author of "SQL Server 2k for the Oracle DBA"

  • ckempste

    SSCoach

    Points: 17983

    Hi all

Just a quick comment re the log shipping: we have been testing our large COM+ app against the log-shipped DB. After a restore of logs, the COM+ components seem to cache their connections to the instance; a shutdown of the affected components via Component Services resolves the issue, but is far from ideal. The senior APs are investigating the issue and how the connections are being pooled/cached.

    Just something to watch out for.

    Cheers

    Ck

    Chris Kempster

    http://www.chriskempster.com

    Author of "SQL Server 2k for the Oracle DBA"



  • Clark Cruz

    SSC Veteran

    Points: 225

At first blush, Chris has put together a well-thought-out solution for one option of implementing SQL log shipping.

    Excellent work Chris!

  • adamcrv

    Hall of Fame

    Points: 3056

    Hi,

Just wanted to say that the log shipping idea seems very sound in its implementation here. In the conclusion you say that having users still logged in while doing a recovery is bad. I'm not sure I would want Yukon to allow this, because (IMHO) why would you want people working on data that is about to change? I sort of think that kicking everyone out before a recovery is a good idea. I obviously can't see the good side to not kicking hem out... 🙂


    ------------------------------
    Life is far too important to be taken seriously

  • adamcrv

    Hall of Fame

    Points: 3056

    hem = them


    ------------------------------
    Life is far too important to be taken seriously

  • ckempste

    SSCoach

    Points: 17983

    Hi there

Well, it's a good point actually, and (I suppose) it really depends on your requirements for the standby DB.

In our case I wanted to avoid having three databases: the live OLTP database, the standby, and then a reporting database. If I can use the standby for all reporting functions (it's not warehoused with time-series data) and users don't mind the 10-minute cycle of data change from applying new logs, then it's win-win all round. Kicking users then becomes a problem, especially for those running long, complex queries. To get around this I only restore every 2 hours, and everyone is aware of the disconnection. If Yukon had an option to go either way, I'd see it as a "nice feature" more than anything, one that makes the standby more flexible and usable for my clients.
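For reference, the disconnect-then-restore cycle described here can be sketched in T-SQL. The database name and file paths below are illustrative placeholders, not the article's actual scripts:

```sql
-- Disconnect remaining reporting users, apply the next transaction log,
-- then return the database to read-only STANDBY use.
-- StandbyDB and the paths are placeholder names.
ALTER DATABASE StandbyDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

RESTORE LOG StandbyDB
    FROM DISK = N'D:\logship\StandbyDB_tlog.bak'
    WITH STANDBY = N'D:\logship\StandbyDB_undo.dat';

ALTER DATABASE StandbyDB SET MULTI_USER;
```

The STANDBY undo file is what keeps the database readable between restores, which is why open reporting sessions must be dropped before each log is applied.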

    Cheers

    Ck

    Chris Kempster

    http://www.chriskempster.com

    Author of "SQL Server 2k for the Oracle DBA"



  • adamcrv

    Hall of Fame

    Points: 3056

Oh yeah! I forgot about the long-running queries scenario! I suppose it would be a good thing for Yukon to have (if indeed it will). Once again, sounds good 🙂


    ------------------------------
    Life is far too important to be taken seriously

  • Chuck Ritenour

    SSC-Addicted

    Points: 420

I really think this is a good set of routines and stored procedures for a log shipping solution. I ran across it while looking for a way to keep 235 databases (53 GB) within 24 hours of sync at our offsite standby server. Replication wasn't an option, as I had to leave the online databases user-changeable (add and delete tables, add and drop databases; as DBA I have no control over that, and I'm not notified, so snapshots really weren't in our business model).

I worked through mods for Chris's script, and we emailed back and forth some, then decided it wasn't a viable option, as I have other restrictions that just make it impossible (only a T1 line, and we're only permitted to transfer 50 GB a month).

    If anyone out there has any suggestions as to how to do this, I'm open to them:

    critenour@imapdata.com

    Thanks for your help Chris.

     


    Thanks, and don't forget to Chuckle

  • Jason Steele

    SSC Enthusiast

    Points: 118

Chris, first let me thank you for this contribution; you have saved me literally days of work, and it's far better than I could have achieved on my own.

I have one small problem with it, though. When gzip is called to zip up the BAK files, it occasionally fails to delete the original file and therefore causes an error email to be sent.

I am using a job running under the SQL Server Agent, and its last step is to FTP and then MOVE the files. The MOVE command also returns an errorlevel (despite succeeding).

    I suspect that this issue is caused by attempting to delete a file (i.e. a backup) that is still open.

    Have you come across this problem and/or do you have any suggestions to get around it?
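One common workaround for this kind of transient lock is a delete-with-retry loop. A minimal shell sketch follows; the function name, attempt count, and delay are illustrative assumptions, not part of the article's scripts (which are Windows batch jobs, but the idea carries over directly):

```shell
# Retry deleting a file that may still be held open by the backup or
# gzip process. Attempt count and delay are illustrative defaults.
delete_with_retry() {
    file="$1"; attempts="${2:-5}"; delay="${3:-2}"
    i=1
    while [ "$i" -le "$attempts" ]; do
        rm -f "$file" 2>/dev/null
        if [ ! -e "$file" ]; then
            return 0            # file is gone: success
        fi
        sleep "$delay"          # give the writer time to release the lock
        i=$((i + 1))
    done
    echo "could not delete $file after $attempts attempts" >&2
    return 1
}
```

Only if all attempts fail does the function return nonzero, so the error email fires on a genuine stuck file rather than a momentary lock.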

    Many Thanks,

    Jason

  • Ravi Kosaraju

    Grasshopper

    Points: 21

    Chris

    Excellent work on this! We have been using this for a couple of our databases which run the MSDE flavor and it works great.

    Thanks

    Ravi

  • Marcel van Pinxteren

    Grasshopper

    Points: 15

We had problems with files being locked. After the backup and after the zip, just add a little wait (e.g. 1 minute).

  • Alex Tomic

    Grasshopper

    Points: 19

    This approach seems very well thought out. I would like to test it, but the link to scripts appears to be broken (404 not found).

    Can someone provide an updated link?

    Thanks!

  • Jason Steele

    SSC Enthusiast

    Points: 118

    Alex Tomic (4/20/2008)


    This approach seems very well thought out. I would like to test it, but the link to scripts appears to be broken (404 not found).

    Can someone provide an updated link?

    Thanks!

    Hi Alex,

    I've uploaded it to my SkyDrive at http://cid-e39114477b2c887f.skydrive.live.com/browse.aspx/Public.

I've been using it for over a year now (with some tweaks). It's pretty good, but there are quite a few hard-coded things that should be pulled out into a settings table, and there is no automatic retry if an FTP fails, which can happen surprisingly often if you are log shipping 30 databases every 15 minutes!
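The missing automatic retry could be sketched as a simple wrapper around the transfer command. This is a hedged illustration; the function name and limits are assumptions, and the real FTP invocation would be substituted for the command argument:

```shell
# Run a flaky command (e.g. an FTP transfer) up to a maximum number of
# attempts, succeeding as soon as any attempt succeeds.
# Usage: run_with_retry <max_attempts> <command> [args...]
run_with_retry() {
    max="$1"; shift
    n=1
    while [ "$n" -le "$max" ]; do
        if "$@"; then
            return 0            # transfer succeeded
        fi
        echo "attempt $n of $max failed: $*" >&2
        n=$((n + 1))
    done
    return 1                    # all attempts failed
}
```

Wrapping each per-database transfer this way means one dropped connection out of 30 does not immediately raise an alert.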

But it's the best one I have found at the right price 😉, so many thanks to Chris Kempster for sharing it.

    Jason

  • jando999

    SSC Rookie

    Points: 36

Hi,

The link to the zip file containing the code and scripts is broken.

    Any ideas where I can get the source for this?

    Cheers

    James

  • jando999

    SSC Rookie

    Points: 36

    jando999 (3/17/2009)


Hi,

The link to the zip file containing the code and scripts is broken.

    Any ideas where I can get the source for this?

    Cheers

    James

Ignore me; I must learn to read previous posts.
