Custom Log Shipping

  • Comments posted to this topic are about the item Custom Log Shipping

  • Personally, thus far, I find SQL 2005's GUI for Log Shipping pretty good and easy to use.

    For your script, the TARGET server/DB is always left in NO RECOVERY, so it's more like Mirroring than Log Shipping?

    Whereas with SQL 2005 Log Shipping, the TARGET server/DB is READ-ONLY (standby mode).

    SQLServerNewbieMCITP: Database Administrator SQL Server 2005
  • So this solution would not be a good solution for someone looking for a reporting server, since it A) does not terminate all connections to the database prior to applying the logs and B) applies the logs with the NO RECOVERY option instead of the READ ONLY (STANDBY) option? We are running SQL 2000 SE and we are looking for an alternative to upgrading to Enterprise Edition in order to implement Log Shipping, or something similar, so that we can have a reporting server.
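
    For reference, the distinction being asked about looks like this in T-SQL. This is a minimal sketch only; the file paths and the standby database name dbDataMirror are assumptions:

        -- WITH STANDBY keeps the database readable between log restores,
        -- which is what a reporting copy needs (the undo file path is assumed)
        RESTORE LOG dbDataMirror
        FROM DISK = 'D:\Ship\dbData_tlog_200805011200.trn'
        WITH STANDBY = 'D:\Ship\dbDataMirror_undo.dat'

        -- WITH NORECOVERY, as used in the article's script, leaves the database
        -- in a restoring (non-readable) state, ready for the next log
        RESTORE LOG dbDataMirror
        FROM DISK = 'D:\Ship\dbData_tlog_200805011215.trn'
        WITH NORECOVERY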

  • Andy Leffler (5/1/2008)


    So this solution would not be a good solution for someone looking for a reporting server, since it A) does not terminate all connections to the database prior to applying the logs and B) applies the logs with the NO RECOVERY option instead of the READ ONLY (STANDBY) option? We are running SQL 2000 SE and we are looking for an alternative to upgrading to Enterprise Edition in order to implement Log Shipping, or something similar, so that we can have a reporting server.

    Log shipping will NOT serve you as a reporting server. You may want to consider Replication for that 😉


    * Noel

  • Noeld, why specifically do you say that Log Shipping will not work to establish a reporting server?

  • Good solution. I am not much into log shipping; I have only tried it once or twice, so I can't comment much on it. But I like the custom solution.

  • Actually, we use log shipping to establish a reporting database, but there is an issue: we need one reporting database updated once a day and another updated hourly. The problem is that these databases have the same name, and with the SQL 2005 log shipping wizard they currently sit on two different servers. Is there a simple solution to put them on one server under different names, i.e. to restore them somewhere else with the MOVE clause?
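
    For what it's worth, a minimal sketch of that idea using RESTORE ... WITH MOVE follows. The backup path, logical file names, and database names are assumptions; use RESTORE FILELISTONLY to get the real logical names:

        -- discover the logical file names inside the backup (assumed below)
        RESTORE FILELISTONLY FROM DISK = 'D:\Backups\dbData.bak'

        -- restore the same backup twice on one server under two different names
        RESTORE DATABASE dbData_Daily
        FROM DISK = 'D:\Backups\dbData.bak'
        WITH STANDBY = 'D:\Data\dbData_Daily_undo.dat',
             MOVE 'dbData_Data' TO 'D:\Data\dbData_Daily.mdf',
             MOVE 'dbData_Log'  TO 'D:\Data\dbData_Daily_log.ldf'

        RESTORE DATABASE dbData_Hourly
        FROM DISK = 'D:\Backups\dbData.bak'
        WITH STANDBY = 'D:\Data\dbData_Hourly_undo.dat',
             MOVE 'dbData_Data' TO 'D:\Data\dbData_Hourly.mdf',
             MOVE 'dbData_Log'  TO 'D:\Data\dbData_Hourly_log.ldf'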

  • I have instituted this process to build a framework for migrating a production database from SQL 2000 to SQL 2005 without interrupting production, allowing us to test apps and connections 'offline' and then cut over at a planned point in time with a final transaction log restore.

    Conceptually, this process is a perfect fit. In practice, everything appears to be working except for the last step. The 'DataMirror' is stuck in restoring mode. Is this what should happen? The dbDataMirror database indicates the bak was verified and applied, then the subsequent trn files. The job fires off every set interval. Why would the destination db still be restoring?

    Any help would be appreciated.

    Thanks

    -Chris

  • cstrabley wrote: Conceptually, this process is a perfect fit. In practice, everything appears to be working except for the last step. The 'DataMirror' is stuck in restoring mode. Is this what should happen? The dbDataMirror database indicates the bak was verified and applied, then the subsequent trn files. The job fires off every set interval. Why would the destination db still be restoring?

    Chris,

    Did you ever figure out what was happening with that last step?

    I am about to implement this solution for a warm backup of our production database. Also, what interval is appropriate for running the big restore job on the backup server? Ideally, I wouldn't want to be more than 15 minutes behind the actual production box if we had to cut over to the warm backup... so is this realistic:

    1. Full backup at the top of each hour

    2. Trans log backup every 15 minutes

    3. Run the complete restore job (the Step 3 script) at around the :50 minute mark each hour

    Or am I misunderstanding this script? Can I set it to run every 16 minutes so that it just restores each new transaction log to the already "ready" database?

    A bit confused...any help would be appreciated.

    Cheers!

    Bruce

  • Hi Chris,

    The solution is supposed to apply the logs in NO RECOVERY mode so that more logs can be continuously applied automatically. The last step of actually bringing the database online is a manual one and has been purposefully kept so. My thought was that in normal circumstances the logs would be continuously applied. And in case of a disaster, the DBA would verify the entire environment and then decide to bring the standby DB online after stopping the DataMirror jobs.

    In order to bring your database online after all logs have been applied, you can consider adding the statement below just above the line that says "lblEndOfLoop:", which is the label for checking errors. That would be on line number 155.

    "RESTORE DATABASE @strDatabase WITH RECOVERY"

    So, immediately after looping through all logs and applying them, if you could add this line in the script, your database would get back online.
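
    To make the sequence concrete, here is a minimal sketch of the intended flow. This is not the article's script itself; the log file names and the standby database name dbDataMirror are assumptions:

        -- each shipped log is applied WITH NORECOVERY so that further logs can follow
        RESTORE LOG dbDataMirror
        FROM DISK = 'D:\Ship\dbData_tlog_200805010100.trn'
        WITH NORECOVERY

        RESTORE LOG dbDataMirror
        FROM DISK = 'D:\Ship\dbData_tlog_200805010115.trn'
        WITH NORECOVERY

        -- the optional line suggested above, added just before the lblEndOfLoop: label,
        -- completes recovery and brings the standby database online (read-write)
        RESTORE DATABASE dbDataMirror WITH RECOVERY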

    One note though. Next time your DataMirror jobs run, the first database file to be applied needs to be a full backup and not a transaction log since your database would be online. So you may need to verify if this works with your setup.

    Also, if you are planning to connect and use the standby server actively and would have apps connecting through the day, you may run into a scenario where there are connections open when the next run of the DataMirror jobs start. You would then need a way to kill open connections to the DB. For that you may want to look at the following script from the Script library:

    http://www.sqlservercentral.com/scripts/Maintenance+and+Management/30024/
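
    A minimal sketch of the same idea, assuming the standby database is named dbDataMirror and the batch is run from another database such as master, would be:

        -- kill any open connections to the standby DB before the restore job starts
        DECLARE @spid INT, @sql VARCHAR(20)

        DECLARE curSpids CURSOR LOCAL FAST_FORWARD FOR
            SELECT spid
            FROM master.dbo.sysprocesses
            WHERE dbid = DB_ID('dbDataMirror')
              AND spid <> @@SPID

        OPEN curSpids
        FETCH NEXT FROM curSpids INTO @spid
        WHILE @@FETCH_STATUS = 0
        BEGIN
            SET @sql = 'KILL ' + CAST(@spid AS VARCHAR(10))
            EXEC (@sql)
            FETCH NEXT FROM curSpids INTO @spid
        END
        CLOSE curSpids
        DEALLOCATE curSpids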

    Thanks,

    Girish

  • I would like to add a couple of additions/changes:

    1. You don't need to use xp_cmdshell to find whether a new backup is available. You can query the msdb database to find all backups that have taken place (see the sketch after this list).

    2. If you need to move the backups to another server: if it's across a WAN you can do so by FTP; if it's in the same domain, you can take the backup directly onto the target server.
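
    As a minimal sketch of point 1, the most recent backups can be read from the msdb history tables; the database name dbData below is an assumption:

        SELECT TOP 10
            bs.database_name,
            bs.type,                       -- 'D' = full backup, 'L' = transaction log
            bs.backup_finish_date,
            bmf.physical_device_name
        FROM msdb.dbo.backupset AS bs
        JOIN msdb.dbo.backupmediafamily AS bmf
            ON bs.media_set_id = bmf.media_set_id
        WHERE bs.database_name = 'dbData'  -- source database name is an assumption
        ORDER BY bs.backup_finish_date DESC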

    I had done this in my early days, when I was more a developer and a fresh DBA; I had written a VB program using a timer control.

    I will post the complete steps for the points above in a short while.

    Thanks,

    Girish

  • Hi Girish,

    Checking msdb is okay if the target database is on the same server as the source. If the target DB is on a remote server, it would probably need a linked server to the source server's msdb. Hence the idea of checking the filename pattern using xp_cmdshell on a directory accessible from the target server, regardless of how the files land in that directory, whether by FTP or by taking the backup directly into that directory.
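
    For reference, a minimal sketch of that xp_cmdshell check (the share path and the file name pattern are assumptions) would be:

        -- list candidate log backups in a share reachable from the target server
        CREATE TABLE #dirOutput (strFileName VARCHAR(500) NULL)

        INSERT INTO #dirOutput (strFileName)
            EXEC master.dbo.xp_cmdshell 'dir /b \\SourceServer\Backups\dbData*.trn'

        SELECT strFileName
        FROM #dirOutput
        WHERE strFileName LIKE 'dbData%.trn'   -- filters out blank and "File Not Found" rows
        ORDER BY strFileName                   -- timestamped names sort into apply order

        DROP TABLE #dirOutput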

    I like your idea of FTP'ing the backups to a remote target server. Please post the code for everyone's benefit.

    Thanks

  • Hi Girish,

    I want to implement the customized solution to sync my test database with the production database. The only thing I am worried about is whether, after implementing the customized solution, we can use the test database for all transactions. This is not possible with the standard log shipping solution, because after the restore the database is in read-only mode and no transactions other than reads can be executed.

    Regards,

    Ibrahim
