2nd Live Copy

  • Hi All.

    I have a set of databases on a server in a facility in another city. We do our development here. I want to have a copy of the data locally. The databases are mirrored at the live site. They have daily backups and 30-minute transaction log backups. Unfortunately, because we are charged for bandwidth at the location, I can't pull daily backups from the site to me.

    I was thinking that I could have a server here at the office, load a backup, copy all the transaction logs each day, and then restore those logs to my copy of the data. That way I would have current data to use for testing, off-site backup, etc.

    Does this sound workable? Does anyone have any scripts that would help automate the restore process? Any other ideas?

    Thanks!

    Mike

  • You want the data in a writable state, presumably?

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • Yes, the data in a writable state is a must. That is why mirroring and log shipping won't work.

    Mike

  • mike 57299 (12/10/2012)


    Yes, the data in a writable state is a must. That is why mirroring and log shipping won't work.

    Mike

    It wasn't stated, hence my question. Now that we have that cleared up, let's go back to your original post.

    mike 57299 (12/8/2012)


    They have daily backups and 30-minute transaction log backups. Unfortunately, because we are charged for bandwidth at the location, I can't pull daily backups from the site to me.

    The problem here is that if you want a fresh writable copy each day, you'll need to pull one full backup per week plus all the logs. At 30-minute intervals, after a couple of days you'll have a truckload of log backups.

    mike 57299 (12/8/2012)


    I was thinking that I could have a server here at the office. Load a backup. Copy all the transaction logs each day. Then restore the transaction logs to my copy of the data and I would have current data to use for testing, off-site backup, etc.

    Why don't you put your dev server at the live site? Then restores won't cross the network link. It will also prove how robust the application is.

    mike 57299 (12/8/2012)


    Does this sound workable? Does anyone have any scripts that would help automate the restore process? Any other ideas?

    Thanks!

    Mike

    Apart from the above, the only other option is replication. Bear in mind that a re-initialisation of the subscription will result in a full backup making its way across the network.

    Best to have the dev server local to the live server.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • mike 57299 (12/8/2012)


    They have daily backups and 30-minute transaction log backups. Unfortunately, because we are charged for bandwidth at the location, I can't pull daily backups from the site to me.

    The problem here is that if you want a fresh writable copy each day, you'll need to pull one full backup per week plus all the logs. At 30-minute intervals, after a couple of days you'll have a truckload of log backups.

    That is why I want to have a script to restore it each night. Also, why would I need a new backup each week if I am always keeping the database updated?

    mike 57299 (12/8/2012)


    I was thinking that I could have a server here at the office. Load a backup. Copy all the transaction logs each day. Then restore the transaction logs to my copy of the data and I would have current data to use for testing, off-site backup, etc.

    Why don't you put your dev server at the live site? Then restores won't cross the network link. It will also prove how robust the application is.

    Just won't work for us.

  • mike 57299 (12/10/2012)


    That is why I want to have a script to restore it each night.

    Your script will need to restore and recover the following each night:

    • full backup
    • any log backups taken up to the point you wish to recover to

    Log backups every 30 minutes, that's 48 log backups per day!

    After just 5 days you'd have over 200 log backups to restore.

    mike 57299 (12/10/2012)


    Also - why would I need a new backup each week if I am always keeping the database updated?

    A new backup would be needed for the reason above. Otherwise, after two weeks you'd have more log backups than you could shake a stick at, and what if one became lost or corrupt?

    Not sure what you mean by "always keeping the database updated", please explain.

    mike 57299 (12/8/2012)


    Just won't work for us.

    Sounds like replication could be your best route then; whether your DBA will be keen to implement it, though, is what you'll need to find out.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • Hi -

    Log backups run every 30 minutes, every day between 8am and 6pm. Once a t-log has been restored, it will be deleted.

    I want to write, or get help writing, a script that restores the logs from a specific directory. If I do that every night and then delete the files:

    1) the database will be current

    2) the # of files will stay constant

    3) I will have a "live" database to work with.

    Mike

  • mike 57299 (12/11/2012)


    Hi -

    Log backups run every 30 minutes, every day between 8am and 6pm. Once a t-log has been restored, it will be deleted.

    I want to write, or get help writing, a script that restores the logs from a specific directory. If I do that every night and then delete the files:

    1) the database will be current

    2) the # of files will stay constant

    3) I will have a "live" database to work with.

    Mike

    Once you recover the database, you can't then restore any more log files. You'd have to restore the full backup again, and then any log backups to take you up to your chosen recovery point. That's still a whole load of log files, provided none gets lost or corrupted.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • Take a COPY_ONLY backup whenever you require one; it won't break the log backup chain, and you can do your testing on your test server.

  • Here is an alternative. Restore the full backup WITH NORECOVERY, restore all the log files except the last WITH NORECOVERY, then restore the last log file WITH STANDBY. When that restore is done, the database will be in read-only mode.

    Script out all the database objects and create a second, empty database. Take a backup of this empty database; you can use it to create a fresh empty database to populate each time you add data to the read-only database using RESTORE WITH NORECOVERY or WITH STANDBY.

    You can then use SSIS to transfer the data from the read-only database to your empty database.
