
2nd Live Copy
Posted Saturday, December 8, 2012 11:17 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Wednesday, August 20, 2014 9:32 PM
Points: 137, Visits: 360
Hi All.

I have a set of databases on a server in a facility in another city. We do our development here. I want to have a copy of the data locally. The databases are mirrored at the live site. They have daily backups and 30 min transaction log backups. Unfortunately, because we are charged for bandwidth at the location, I can't pull daily backups from the site to me.

I was thinking that I could have a server here at the office. Load a backup. Copy all the transaction logs each day. Then restore the transaction logs to my copy of the data and I would have current data to use for testing, off-site backup, etc.
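A minimal T-SQL sketch of that nightly sequence (the database name and file paths are placeholders, not from the thread):

```sql
-- One-time: seed the local copy, leaving it restoring so logs can be applied.
RESTORE DATABASE LocalCopy
FROM DISK = N'D:\Staging\Full.bak'
WITH NORECOVERY,
     MOVE N'DataFile' TO N'D:\Data\LocalCopy.mdf',
     MOVE N'LogFile'  TO N'D:\Data\LocalCopy.ldf';

-- Each night: apply the copied log backups, oldest first.
RESTORE LOG LocalCopy FROM DISK = N'D:\Staging\Log_0800.trn' WITH NORECOVERY;
RESTORE LOG LocalCopy FROM DISK = N'D:\Staging\Log_0830.trn' WITH NORECOVERY;
-- ...remaining log backups in sequence...

-- Only once no further logs will ever be applied:
RESTORE DATABASE LocalCopy WITH RECOVERY;  -- makes the database writable and closes the chain
```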

Does this sound workable? Does anyone have any scripts that would help automate the restore process? Any other ideas?

Thanks!
Mike
Post #1394347
Posted Monday, December 10, 2012 3:39 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 1:26 PM
Points: 6,300, Visits: 13,580
You want the data in a writable state, presumably?

-----------------------------------------------------------------------------------------------------------

"Ya can't make an omelette without breaking just a few eggs"
Post #1394522
Posted Monday, December 10, 2012 9:30 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Wednesday, August 20, 2014 9:32 PM
Points: 137, Visits: 360
Yes, the data in a writable state is a must. That is why mirroring and log shipping won't work.

Mike
Post #1394677
Posted Monday, December 10, 2012 11:17 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 1:26 PM
Points: 6,300, Visits: 13,580
mike 57299 (12/10/2012)
Yes, the data in a writable state is a must. That is why mirroring and log shipping won't work.

Mike

It wasn't stated, hence my question. Now that we have that cleared up, let's go back to your original post.


mike 57299 (12/8/2012)
They have daily backups and 30 min transaction log backups. Unfortunately, because we are charged for bandwidth at the location, I can't pull daily backups from the site to me.

Problem here is if you want a fresh writable copy each day you'll need to pull one full backup per week and all the logs. At 30-minute intervals, after a couple of days you'll have a truckload of log backups.


mike 57299 (12/8/2012)
I was thinking that I could have a server here at the office. Load a backup. Copy all the transaction logs each day. Then restore the transaction logs to my copy of the data and I would have current data to use for testing, off-site backup, etc.

Why don't you have your dev server at the live site? Then restores won't cross the network link. It will also prove how robust the application is.



mike 57299 (12/8/2012)
Does this sound workable? Does anyone have any scripts that would help automate the restore process? Any other ideas?

Thanks!
Mike

Apart from the above, the only other option is replication. Bear in mind that a re-initialisation of the subscription will result in a full backup making its way across the network.

Best to have the dev server local to the live server.


-----------------------------------------------------------------------------------------------------------

"Ya can't make an omelette without breaking just a few eggs"
Post #1394719
Posted Monday, December 10, 2012 4:59 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Wednesday, August 20, 2014 9:32 PM
Points: 137, Visits: 360
mike 57299 (12/8/2012)
They have daily backups and 30 min transaction log backups. Unfortunately, because we are charged for bandwidth at the location, I can't pull daily backups from the site to me.
Problem here is if you want a fresh writable copy each day you'll need to pull one full backup per week and all the logs. At 30-minute intervals, after a couple of days you'll have a truckload of log backups.

That is why I want to have a script to restore it each night. Also - why would I need a new backup each week if I am always keeping the database updated?

mike 57299 (12/8/2012)
I was thinking that I could have a server here at the office. Load a backup. Copy all the transaction logs each day. Then restore the transaction logs to my copy of the data and I would have current data to use for testing, off-site backup, etc.
Why don't you have your dev server at the live site? Then restores won't cross the network link. It will also prove how robust the application is.

Just won't work for us.

Post #1394821
Posted Monday, December 10, 2012 11:12 PM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 1:26 PM
Points: 6,300, Visits: 13,580
mike 57299 (12/10/2012)
That is why I want to have a script to restore it each night.

Your script will need to restore and recover the following each night:


  • full backup

  • any log backups taken up to the point you wish to recover to



Log backups every 30 minutes: that's 48 log backups per day!
After 5 days alone you have over 200 log backups to restore.


mike 57299 (12/10/2012)
Also - why would I need a new backup each week if I am always keeping the database updated?

A new backup would be needed for the reason above. Otherwise, after two weeks you'd have more log backups than you could shake a stick at, and what if one became lost or corrupt?

Not sure what you mean by "always keeping the database updated"; please explain.


mike 57299 (12/10/2012)
Just won't work for us.

Sounds like replication could be your best route then; whether your DBA will be keen to implement it, though, is what you'll need to find out.


-----------------------------------------------------------------------------------------------------------

"Ya can't make an omelette without breaking just a few eggs"
Post #1394897
Posted Tuesday, December 11, 2012 8:05 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Wednesday, August 20, 2014 9:32 PM
Points: 137, Visits: 360
Hi -

Log backups run every 30 minutes, every day between 8am and 6pm. Once a t-log has been restored, it will be deleted.

I want to write or get help writing a script that restores the logs from a specific directory. If I do that every night, then delete the files, then:

1) the database will be current
2) the # of files will stay constant
3) I will have a "live" database to work with.
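A hedged sketch of such a script (it assumes xp_cmdshell is enabled and that the log backup file names sort chronologically; the directory, database name, and file pattern are placeholders):

```sql
-- Restore every log backup found in a staging folder, in name order.
DECLARE @dir NVARCHAR(260) = N'D:\Staging\Logs\';
DECLARE @f   NVARCHAR(260), @sql NVARCHAR(MAX);

DECLARE @files TABLE (name NVARCHAR(260));
INSERT INTO @files (name)
EXEC master.sys.xp_cmdshell 'dir /b /o:n D:\Staging\Logs\*.trn';

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM @files WHERE name LIKE N'%.trn' ORDER BY name;
OPEN c;
FETCH NEXT FROM c INTO @f;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'RESTORE LOG LocalCopy FROM DISK = N''' + @dir + @f
             + N''' WITH NORECOVERY;';
    EXEC (@sql);
    FETCH NEXT FROM c INTO @f;
END
CLOSE c;
DEALLOCATE c;
```

Deleting the files after a successful run (and logging each restore) would be the other half of the job.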

Mike
Post #1395142
Posted Tuesday, December 11, 2012 8:13 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 1:26 PM
Points: 6,300, Visits: 13,580
mike 57299 (12/11/2012)
Hi -

Log backups run every 30 minutes, every day between 8am and 6pm. Once a t-log has been restored, it will be deleted.

I want to write or get help writing a script that restores the logs from a specific directory. If I do that every night, then delete the files, then:

1) the database will be current
2) the # of files will stay constant
3) I will have a "live" database to work with.

Mike

Once you recover the database, you can't restore any more log files to it. You'd have to restore the full backup again, and then any log backups to take you up to your chosen recovery point. That's still a whole load of log files, provided none gets lost or corrupted.
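The constraint described above, in T-SQL terms (names and paths are placeholders):

```sql
RESTORE LOG LocalCopy FROM DISK = N'D:\Staging\Log_1730.trn' WITH NORECOVERY;  -- chain still open
RESTORE DATABASE LocalCopy WITH RECOVERY;  -- database now writable
-- Any further RESTORE LOG against LocalCopy will now fail:
-- RECOVERY ran the undo phase and closed the roll-forward chain.
```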


-----------------------------------------------------------------------------------------------------------

"Ya can't make an omelette without breaking just a few eggs"
Post #1395148
Posted Tuesday, December 11, 2012 1:20 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Saturday, September 14, 2013 8:05 AM
Points: 101, Visits: 108
Take a COPY_ONLY backup whenever you require one; it won't break the log backup chain, and you can do your testing on your test server.
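A hedged sketch of that suggestion (database name and path are placeholders):

```sql
-- A COPY_ONLY full backup does not reset the differential base or
-- otherwise disturb the existing backup/log chain at the live site.
BACKUP DATABASE LiveDB
TO DISK = N'D:\Staging\LiveDB_copyonly.bak'
WITH COPY_ONLY, COMPRESSION, CHECKSUM;
```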


Post #1395300
Posted Tuesday, December 11, 2012 5:44 PM


SSC-Insane


Group: General Forum Members
Last Login: Today @ 3:06 PM
Points: 23,276, Visits: 31,996
Here is an alternative. Restore the full backup WITH NORECOVERY, restore all the log files except the last WITH NORECOVERY, then restore the last log file WITH STANDBY. When that restore is done, the database will be in read-only mode.

Script out all the database objects and create a second database that is empty. Take a backup of this empty database. You can use it to create a new empty database to populate each time you add data to the read-only database using RESTORE WITH NORECOVERY or RESTORE WITH STANDBY.

You can then use SSIS to transfer the data from the read-only database to your empty database.
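A minimal sketch of the STANDBY part of this approach (file names are placeholders):

```sql
RESTORE DATABASE LocalCopy FROM DISK = N'D:\Staging\Full.bak' WITH NORECOVERY;
RESTORE LOG LocalCopy FROM DISK = N'D:\Staging\Log_01.trn' WITH NORECOVERY;
-- ...intermediate log backups...
RESTORE LOG LocalCopy FROM DISK = N'D:\Staging\Log_20.trn'
WITH STANDBY = N'D:\Staging\LocalCopy_undo.dat';
-- LocalCopy is now readable (read-only) AND can still accept further log restores;
-- the undo file lets SQL Server reverse the standby state before the next restore.
```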



Lynn Pettis

For better assistance in answering your questions, click here
For tips to get better help with Performance Problems, click here
For Running Totals and its variations, click here or when working with partitioned tables
For more about Tally Tables, click here
For more about Cross Tabs and Pivots, click here and here
Managing Transaction Logs

SQL Musings from the Desert Fountain Valley SQL (My Mirror Blog)
Post #1395370