Archive Backups
Posted Tuesday, June 15, 2010 10:20 PM


Comments posted to this topic are about the item Archive Backups






Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #937920
Posted Wednesday, June 16, 2010 12:47 AM
Storage prices are low. That said, if space really is a problem and you're not in the financial business with its regulatory requirements, then perhaps you can back up every now and then, verify that the backup is good, discard the previous one, and in addition keep one backup per quarter. That would feel a lot better than the solution they use, as I understand it...
Post #937950
Posted Wednesday, June 16, 2010 3:15 AM
Outside of the SQL Server world, deduplication of data in backups is big business. It is how large organisations are saving millions on storage costs.

But this is hard to do with SQL Server backups unless the backup is done by a backup agent, and most SQL Server folk do not like backup agents because historically they have been unreliable when it comes to recovery.

The inability to deduplicate the data in SQL backups is maybe a major reason behind the proposal to do just a quarterly backup. In theory you could take daily differential backups against a quarterly full backup, but managing that amount of backup looks horrendous to me.

We need Microsoft and other backup vendors to come up with a solution that allows deduplication of SQL backup data; otherwise the storage costs for non-deduplicated SQL backups will start to become a reason for avoiding SQL Server.

An idea that is used by another DBMS is called 'offline backup merge'. This takes a full backup and a differential backup (or a chain of log backups) as input, and produces a new full backup showing the state as at the end of the merged backup. This can be done on any server, with the merge process connecting to the original DB instance at the end of the work to register the new backup. This process is not the full data deduplication that is possible with block-level backups, but it can help simplify database recovery for slowly changing data.
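The quarterly full plus daily differential pattern mentioned above can be sketched in T-SQL. This is only an illustration of the idea; the database name ArchiveDB and the backup share path are hypothetical:

```sql
-- Quarterly (run once per quarter): full backup, the base of the chain.
BACKUP DATABASE ArchiveDB
TO DISK = '\\backupsrv\sql\ArchiveDB_FULL.bak'
WITH INIT, CHECKSUM;

-- Daily: differential backup. Each differential contains all changes
-- since the last full, so each day's file supersedes the previous day's.
BACKUP DATABASE ArchiveDB
TO DISK = '\\backupsrv\sql\ArchiveDB_DIFF.bak'
WITH DIFFERENTIAL, INIT, CHECKSUM;

-- Recovery needs only two files: the most recent full, then the single
-- latest differential.
RESTORE DATABASE ArchiveDB
FROM DISK = '\\backupsrv\sql\ArchiveDB_FULL.bak' WITH NORECOVERY;
RESTORE DATABASE ArchiveDB
FROM DISK = '\\backupsrv\sql\ArchiveDB_DIFF.bak' WITH RECOVERY;
```

Note that because a differential always goes back to the last full, the restore chain stays two files long no matter how many days into the quarter you are.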


Original author: SQL Server FineBuild 1-click install and best practice configuration of SQL Server 2014, 2012, 2008 R2, 2008 and 2005. 28 July 2014: now over 30,000 downloads.
Disclaimer: All information provided is a personal opinion that may not match reality.
Concept: "Pizza Apartheid" - the discrimination that separates those who earn enough in one day to buy a pizza if they want one, from those who can not.
Post #938007
Posted Wednesday, June 16, 2010 6:21 AM
Steve,

Presumably the data is important; otherwise it wouldn't be saved. The key part of your editorial for me is that the images don't change often. But they do change. How would you know if anything changed between quarterly backups, or even monthly backups? If the data is important enough to retain, and it changes, it is important to make the investment to back it up daily, or weekly at the outside.
Post #938097
Posted Wednesday, June 16, 2010 7:42 AM
I didn't see any discussion of how frequently data is being added to the archive store; that's an absolutely critical point, especially if a gap opens up. Say you take quarterly backups of the archive but keep backups of the main store for only 8 weeks: a problem just before the quarterly backup (which is not unlikely to be delayed, since after all it's "read only") and you completely lose at least 4 weeks of data from the first part of that quarter.

Further, I disagree about a quarterly full and daily differentials being a nightmare to manage; you're talking about less than a hundred backup files. If we can manage millions of rows in hundreds of tables, then surely a hundred files in a single directory path isn't that difficult if you take the time to set up management, validation, and restore test protocols in the first place.
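To set up that kind of validation, the server's own backup history is a good starting point. A hypothetical inventory query against msdb's backup history tables, listing the current quarter's backups for one database (ArchiveDB is an assumed name) so the recorded set can be reconciled against what is actually on disk or tape:

```sql
SELECT bs.database_name,
       bs.backup_finish_date,
       CASE bs.type WHEN 'D' THEN 'Full'
                    WHEN 'I' THEN 'Differential'
                    WHEN 'L' THEN 'Log' END AS backup_type,
       bmf.physical_device_name
FROM   msdb.dbo.backupset         AS bs
JOIN   msdb.dbo.backupmediafamily AS bmf
       ON bmf.media_set_id = bs.media_set_id
WHERE  bs.database_name = 'ArchiveDB'
  -- start of the current quarter
  AND  bs.backup_finish_date >= DATEADD(quarter, DATEDIFF(quarter, 0, GETDATE()), 0)
ORDER BY bs.backup_finish_date;
```

Any file named in `physical_device_name` that can't be found (or restored) is a hole in the set, found before a crisis rather than during one.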
Post #938180
Posted Wednesday, June 16, 2010 8:09 AM


It's not difficult to manage the files, but in a crisis it can be an issue. If you don't have the data on hand on the same backup tape, that's a problem. In DR, things break. It's not a question of having the file(s); it's being able to put your hands on them when it matters.







Follow me on Twitter: @way0utwest
Post #938205
Posted Wednesday, June 16, 2010 8:28 AM
I think the root problem is testing your DR solution. If you rely on tapes for DR then they should be tested (restored) on a regular basis as part of regular DR testing. If tapes are purely for archival (i.e. legal reasons) then maybe there's less motivation to test them regularly.
Post #938237
Posted Wednesday, June 16, 2010 9:05 AM


There are lots of ways to deal with this. There are tons of cheap tools that, when used together, can handle this quite well. I can get a terabyte (or better) external drive cheap, use XXCOPY to handle adds, drops, and changes, then take the drive off-site and stick it in a bank vault.

Then you could have a backup server. Use that to store your backups of pictures, documents, read-only (or mostly) stuff, and the like. Then use a service like Barracuda or Carbonite to take care of your off-site copies. For slowly changing stuff, these kinds of services seem ideal.

For SQL you can do command-line backups using OSQL. Compress the backup with 7-Zip (it's free). Move the compressed file to your repository server and let the off-site service do its work while you sleep. Wrap the whole thing up in a command (batch) file and let the Windows Task Scheduler run it for you.

As for the restore, well, you grab one of the files and do a practice restore. You are auditing and confirming your backups now, right? No? Stop reading this and go fix that problem now. Your minimum should be to:
(1) Stop activity.
(2) Do your backup.
(3) Restore to a second machine.
(4) Use your Red Gate tools to compare the SQL objects and the data.
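Steps (2) and (3) above can be sketched in T-SQL. This assumes the backup was taken WITH CHECKSUM and that a second instance is available for the practice restore; the database name, file names, and paths are all hypothetical:

```sql
-- Quick sanity check: re-read the backup and validate its page checksums.
-- This catches media damage, but is NOT a substitute for a real restore.
RESTORE VERIFYONLY
FROM DISK = 'D:\Backups\MyDb.bak'
WITH CHECKSUM;

-- On the second machine: practice restore under a different name and paths.
RESTORE DATABASE MyDb_Verify
FROM DISK = 'D:\Backups\MyDb.bak'
WITH MOVE 'MyDb'     TO 'D:\Verify\MyDb.mdf',
     MOVE 'MyDb_log' TO 'D:\Verify\MyDb_log.ldf',
     RECOVERY;

-- Check the restored copy for corruption before comparing objects and data.
DBCC CHECKDB (MyDb_Verify) WITH NO_INFOMSGS;
```

Only after the restored copy comes up clean is step (4), the object and data comparison, worth running.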


ATB

Charles Kincaid

Post #938309
Posted Wednesday, June 16, 2010 10:47 AM


80-90% under 2 GB? No one has made a comment on that statement?

I assume that 80-90% isn't JUST SQL Server databases. If you include Access and all the other end-user databases out there, then yes, I agree. Access can't handle anything of any size, so that's probably the cause.

I just have a hard time believing that 80-90% of the SQL Server databases out there are under 2 GB. Hell, I would guess that more than 80-90% are OVER 10 GB. That's just my opinion, though.

I handle about 50 databases between 2 servers and I would say that 80-90% are above 20 GB.



Live to Throw
Throw to Live
Will Summers
Post #938406
Posted Thursday, June 17, 2010 8:48 AM


If the data truly is archive data, then my immediate assumption would be that the data is on a different SLA than the rest of the database(s). If the archive is on a different tape or is accessible via some other means, then that may be acceptable. It may be fully acceptable to wait 24 hours for it to be restored (24 hours beyond the other databases).


I would make sure there were multiple copies of the backup to restore from, however.




Jason AKA CirqueDeSQLeil
I have given a name to my pain...
MCM SQL Server


SQL RNNR

Posting Performance Based Questions - Gail Shaw
Posting Data Etiquette - Jeff Moden
Hidden RBAR - Jeff Moden
VLFs and the Tran Log - Kimberly Tripp
Post #938970