
Quickly Copy Data
Posted Thursday, June 6, 2013 7:56 AM
Grasshopper

The expression I grew up with is, "Never underestimate the bandwidth of a station wagon full of tapes, speeding down the highway."
Post #1460707
Posted Thursday, June 6, 2013 8:18 AM
SSC Veteran

Looks like we need to introduce deduplication to the DBMS world. In the virtual world, VMware and Hyper-V keep only one copy of the same block in memory, and when doing backups for that matter. Granted, we have transaction log and differential backups, but deduplication for database backups should be built into the backup subsystem. So if only 1% of the database blocks have changed, the backup command should back up only those blocks. During restores, ALL of the blocks should be retrieved without DBA intervention.

Backup and restore is an area where SQL Server isn't even where DB2 was in V3, back in the early 1990s.
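
For what it's worth, differential backups already get part of the way there, since they copy only the extents changed since the last full backup. A rough sketch of issuing one from a script follows; the database name, file path, and connection string are made up, and it assumes pyodbc plus a SQL Server ODBC driver are installed.

import pyodbc

# Hypothetical connection string; BACKUP must run outside a transaction, so use autocommit.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=MYSERVER;DATABASE=master;Trusted_Connection=yes",
    autocommit=True,
)
cur = conn.cursor()

# Differential backup: copies only the extents changed since the last full backup.
cur.execute(
    "BACKUP DATABASE MyBigDB "
    "TO DISK = N'D:\\Backups\\MyBigDB_diff.bak' "
    "WITH DIFFERENTIAL, COMPRESSION, INIT"
)

# BACKUP emits progress messages; drain any remaining result sets before closing.
while cur.nextset():
    pass

conn.close()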
Post #1460718
Posted Thursday, June 6, 2013 8:25 AM
Forum Newbie

I copy SQL Server backups over the Internet. I have two remote servers connected using a (paid-for) LogMeIn Hamachi2 peer-to-peer VPN. I then transfer the files over the VPN using rsync (a Linux application not used much in the Windows world, but available as part of the Cygwin install). No need to compress the backups - just let rsync (client/server) work out which blocks have changed; it will only transmit the differences.

It's all command-line stuff, so create a script and launch it from another step in your SQL backup job, following the actual backup.
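
Something along these lines would do it; the VPN address, user, and paths below are made up, and it assumes rsync (e.g. from Cygwin) is on the PATH and reachable over SSH or an rsync daemon across the VPN.

import subprocess

# Hypothetical paths and VPN address; adjust to your environment.
LOCAL_BACKUP_DIR = "/cygdrive/d/Backups/"
REMOTE_TARGET = "backupuser@25.10.20.30:/backups/sqlserver/"

# -a preserves attributes, -v is verbose, --partial keeps interrupted transfers,
# --inplace updates the existing remote file rather than building a temp copy,
# and rsync's delta transfer sends only the changed blocks over the link.
result = subprocess.run(
    ["rsync", "-av", "--partial", "--inplace", LOCAL_BACKUP_DIR, REMOTE_TARGET],
    capture_output=True,
    text=True,
)
print(result.stdout)
result.check_returncode()  # fail the job step if the transfer failed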
Post #1460719
Posted Thursday, June 6, 2013 8:26 AM


SSC-Dedicated

lptech (6/6/2013)
Looks like we need to introduce deduplication to the DBMS world. In the virtual world, VMware and Hyper-V keep only one copy of the same block in memory, and when doing backups for that matter. Granted, we have transaction log and differential backups, but deduplication for database backups should be built into the backup subsystem. So if only 1% of the database blocks have changed, the backup command should back up only those blocks. During restores, ALL of the blocks should be retrieved without DBA intervention.

Backup and restore is an area where SQL Server isn't even where DB2 was in V3, back in the early 1990s.


I would agree. Some of the de-dupe products don't work well with SQL Server over time, since those original blocks have to be maintained and reassembled with the changed ones.

However, I'd think the backup process could be streamlined and at the same time be smart enough to produce a set of "changed blocks" from a base backup, excluding indexes. If we throw those out, because they get reorganized anyway, then I think most of the data rarely changes.

The thing is, we'd need to write those blocks out the same way each time instead of just streaming out pages. After index maintenance moves data around, including in the clustered indexes, the blocks are not necessarily in the same spot each day when they are streamed out of the database.

Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1460721
Posted Thursday, June 6, 2013 9:31 AM
Forum Newbie

1) Compression tools - the free tools do such a great job, why bother paying for something? 7-Zip works great, and I usually get a SQL backup down to 5-10% of the original size (see the sketch below). If you want to pay for proprietary tools, have at it - but consider whether you are actually getting a better tool. I'm not aware of any magic tool that gives you significantly better compression just because you pay for it (in fact, I very unscientifically seemed to get better results with 7-Zip than with WinZip).

2) I was also going to suggest rsync. There is a product called Syncrify Backup that incorporates rsync, packaged up with a backup client and a backup web server, very reasonably priced. Works great. It will also compress the network traffic, so you don't need to compress beforehand (and it probably works better uncompressed, since the SQL backup is likely to have fewer changed blocks of data that way - it depends on the specific workloads for your database, of course). It allows for versioning on the backup server as well, so you can restore a backup from days or weeks ago (and it only uses space for the changed blocks, so you don't need a full copy of the database for each version).
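
A quick sketch of scripting point 1; the paths are made up and it assumes the 7-Zip command-line tool is installed at its default location.

import subprocess
from pathlib import Path

# Hypothetical backup file; the archive goes next to it with a .7z extension.
backup = Path(r"D:\Backups\MyBigDB_full.bak")
archive = backup.with_suffix(".7z")

# "a" adds to an archive; -mx=9 requests maximum compression.
subprocess.run(
    [r"C:\Program Files\7-Zip\7z.exe", "a", "-mx=9", str(archive), str(backup)],
    check=True,
)

# Report the before/after sizes to see the compression ratio.
print(f"{backup.name}: {backup.stat().st_size:,} bytes -> {archive.stat().st_size:,} bytes")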
Post #1460759
Posted Thursday, June 6, 2013 9:50 AM
Grasshopper

Being a resource-constrained (money-wise) part-time DB admin, I use free tools to transfer my backups.

I compress with gzip using the --rsyncable flag, then transfer with rsync. For those wanting to use rsync while storing compressed data, make sure the compression is rsync-friendly. Plenty of information on this is out there.
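
A rough sketch of that two-step flow; the paths and host are placeholders, and it assumes a gzip build that supports --rsyncable (as noted above) plus rsync on the PATH.

import subprocess

# Hypothetical paths; adjust to your environment.
BACKUP_FILE = "/cygdrive/d/Backups/MyBigDB_full.bak"
REMOTE_TARGET = "backupuser@backuphost:/backups/sqlserver/"

# --rsyncable resets the compressor periodically so a small change in the input
# only changes a small region of the .gz output, keeping rsync deltas small.
# --keep leaves the original .bak in place; --force overwrites an old .gz.
subprocess.run(["gzip", "--rsyncable", "--keep", "--force", BACKUP_FILE], check=True)

# Send only the changed blocks of the compressed file to the remote copy.
subprocess.run(
    ["rsync", "-av", "--partial", "--inplace", BACKUP_FILE + ".gz", REMOTE_TARGET],
    check=True,
)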
Post #1460775
Posted Thursday, June 6, 2013 9:59 AM
Hall of Fame

To copy a backup of a large database over a WAN, I recommend the following:
1. Make a compressed backup of the database to multiple output files. Example: make twenty 5 GB files instead of one 100 GB file. The compression and multiple-output-file options are available with the BACKUP command.
2. Set up multiple streams of file copies so that you are copying several files at a time, say 5 in parallel (see the sketch after this list). This helps make sure the WAN pipe stays full.
3. If possible, make sure that the source and target machines for the file copy are running the Server Message Block (SMB) 2.0 or higher protocol (Windows Server 2008 and above, Windows Vista, Windows 7, Windows 8). There were big enhancements to this protocol that dramatically improve performance on high-latency links (WAN connections).
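
A minimal sketch of step 2, fanning the copy out over several files at once; the source directory, destination share, and stream count are placeholders.

import shutil
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

# Hypothetical locations; the destination would be a UNC path over the WAN.
SOURCE_DIR = Path(r"D:\Backups\MyBigDB")
DEST_DIR = Path(r"\\remote-server\Backups\MyBigDB")
PARALLEL_STREAMS = 5  # number of files copied at the same time

def copy_one(src: Path) -> str:
    # Copy a single backup file to the remote share.
    shutil.copyfile(src, DEST_DIR / src.name)
    return src.name

backup_files = sorted(SOURCE_DIR.glob("*.bak"))
with ThreadPoolExecutor(max_workers=PARALLEL_STREAMS) as pool:
    for name in pool.map(copy_one, backup_files):
        print(f"copied {name}")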

Post #1460780
Posted Thursday, June 6, 2013 10:08 AM
SSC-Addicted

It comes down to doing analysis. Too often we get into the rut of doing things exactly the same way every time.

When faced with a problem that seems overwhelming, throw out all assumptions. Everything. Then ANALYZE the problem. Determine the facts and constraints, and make sure they are real and not imagined. Then start thinking about how to fix it. Take things one step at a time.

A good example came up today. Someone had a complex spreadsheet with data elements used to make a decision. Instead of helping them write one formula to do everything they wanted, I wrote a formula to do a portion of it. Then another, and another, and another. Once those were done and tested, I explained that they could combine them all into one formula (good luck!) or simply write one more that based the answer on the ones we had written. All I did was break the work down into manageable pieces.

What seemed impossible (to them) became obvious once I showed them the methodology I use. I didn't do anything special, and nothing they could not have done themselves. Their difficulty was that they were focused entirely on the big picture of columns A through Z and 7,000 rows. When they explained each piece to me one at a time, it was easy to do.


Dave
Post #1460787