Backup Data Deduplication

  • Hopefully someone out there is using some sort of deduplication in their SQL Server backup strategy. I am having a lot of trouble finding articles on deduplicating SQL Server backups on the web. All the vendors say they do it, but they provide little information on strategy or on what works best to ensure their systems can eliminate redundant blocks. If anyone has a success story, I would greatly appreciate it. Even a tip for what to stay away from would be helpful. We have over 150TB of SQL Server databases online, and this technology sounds like a promising way to significantly reduce backup storage costs and shrink backup windows.

    Some questions I have are:

    Is block level or bit level deduping best?

    Is inline or offline best for SQL Backups?

    We use LiteSpeed for SQL Server, and while the backups are much faster, will we have issues with deduplication, or should we go back to native SQL Server backups? :)

    Thanks for your help....

  • What do you mean by reducing duplication?

    LiteSpeed (and Red Gate/HyperBac) perform compression, which involves token and pattern replacement, just like ZIP files.

    LiteSpeed and the other vendors provide utilities to decompress their backup files back to the native format.
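    Compression of this kind can be illustrated with a small sketch. The example below (plain Python using the standard zlib library; the sample data is made up to stand in for a repetitive SQL Server data page) shows how pattern-replacement compression shrinks redundant input and round-trips it back to the original bytes:

```python
import zlib

# Hypothetical backup data: highly repetitive content, standing in
# for a SQL Server data page full of similar rows.
page = b"customer_row_padding_" * 500

# Compress (token/pattern replacement, as in ZIP/DEFLATE) ...
compressed = zlib.compress(page)

# ... and decompress back to the exact original bytes.
restored = zlib.decompress(compressed)

assert restored == page
print(len(page), len(compressed))  # repetitive input shrinks dramatically
```

    Note this is lossless: the vendor utilities mentioned above perform the equivalent round trip when restoring a compressed backup to native format.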

  • Yes, I understand LiteSpeed, as I have been using it for the past 5 years. Deduplication is a backup strategy offered by EMC, NetApp, and Data Domain for removing redundant blocks of data from your tape library.

    If you are backing up to disk and you are not the one responsible for spinning those backups off to tape, it's likely your backup admin will know whether deduplication is being run on your dump files.
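    As a rough illustration of block-level deduplication, here is a minimal sketch (hypothetical Python: fixed 4 KB blocks and SHA-256 fingerprints; real dedup appliances use far more sophisticated, often variable-length, chunking). Two nightly "backups" that share most of their blocks end up stored as one shared block pool plus a per-backup recipe:

```python
import hashlib

BLOCK_SIZE = 4096  # hypothetical fixed block size

def dedupe(data: bytes):
    """Split data into fixed-size blocks; store each unique block once."""
    store = {}   # digest -> block contents (unique blocks only)
    recipe = []  # ordered digests needed to rebuild the stream
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # redundant blocks stored once
        recipe.append(digest)
    return store, recipe

def rehydrate(store, recipe):
    """Rebuild the original stream from the block pool and recipe."""
    return b"".join(store[d] for d in recipe)

# Two nightly "backups" that share three of their four blocks.
night1 = b"A" * BLOCK_SIZE * 3 + b"B" * BLOCK_SIZE
night2 = b"A" * BLOCK_SIZE * 3 + b"C" * BLOCK_SIZE

store, recipes = {}, []
for backup in (night1, night2):
    s, r = dedupe(backup)
    store.update(s)
    recipes.append(r)

# 8 logical blocks across both backups, only 3 unique blocks stored.
print(len(recipes[0]) + len(recipes[1]), len(store))
assert rehydrate(store, recipes[0]) == night1
```

    This is the core idea behind the "block level" option in the original question; "bit level" (or variable-length) dedup refines where the block boundaries fall so that small insertions do not shift every subsequent block's fingerprint.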

