SQL Server 3rd party Backup tool

  • Hi,
    We have a 20 TB database in 2 data files with a lot of unused space, and we need to shrink it, but that's a tedious and long process. We also don't have full backups because of the size; we're relying on storage snapshot backups for now.

    Does anyone know how we can get a full database backup using any available tool?
    Also, when restoring, can we restore the 2 files into, say, 10 files of 2 TB each so the database files are easier to maintain?
    Any other scenarios would be greatly helpful.

    Regards,

  • I would never shrink data files (https://littlekendra.com/2016/11/08/shrinking-sql-server-data-files-best-practices-and-why-it-sucks/). Log files, however, are a different story.

    I use the output of the following to shrink the log files
    (just make sure you have autogrowth enabled on the log files, or you might halt database operations):

    -- Generates one DBCC SHRINKFILE command per user-database log file;
    -- review the generated commands before running any of them.
    select 'exec ( ''use [' + cast(db_name(database_id) as nvarchar(254)) + ']; dbcc shrinkfile(' + cast(file_id as nvarchar(10)) + ',1)'') -- ' + cast(physical_name as nvarchar(max))
    from sys.master_files
    where database_id > 4        -- skip the system databases
    and type_desc = 'LOG'        -- log files only
    order by 1

    Native or third-party backup tool, it will not work if you do not have enough space to hold the backup.

    As for the restore, you can partially restore a database, provided you have taken filegroup backups along with transaction log backups.
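
    For illustration, here's a minimal sketch of a filegroup backup followed by a piecemeal (partial) restore; the database, filegroup, and path names are placeholders, not anything from your environment:

    -- Back up the primary filegroup plus one secondary filegroup, then the log
    -- (all names and paths below are hypothetical).
    BACKUP DATABASE BigDB FILEGROUP = 'PRIMARY', FILEGROUP = 'FG_Archive'
        TO DISK = 'X:\Backups\BigDB_FG.bak';
    BACKUP LOG BigDB TO DISK = 'X:\Backups\BigDB_Log.trn';

    -- Piecemeal restore: bring PRIMARY online first...
    RESTORE DATABASE BigDB FILEGROUP = 'PRIMARY'
        FROM DISK = 'X:\Backups\BigDB_FG.bak'
        WITH PARTIAL, NORECOVERY;
    -- ...then the remaining filegroup(s) and the log.
    RESTORE DATABASE BigDB FILEGROUP = 'FG_Archive'
        FROM DISK = 'X:\Backups\BigDB_FG.bak'
        WITH NORECOVERY;
    RESTORE LOG BigDB FROM DISK = 'X:\Backups\BigDB_Log.trn' WITH RECOVERY;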

  • Regardless of the tool used, your full backup is going to take a fair amount of time.  Keep in mind, the backup only backs up the *DATA*, not the files, so your backup file(s) will be smaller than the database itself.

    As for restoring to multiple files, no, you can only restore the database back to the same file layout it had when it was backed up.  So in your case, you'll have to restore to two 10 TB files.  Those files can be restored to a different location than they were in originally (for example, if file 1 is on drive F:\ and file 2 is on drive G:\, you can restore them to completely different drive letters), but you can't split them up during the restore.
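
    As a rough example (the logical file names and paths are made up purely for illustration), a restore that keeps the same two files but relocates them looks like this:

    -- Same two data files, new locations; only the paths change, not the layout.
    RESTORE DATABASE BigDB
        FROM DISK = 'X:\Backups\BigDB_Full.bak'
        WITH MOVE 'BigDB_Data1' TO 'H:\Data\BigDB_Data1.mdf',
             MOVE 'BigDB_Data2' TO 'I:\Data\BigDB_Data2.ndf',
             MOVE 'BigDB_Log'   TO 'L:\Log\BigDB_Log.ldf',
             RECOVERY;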

    Any other solutions are going to involve a fair bit of work and time.  And regardless, they're not going to appreciably reduce the time it takes to run a full backup, and they may even make it worse.

  • ALIF-662928 - Thursday, February 1, 2018 12:42 PM

    Hi,
    We have a 20 TB database in 2 data files with a lot of unused space, and we need to shrink it, but that's a tedious and long process. We also don't have full backups because of the size; we're relying on storage snapshot backups for now.

    Does anyone know how we can get a full database backup using any available tool?
    Also, when restoring, can we restore the 2 files into, say, 10 files of 2 TB each so the database files are easier to maintain?
    Any other scenarios would be greatly helpful.

    Regards,

    The likely reason you have so much free space in your files is the index rebuilds that you do.  Of course, it could also be due to deletes.  In either case, shrinking your database is going to cause you great pain and agony.

    If you're working with 20 TB of disk space and, say, 10 TB of data, then you need to do things a little bit differently starting now.  And, no... it's not going to be quick and you won't find it easy.

    First, you need to identify what your largest tables are, what they're used for, and how they're used.

    Normally, the largest tables in a database are "WORM" tables (Write Once, Read Many), typically some type of audit table.  These tables are temporal in nature, and older time periods (say, months) are never updated.  What you need to do with those is partition them into a separate database, with one filegroup per time period and one file per filegroup.  Once they're partitioned and properly compacted to contain virtually no free space, set the non-current filegroups (everything except the current month and any future-month partitions you made) to READ ONLY and back them up.  Done correctly, you will never have to defragment them or back them up ever again, regardless of whether you use Partitioned Views or Partitioned Tables.
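
    For example (the database and filegroup names are hypothetical), freezing a closed-out month and backing up just that filegroup once might look like this:

    -- Make the completed month read-only, then back up that one filegroup.
    ALTER DATABASE ArchiveDB MODIFY FILEGROUP FG_2017_12 READ_ONLY;
    BACKUP DATABASE ArchiveDB FILEGROUP = 'FG_2017_12'
        TO DISK = 'X:\Backups\ArchiveDB_FG_2017_12.bak';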

    Once successfully partitioned, drop the original table(s) from the original database and replace them with synonyms if DRI isn't required (and it should not be required if they're audit tables).
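
    A quick sketch of that synonym swap (the table and database names are placeholders):

    -- Once the data lives in the archive database, the old name can simply
    -- point at its new home; existing queries keep resolving through the synonym.
    DROP TABLE dbo.AuditLog;
    CREATE SYNONYM dbo.AuditLog FOR ArchiveDB.dbo.AuditLog;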

    There's more, but I don't know enough about the tables or their usage in your database.  You need to identify more information about the largest tables in your database before we can actually help.  And when I say "help", I also mean making these files a whole lot more bulletproof in the future.

    We'll talk about recovering the unused free space after we've made it much easier to do such a thing properly.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • ALIF-662928 - Thursday, February 1, 2018 12:42 PM

    ...a 20 TB database in 2 files with a lot of unused space...

    SQL Server backups do not include the unallocated (empty) space in the data files, so the backup size shouldn't be a problem at all. You can also use backup compression. If the native SQL Server compression is not sufficient, you can try Redgate's SQL Backup, which has a higher compression ratio.
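
    For what it's worth, a compressed native full backup is just a couple of extra options on the BACKUP command (the path here is a placeholder):

    -- Unallocated space is skipped and the pages are compressed on the way out,
    -- so the backup file is usually far smaller than the data files.
    BACKUP DATABASE BigDB
        TO DISK = 'X:\Backups\BigDB_Full.bak'
        WITH COMPRESSION, CHECKSUM, STATS = 5;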

  • Evgeny Garaev - Wednesday, February 7, 2018 1:04 PM

    ALIF-662928 - Thursday, February 1, 2018 12:42 PM

    ...a 20 TB database in 2 files with a lot of unused space...

    SQL Server backups do not include the unallocated (empty) space in the data files, so the backup size shouldn't be a problem at all. You can also use backup compression. If the native SQL Server compression is not sufficient, you can try Redgate's SQL Backup, which has a higher compression ratio.

    True enough but... the restore recreates that empty space. 😉

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Does anyone know how we can get a full database backup using any available tool?
    >> I think it's still possible to get a full DB backup. It depends on (1) how much time you have for the backup, (2) how fast your backup device and network are, and (3) how much you're willing to pay :-)  You know, these tools are not cheap, and I guess you'd need a special one. :-)

    Also, when restoring, can we restore the 2 files into, say, 10 files of 2 TB each so the database files are easier to maintain?
    >> No way. The file count and sizes after a restore are exactly the same as the originals.

    Basically, I suggest you redesign the files/filegroups (or split the database into multiple databases) so that you can take full advantage of file/filegroup backups and piecemeal restores. If you use any third-party backup tool, you need to be very careful with its proprietary compression format. If your SQL Server version is too old to support backup compression, that's a different story.
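
    As a rough sketch of what redesigning the filegroups could look like (all names, paths, and sizes here are made up), you can add a new filegroup with several smaller files and place new or archived data there, which can then be backed up on its own:

    -- Add a filegroup with two smaller files so data can be spread out and
    -- backed up per filegroup.
    ALTER DATABASE BigDB ADD FILEGROUP FG_History;
    ALTER DATABASE BigDB ADD FILE
        (NAME = 'BigDB_History1', FILENAME = 'H:\Data\BigDB_History1.ndf', SIZE = 100GB),
        (NAME = 'BigDB_History2', FILENAME = 'I:\Data\BigDB_History2.ndf', SIZE = 100GB)
        TO FILEGROUP FG_History;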

    If you are in the Chicago area and have a budget, please feel free to contact me.

    GASQL.com - Focus on Database and Cloud
