Fast Database Clone

  • Hi,
    What is the fastest way to clone a 50 GB database every day (SQL Server 2014), without any interruption (detach/attach)? This database is intended as a read-only DB for marketing. (PowerShell, C# SMO, ...?)
    Thanks and 🙂 Regards
    Nicole

  • info 58414 - Saturday, September 8, 2018 7:17 AM

    Hi,
    What is the fastest way to clone a 50 GB database every day (SQL Server 2014), without any interruption (detach/attach)? This database is intended as a read-only DB for marketing. (PowerShell, C# SMO, ...?)
    Thanks and 🙂 Regards
    Nicole

    A SAN snapshot, if both your marketing database and your source database are on the same SAN and the SAN has the capability (which may require the purchase of some optional software). There will be no interruption to the source database and only a couple of seconds' "outage" on the marketing database.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff is dead on accurate. That is the fastest and safest way I know to get the job done.

    However, I do know a way that is just as safe, if not as fast. Redgate makes a tool called SQL Clone. You have to create an image, which takes about as long as a backup, but then you can create a clone of the image (actually lots of clones of the image) in about 15 seconds or so, regardless of the size of the database. This won't be as good as what Jeff suggested, but it does offer you another possibility (plus a lot of other uses that these can be put to).

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Grant Fritchey - Monday, September 10, 2018 5:44 AM

    Jeff is dead on accurate. That is the fastest and safest way I know to get the job done.

    However, I do know a way that is just as safe, if not as fast. Redgate makes a tool called SQL Clone. You have to create an image, which takes about as long as a backup, but then you can create a clone of the image (actually lots of clones of the image) in about 15 seconds or so, regardless of the size of the database. This won't be as good as what Jeff suggested, but it does offer you another possibility (plus a lot of other uses that these can be put to).

    Ya know... I keep forgetting about that tool.  I'm going to have to read up on it and give it a try one of these days.

    --Jeff Moden



  • Simplest and cheapest: backup and restore.
    Complex but cheap: replication, though it's likely to be an ongoing source of headaches.
    If you have the capability, availability groups work well.

    SAN snapshots are fantastic, though the software and capability can be pricey. They can also be a pain, because users need to be disconnected to replace a snapshot with a new one. If you do this once a day, that's great. If you're trying to do this ad hoc or at random times, it sometimes becomes a pain.

    As Grant mentioned, Redgate has data virtualization software in SQL Clone. It does make quick copies, but the image process is needed first. Again, replacing the database a user sees is a slight interruption to work, but it's fast, like SAN snapshots: usually a few seconds, though users need to be kicked off.

    Disclosure: I work for Redgate.
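
    For reference, SQL Clone is scriptable from PowerShell. The sketch below shows the image-then-clone workflow described above; every server URL, machine name, instance name, database name, and path is a placeholder assumption, and the cmdlet names and parameters should be checked against Redgate's current SQL Clone documentation before use:

    ```powershell
    # Sketch only: all names/URLs below are assumptions, not from the thread.
    # Connect to the (hypothetical) SQL Clone server.
    Connect-SqlClone -ServerUrl 'http://sqlclone.example.com:14145'

    # Create an image from the source database (takes about as long as a backup).
    $instance    = Get-SqlCloneSqlServerInstance -MachineName 'SQLPROD01' -InstanceName 'SQL2014'
    $destination = Get-SqlCloneImageLocation -Path '\\fileshare\sqlclone-images'
    $imageOp = New-SqlCloneImage -Name 'MarketingDB-nightly' `
                                 -SqlServerInstance $instance `
                                 -DatabaseName 'SourceDB' `
                                 -Destination $destination
    Wait-SqlCloneOperation -Operation $imageOp

    # Creating a clone from the image takes seconds, regardless of database size.
    $image   = Get-SqlCloneImage -Name 'MarketingDB-nightly'
    $cloneOp = $image | New-SqlClone -Name 'MarketingDB' -Location $instance
    Wait-SqlCloneOperation -Operation $cloneOp
    ```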

  • We have a number of databases that get copied to a reporting server every night, and we just use backup and restore. 50 GB isn't very large by today's standards, so restores of databases around that size take less than 60 seconds across the network to our reporting server.

    If you want to minimise the outage you can restore to a temporary database then drop the live one and rename the newly restored one to replace it.
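
    That restore-then-rename swap can be sketched in T-SQL. The database names, logical file names, and paths below are placeholders, not anything from this thread, so adjust them to your environment:

    ```sql
    -- Sketch only: names and paths are assumptions.
    -- Restore the nightly backup into a staging database alongside the live copy.
    RESTORE DATABASE MarketingDB_Staging
    FROM DISK = N'\\backupshare\SourceDB_nightly.bak'
    WITH MOVE N'SourceDB_Data' TO N'D:\Data\MarketingDB_Staging.mdf',
         MOVE N'SourceDB_Log'  TO N'L:\Log\MarketingDB_Staging.ldf',
         REPLACE, STATS = 10;

    -- Swap: kick off connected users, drop the old copy, rename the new one.
    -- The reporting outage is only the few seconds this block takes.
    ALTER DATABASE MarketingDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE MarketingDB;
    ALTER DATABASE MarketingDB_Staging MODIFY NAME = MarketingDB;
    ALTER DATABASE MarketingDB SET READ_ONLY;
    ```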

    You could also look at Always On Availability Groups and use a readable secondary for reporting, so you would have a completely up-to-date copy to report against.
