What are the best ways to create an AG and do failovers with 8TB databases?

  • Hello.

    I've been working with AGs for a few years now. However, in my latest position I am working with a company database that is 8TB.

    Are there any best practices, or anything you should do differently, when working with large databases and performing AG failovers?

    I usually add the database to the AG, and it then creates the database on the replicas and starts seeding them (roughly the steps sketched below). What happens when the database is extremely large? Are the best-practice steps still the same?
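
    For reference, a minimal sketch of those usual steps, with made-up names ([MyAG], [BigDB], 'SQLNODE2') standing in for ours:

    -- On each secondary replica: allow the AG to create the seeded database
    ALTER AVAILABILITY GROUP [MyAG] GRANT CREATE ANY DATABASE;

    -- On the primary: make sure the replica is set to seed automatically
    ALTER AVAILABILITY GROUP [MyAG]
        MODIFY REPLICA ON 'SQLNODE2' WITH (SEEDING_MODE = AUTOMATIC);

    -- On the primary: add the database; seeding to the replicas starts here
    ALTER AVAILABILITY GROUP [MyAG] ADD DATABASE [BigDB];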

    Thanks for the thoughts on how to work with large dbs and AGs.


    Things will work out.  Get back up, change some parameters and recode.

  • I've not done anything that big before, but there are tools that make failovers nice and easy without the need for AGs. We use DxEnterprise for our failovers, and it is nice because it fails the disks over along with the services and has things back up in minutes, or however long it takes SQL Server to start. We have had surprise failovers during the day and the business hardly noticed for a 500 GB database. I believe you can set up DxEnterprise to use AGs instead of disk/service swaps between hosts, but we have never used it that way.

    From my understanding of AGs, though, the slowness happens during the initial sync; once a replica is synced, as long as the data flow can keep up with the changes, you are only syncing changes. So the size of the DB doesn't really come into play after the initial sync, but the initial sync is slower the larger the database. If you do let it seed, you can watch the progress with a query like the one sketched below.
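
    A sketch of a progress query, untested by me, assuming SQL Server 2016+ automatic seeding (the sys.dm_hadr_physical_seeding_stats DMV only returns rows while a seeding operation is in flight):

    -- Run on the primary or a secondary while automatic seeding is running
    SELECT local_database_name,
           remote_machine_name,
           transferred_size_bytes / 1048576.0         AS transferred_mb,
           database_size_bytes / 1048576.0            AS database_mb,
           transfer_rate_bytes_per_second / 1048576.0 AS mb_per_sec,
           start_time_utc,
           estimate_time_complete_utc
    FROM sys.dm_hadr_physical_seeding_stats;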

    The above is all just my opinion on what you should do.
    As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are any negative side effects before making changes to live!
    I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.

  • Yes sir.  That is why I am trying to plan this out.

    This could turn into a 7-hour job, and I wanted to try to avoid that downtime.

    I am thinking of setting up log shipping to initially get the data to the secondary and then add it to the AG (something like the sketch below).
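
    Roughly what I have in mind for the cutover, as a sketch with placeholder names and paths ([MyAG], [BigDB], the backup share), once log shipping has the secondary restored up to a recent log backup:

    -- On the secondary: restore the full backup, then keep applying log backups
    RESTORE DATABASE [BigDB] FROM DISK = N'\\backupshare\BigDB_full.bak' WITH NORECOVERY;
    RESTORE LOG [BigDB] FROM DISK = N'\\backupshare\BigDB_log_001.trn' WITH NORECOVERY;

    -- On the primary: add the database to the AG
    ALTER AVAILABILITY GROUP [MyAG] ADD DATABASE [BigDB];

    -- On the secondary: join the restored copy instead of reseeding 8TB over the wire
    ALTER DATABASE [BigDB] SET HADR AVAILABILITY GROUP = [MyAG];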


    Thanks.


  • I think log shipping should work. I found a blog post where someone did exactly that with a 60TB database:

    https://johnsterrett.com/2015/08/18/adding-a-vldb-database-to-an-sql-server-availability-group-in-60-seconds/

    And they had basically no downtime!


  • Thanks all.

