Upgrading tables without downtime

  • Hi,

    We are in the process of a larger upgrade of an existing database.

    Basically, we are normalizing some of the tables, which means we need to copy data from the old table(s) to the new table(s).

    The data migration is planned to be done from within an external application that reads from the old tables and writes the data to the new tables.

    We expect the migration to take at least a couple of hours.

    It would be one thing if we could just read the old data into the new tables, but as the old tables are also constantly being read from and updated, we need a way to keep track of those changes.

    One thing I have thought of is using update triggers, but I haven't gotten around to taking a closer look at that option.

    Another thought I have is to add one or more columns (like a boolean) to the existing tables, set a "flag" if a row has been updated after I have read the data in the migration process, and then go back and take any actions needed.
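    Just to make those two ideas a bit more concrete (thinking out loud here), I'm picturing something roughly like this. Table and column names are made up, not our real schema, and I haven't tried any of it yet:

        -- Flag column: 0 = not yet copied (or changed since it was copied), 1 = copied
        ALTER TABLE dbo.OldCustomer
            ADD CopiedToNew bit NOT NULL
                CONSTRAINT DF_OldCustomer_CopiedToNew DEFAULT (0);
        GO

        -- If a user updates a row after it has been copied, clear the flag again
        -- so the migration process knows it has to revisit that row.
        CREATE TRIGGER trg_OldCustomer_MarkDirty
        ON dbo.OldCustomer
        AFTER UPDATE
        AS
        BEGIN
            SET NOCOUNT ON;

            -- Ignore the migration process's own updates to the flag itself
            IF UPDATE(CopiedToNew) RETURN;

            UPDATE o
               SET CopiedToNew = 0
              FROM dbo.OldCustomer AS o
              JOIN inserted AS i
                ON i.CustomerId = o.CustomerId
             WHERE o.CopiedToNew = 1;
        END;
        GO

        -- Each pass of the migration process then only looks at rows where
        -- CopiedToNew = 0, and sets the flag to 1 once a row has been handled.
        SELECT CustomerId
          FROM dbo.OldCustomer
         WHERE CopiedToNew = 0;

    New rows would get the default of 0, so they would be picked up by a later pass as well.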

    Meanwhile, our customers ask that they at least have read access to our application, which is a web application.

    I think we can make the web application read-only with some effort.

    But how do you manage scenarios like this without any or minimal downtime?

    As I understand it, only one SQL instance can be the owner of the data files, hence a cluster wouldn't help in this scenario? Or ... ?

    We are currently on SQL Server 2012 Web Edition, no cluster or HA, on premises (VMware actually).

    But if you find a good reason to move to a later version or another edition, that should not be a problem, as we have that in the pipeline anyway. At least a later version.

    As this is a smaller company, I'm both the app developer and the SQL DBA (or really neither of the above 🙂), and this is why I'm turning to this forum for suggestions and hints on best practices.

    (Like I've never seen Facebook announce downtime due to an upgrade)

    If anything is unclear please don't hesitate to ask for clarification.

    Cheers

    Mikael

  • There are patterns to deal with this, and if you have some flexibility, you can implement them. Since you need to deal with changes and updates, this can be complex.

    Do you insert to the tables with specific column lists, meaning that if there is a new column, the application won't break?

    How many tables are there, and is your plan to move all the data and then change to a new version of the app that sees the new tables?

    Is this just one application that accesses the data?

    The more control and the fewer places that will touch the data, the easier this is to do. I would look at moving some of the data over time, with small changes to queries that will adjust to new tables/columns over time. You definitely should read up on triggers and use those to keep things in sync as you slowly migrate data around.

    Why use an external application to move data?

  • Do you insert to the tables with specific column lists, meaning that if there is a new column, the application won't break?

    [Yes]

    How many tables are there, and is your plan to move all the data and then change to a new version of the app that sees the new tables?

    [8 existing tables will become 4 (or 3, depending on how you look at it) new tables.]

    Is this just one application that accesses the data?

    [One public web site and then a few internal Windows apps for administration.]

    The more control and the fewer places that will touch the data, the easier this is to do.

    [Agree. And this is also why I'm considering using an "external app" (see below).]

    I would look at moving some of the data over time, with small changes to queries that will adjust to new tables/columns over time.

    You definitely should read up on triggers and use those to keep things in sync as you slowly migrate data around.

    [Wilco]

    Why use an external application to move data?

    [By external application I mean an application that I write. This is to make it easier to handle all the If - ElseIf - etc. and also for easier debugging while we migrate the data, instead of writing giant SQL scripts. We also need to clean up much of the existing data during the migration process.]

    Many thanks for your feedback, Steve!

    If the migration would only take 2 hours and they only need read access during the conversion:

    1. Make the system so it can only read the database.
    2. Take a backup
    3. Restore the backup to a different database and run the conversion scripts
    4. On successful conversion restore the converted backup over the existing database.
    5. Make the system read-writable.

    Just make sure you've properly tested the conversion script so it runs successfully; if it doesn't, you just need to make the existing database writable again.
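    In rough T-SQL terms, those steps might look something like this. The database name, file paths and logical file names below are placeholders you'd have to adjust to your environment:

        -- 1. Only allow reads against the existing database
        ALTER DATABASE MyAppDb SET READ_ONLY WITH ROLLBACK IMMEDIATE;

        -- 2. Take a backup
        BACKUP DATABASE MyAppDb
            TO DISK = N'D:\Backup\MyAppDb_PreMigration.bak' WITH INIT;

        -- 3. Restore the backup as a separate database and run the conversion there
        RESTORE DATABASE MyAppDb_Convert
            FROM DISK = N'D:\Backup\MyAppDb_PreMigration.bak'
            WITH MOVE 'MyAppDb'     TO N'D:\Data\MyAppDb_Convert.mdf',
                 MOVE 'MyAppDb_log' TO N'D:\Data\MyAppDb_Convert_log.ldf';
        -- if the copy comes up read-only (the backup was taken while read-only),
        -- open it up before running the conversion against it
        ALTER DATABASE MyAppDb_Convert SET READ_WRITE;
        -- ... run the conversion against MyAppDb_Convert ...

        -- 4. On success, back up the converted database and restore it over the original
        --    (make sure nothing is still connected to MyAppDb at this point)
        BACKUP DATABASE MyAppDb_Convert
            TO DISK = N'D:\Backup\MyAppDb_Converted.bak' WITH INIT;
        RESTORE DATABASE MyAppDb
            FROM DISK = N'D:\Backup\MyAppDb_Converted.bak'
            WITH REPLACE,
                 MOVE 'MyAppDb'     TO N'D:\Data\MyAppDb.mdf',
                 MOVE 'MyAppDb_log' TO N'D:\Data\MyAppDb_log.ldf';

        -- 5. Make sure the database is writable again
        ALTER DATABASE MyAppDb SET READ_WRITE;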

    Thanks for the input, Jonathan!

    Yes, we will need to do extensive testing before the live migration.

    And also test performance to make sure we have at least the basic indexes in place.

    I had a quick thought of just pointing to the upgraded database once the conversion is successful, to save time, but then I would have to change all the connection strings and also make sure all backup scripts and such are in place. Backup and restore will add another hour or so to the process, both before and after the migration. I'm sure I will revisit that thought and see what's viable.

    Cheers

  • Hi everyone,

    This is an update to let you know what path we took and how it turned out.

    First of all, I'm sorry for the delay in letting you know the outcome.

    I myself don't like it when the outcome is not posted back to the question, as you then don't know whether the suggestions helped or not.

    You are all spending time trying to help and you should of course get credit for that.

    We went for the trigger option and that turned out very well.

    We did the migration in mid-December and we can now say that it was successful.

    What we anticipated would be a couple of hours running our script (well, a console app) turned out to be more like a couple of days to get the data moved to the new tables and clean up the old tables.

    Basically we added a new column to the "old" tables with a bit value.

    Then the "script" updated that column once processed.

    If a user then updated a row in the old table after the "script" had run, the trigger took note of that and we saved off some values in a temporary table that the "script" could query in its next run to check whether anything should be done with that "dirty" data.

    This way we could start the migration a couple of weeks before the switch, having it run at night when the load was low.

    Of course we tried it out a couple of times and had to tweak the "script", meaning we had to start all over, but as that was just one column to reset in the old tables it wasn't that bad.

    Then, when it was time to do the switch, we only had to bring the database down for a very short period of time and run the "script" one last time to update any remaining rows that had recently been updated by users.
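    In case it helps someone else, the moving parts looked roughly like this. This is simplified, with made-up table and column names rather than our real schema:

        -- Flag on each "old" table; the console app sets it once a row has been migrated
        ALTER TABLE dbo.OldOrder
            ADD Processed bit NOT NULL
                CONSTRAINT DF_OldOrder_Processed DEFAULT (0);
        GO

        -- Side table where the trigger parks rows that change after they were processed
        CREATE TABLE dbo.OldOrder_Dirty
        (
            OrderId   int          NOT NULL,
            ChangedAt datetime2(0) NOT NULL
                CONSTRAINT DF_OldOrder_Dirty_ChangedAt DEFAULT (SYSDATETIME())
        );
        GO

        CREATE TRIGGER trg_OldOrder_TrackChanges
        ON dbo.OldOrder
        AFTER UPDATE
        AS
        BEGIN
            SET NOCOUNT ON;

            -- Ignore the console app's own bookkeeping updates to the flag
            IF UPDATE(Processed) RETURN;

            -- Remember already-processed rows that a user has touched,
            -- so the next run of the console app can revisit them.
            INSERT INTO dbo.OldOrder_Dirty (OrderId)
            SELECT i.OrderId
              FROM inserted AS i
             WHERE i.Processed = 1;
        END;
        GO

        -- Each nightly run (and the final run during the short downtime window)
        -- then picks up both unprocessed rows and "dirty" ones; the app removes
        -- a row from OldOrder_Dirty once it has re-migrated it.
        SELECT OrderId FROM dbo.OldOrder WHERE Processed = 0
        UNION
        SELECT OrderId FROM dbo.OldOrder_Dirty;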

    So thank you for your input and have a nice weekend!

    Mikael

     
