Script to make 2 databases consistent

  • Hi Experts,

    Need some help here.

Problem description: I'm looking for ways to make two databases consistent/in sync. Both are Azure PaaS databases.

I have one prod database and one dev database. Following a sprint model, new changes are made to the databases (new tables, new indexes, create/drop index, adding a stored procedure or function, changing a datatype size, changing a column's datatype, etc.). The issue is that production lags three releases/deployments behind Dev/QA. Continuous delivery keeps pushing new releases to Dev and QA, but prod stays three releases behind.

The team has Entity Framework database migration scripts that eventually get applied to the databases.
In addition, some scripts are run manually when a migration fails (for example, adding new columns, or changing a datatype from nchar(26) to uniqueidentifier).

The problem is that when they start applying the new migration scripts to prod, existing objects cause errors, and I don't know whether the .NET EF model will handle that.
Since there are inconsistencies between the PROD copy and the Dev copy, the customer is basically looking for a SQL script that can bring both databases into a consistent state.
The script should also handle cases such as: if an object already exists, skip it rather than failing the entire migration; compare columns and datatypes; verify that data is unchanged after a datatype change; confirm that table row counts and column counts match; and check that indexes, clustered index columns, and primary key columns are the same.
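As a sketch of the "skip it if it already exists" requirement, each DDL statement can be wrapped in an idempotent guard in T-SQL. The object, column, and database names below (dbo.Orders, CreatedAt, ProdDb, DevDb) are placeholders, not from the actual schema:

```sql
-- Create the table only if it does not already exist
IF OBJECT_ID(N'dbo.Orders', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.Orders
    (
        OrderId  int IDENTITY(1,1) NOT NULL CONSTRAINT PK_Orders PRIMARY KEY,
        OrderRef nchar(26) NOT NULL
    );
END;

-- Add a column only if it is missing
IF COL_LENGTH(N'dbo.Orders', N'CreatedAt') IS NULL
    ALTER TABLE dbo.Orders ADD CreatedAt datetime2 NOT NULL
        CONSTRAINT DF_Orders_CreatedAt DEFAULT SYSUTCDATETIME();

-- Column name/datatype comparison between two databases. Note that Azure
-- SQL Database does not allow plain cross-database queries, so in practice
-- you would export each result set and diff the two lists instead.
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM   INFORMATION_SCHEMA.COLUMNS   -- run on ProdDb, save the output
EXCEPT
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM   INFORMATION_SCHEMA.COLUMNS;  -- run on DevDb, save the output
```

This only covers existence and column checks; row-count and index comparisons would need similar queries against sys.indexes and per-table COUNT(*) queries.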

Option 1: Dump all table data into permanent staging tables in an intermediate database, or into .txt files using bcp or SSIS; then generate the latest schema, create a blank database, apply the schema, and load the data back into the new tables using bcp/BULK INSERT/SSIS (or from the intermediate database tables).
What are the drawbacks of this approach? They can't afford to have SSIS installed on the server, possibly due to cost, so what are the pitfalls? Also, I don't know the prod row counts at this point.
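For reference, the bcp round trip in option 1 might look like the following. Server, database, table, and login names are placeholders, and note that against Azure SQL Database bcp runs from a client machine, so nothing needs to be installed on the server itself:

```
# Export one table to a native-format file (repeat per table)
bcp dbo.Orders out orders.dat -S yourserver.database.windows.net -d ProdDb -U adminuser -P "<password>" -n

# Load it into the freshly created schema in the new database
# (-E preserves identity values from the export)
bcp dbo.Orders in orders.dat -S yourserver.database.windows.net -d NewDb -U adminuser -P "<password>" -n -E
```

The main pitfalls of this route are scripting it per table, getting foreign-key load order right, and the downtime window while data is out of the database.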

Option 2: Use a dacpac or bacpac. Some of my colleagues are advising this Database Project -> dacpac approach. These databases are PaaS databases. Has anyone tried this approach?
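A rough sketch of the dacpac workflow with the SqlPackage utility, which does support Azure SQL Database; server and database names are placeholders:

```
# Extract the schema of the up-to-date Dev database into a dacpac
SqlPackage /Action:Extract /SourceServerName:yourserver.database.windows.net /SourceDatabaseName:DevDb /TargetFile:DevDb.dacpac

# Generate a deployment script against prod first, so the changes can be reviewed
SqlPackage /Action:Script /SourceFile:DevDb.dacpac /TargetServerName:yourserver.database.windows.net /TargetDatabaseName:ProdDb /OutputPath:upgrade_prod.sql

# Then publish: applies incremental changes, leaving objects that already match alone
SqlPackage /Action:Publish /SourceFile:DevDb.dacpac /TargetServerName:yourserver.database.windows.net /TargetDatabaseName:ProdDb
```

Note that a dacpac compares and deploys schema only; a bacpac includes data but is an export/import of a whole database, not an incremental sync.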

Option 3: Can we do this using a trial version of Redgate SQL Compare?

I'm looking for options with minimal manual work that also cover all the conditions above (same structure, same data, same columns, same datatypes, same constraints, same indexes, etc.).

The end goal is to bring both Prod and Dev/QA into sync.

Please advise.

    Thanks,

    Sam

  • vsamantha35 - Wednesday, June 20, 2018 5:15 AM


Restore the production database to the lower environments regularly. The developers then run the change scripts for whatever they are working on, or for whatever state each environment needs to be in that day - for example, version 1234 deployed to Development, version 12 deployed to QA. It's pretty easy to set up a job that does the restore and any needed permission changes. The incremental day-to-day changes are typically in source control, so they are simply run in order against a restored copy of the production database.
Prior to deploying to production, run the deployment against a newly restored copy of production in a lower environment. Fix any issues that come up with the deployment, testing, QA, etc. in that environment. Then run the restore job again and have the team test the fixed scripts there, repeating until the deployment succeeds in the lower environment.
After that, you should rarely run into deployment failures in production - rare enough that you would likely just roll back the changes when it happens. You can do the same with dacpacs if you want; the rollback process would be a little different.
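Since these are Azure PaaS databases, there is no RESTORE from a .bak file, but the refresh-the-lower-environment job described above can be approximated with a database copy. The database names here are placeholders:

```sql
-- Run in master on the target logical server. This creates a
-- transactionally consistent copy of prod for dev/QA to deploy against.
CREATE DATABASE ProdDb_DevRefresh AS COPY OF ProdDb;

-- Once the copy completes, drop the previous stale copy if one exists:
-- DROP DATABASE ProdDb_DevRefresh_Old;
```

The copy is asynchronous; progress can be checked in sys.dm_database_copies before pointing testers at the new copy.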

    Sue
