Multi-Script, Multi-DB Deployments

  • Comments posted to this topic are about the item Multi-Script, Multi-DB Deployments

  • Hello,

    I might have read the article too quickly, so I apologize if I missed some information.

    The methodology you are taking is very similar to the one in place at my work.

    However, in the real world, it is almost impossible to just let everything run via sqlcmd, except for minor updates.

    Like a lot of companies, we have four to five environments (dev, staging, integration/pre-release, production), and the environment-specific variables are usually what make things go wrong.

    Everything can be turned into variables, no question. I love sqlcmd, do not get me wrong.

    In the best of worlds, developers would never mess up their code, but they do all the time.

    Second point: you apparently leave out the rollback script. You should be able to roll back as soon as the scripts are deployed, to check that the rollback scripts work. Then, after a day, you can decide to roll back if you see that performance is poor or that your applications show unexpected behavior.

    Finally, I do not understand why you use xp_cmdshell. You can put your sqlcmd line directly in DOS, and you can just change the line within the first step of your agent job.

    I am guessing xp_cmdshell is useful if you want to record, in some sort of database repository, all the scripts that you ran. Still, you should activate and then deactivate xp_cmdshell, as you would otherwise keep a security hole in place.

    Clement

  • This is almost identical to our process. In addition, I require our developers to ensure the scripts can be re-run without impacting the database or requiring a back-out first. In the development environment we've automated testing by using Hudson to start a job that gets updated scripts from Subversion, restores the database from a snapshot taken before any scripts were applied, and then executes the command file that runs the deployment scripts. When the process is done, Hudson starts the database unit tests, which then start the application unit tests. We have to strip the snapshot of all data not related to tests to reduce the execution time.
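
    A minimal sketch of that snapshot revert step, assuming a snapshot named AppDb_Baseline was created before any scripts ran (the database and snapshot names are placeholders, not from this post):

    -- Revert the test database to its pre-deployment snapshot so the
    -- deployment scripts always run from a known-clean state.
    USE master;
    GO
    RESTORE DATABASE AppDb
    FROM DATABASE_SNAPSHOT = 'AppDb_Baseline';
    GO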

  • We have a similar concept but in our case the developer uses some Visual Basic code to build up a single change script. Each individual script is bracketed with a comment header that has the file name & last-modified date, and a footer with a GO statement to prevent any conflict between the scripts.
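
    As a rough illustration (the file name and date below are invented placeholders), each bracketed section of the generated change script looks something like this:

    /* ===== Begin: usp_GetOrders.sql (last modified 2010-08-20) ===== */
    -- body of the individual script goes here
    /* ===== End: usp_GetOrders.sql ===== */
    GO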

    On one occasion, a junior DBA applied a QA script to the production db, so we also added this to the top of the change script:

    IF NOT EXISTS (SELECT 1
                   FROM sys.sysdatabases
                   WHERE [name] = 'ABC')
        RAISERROR ('INCORRECT DATABASE', 21, 1) WITH LOG

    which, at severity 21, will terminate the connection to the server.

  • The one thing that I see missing from this process is any hint that you're using some type of source control. As a matter of fact, I think a lot of your process, the naming & storage standards in particular, is built up because you're not using source control. At the root of it, this process is not dissimilar to the one we use, but because we have all our scripts stored in source control, in general, all we have for deployment is a listing of which scripts, from source control, by label, need to be run. This allows us to integrate our database deployments very tightly with the application deployments so that for any given application release, we have a well matched database release.

    Also, I'd suggest changing your QA process so that you only ever do full deployments to it. If you build some type of production rollback process, so that prior to a deployment you reset the QA databases to match what production looks like, you'll get a better test of the deployments before you go to production. That will help eliminate a lot more of the errors that creep in because of people introducing changes to scripts outside of the process. This also would be helped by having the scripts in source control.
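
    A minimal sketch of such a QA reset, assuming a recent production backup is available; every database, file, and path name here is a placeholder:

    -- Overwrite the QA copy with the latest production backup so the
    -- deployment is rehearsed against a production-equivalent starting point.
    RESTORE DATABASE AppDb
    FROM DISK = '\\backups\AppDb_prod.bak'
    WITH MOVE 'AppDb_Data' TO 'D:\Data\AppDb.mdf',
         MOVE 'AppDb_Log'  TO 'E:\Logs\AppDb.ldf',
         REPLACE;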

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • clementhuge (8/23/2010)


    Hello,

    I might have read the article too quickly, so I apologize if I missed some information.

    The methodology you are taking is very similar to the one in place at my work.

    However, in the real world, it is almost impossible to just let everything run via sqlcmd, except for minor updates.

    Like a lot of companies, we have four to five environments (dev, staging, integration/pre-release, production), and the environment-specific variables are usually what make things go wrong.

    Everything can be turned into variables, no question. I love sqlcmd, do not get me wrong.

    In the best of worlds, developers would never mess up their code, but they do all the time.

    Second point: you apparently leave out the rollback script. You should be able to roll back as soon as the scripts are deployed, to check that the rollback scripts work. Then, after a day, you can decide to roll back if you see that performance is poor or that your applications show unexpected behavior.

    Finally, I do not understand why you use xp_cmdshell. You can put your sqlcmd line directly in DOS, and you can just change the line within the first step of your agent job.

    I am guessing xp_cmdshell is useful if you want to record, in some sort of database repository, all the scripts that you ran. Still, you should activate and then deactivate xp_cmdshell, as you would otherwise keep a security hole in place.

    I appreciate the feedback!

    I wasn't targeting the rollback as part of the article, but trying to show a way that we execute multiple updates against multiple databases. We ensure that the scripts have already been executed in the lower environments and proven by the time they get to the production release. Certainly, a second SQL Agent job could be utilized to handle the rollback in a single unit of deployment.

    We utilize xp_cmdshell so that we can keep everything self-contained within SQL Server. When you say "change the line within the first step of your agent job," can you elaborate? In the job step that invokes xp_cmdshell and calls sqlcmd, I am enabling xp_cmdshell and then returning its state to what it was after sqlcmd execution is complete. If it was disabled, it goes back to disabled.
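
    A minimal sketch of that enable/run/restore pattern, assuming it runs in a T-SQL job step (the instance name and script path are placeholders):

    -- Capture the current xp_cmdshell setting before touching it.
    DECLARE @was_enabled int;
    SELECT @was_enabled = CONVERT(int, value_in_use)
    FROM sys.configurations
    WHERE name = 'xp_cmdshell';

    -- Enable it only if it was off (advanced options must be visible to change it).
    IF @was_enabled = 0
    BEGIN
        EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
        EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
    END

    EXEC xp_cmdshell 'SQLCMD -S"Instance1" -E -dmaster -i"\\path\01_update.sql"';

    -- Return xp_cmdshell to its prior state so no security hole is left open.
    IF @was_enabled = 0
    BEGIN
        EXEC sp_configure 'xp_cmdshell', 0; RECONFIGURE;
    END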

  • rottensystem (8/23/2010)


    PragmaSQL (CodeProject winner of 2010 in the Best General Database Tool category) can handle script execution on multiple databases with a single click.

    Please check out http://www.pragmasql.com for details. You can install the Personal edition for free; for multiple-database execution you will have to purchase the Professional edition, though.

    Thanks - I'll take a look at this.

  • Grant Fritchey (8/23/2010)


    The one thing that I see missing from this process is any hint that you're using some type of source control. As a matter of fact, I think a lot of your process, the naming & storage standards in particular, is built up because you're not using source control. At the root of it, this process is not dissimilar to the one we use, but because we have all our scripts stored in source control, in general, all we have for deployment is a listing of which scripts, from source control, by label, need to be run. This allows us to integrate our database deployments very tightly with the application deployments so that for any given application release, we have a well matched database release.

    For this, I agree with you, but we (for better or worse) leave the source control up to our development teams. Since my team is primarily production support, we probably don't store scripts in the best way possible. With this folder layout, I was simply trying to provide a way that would make sense for purposes of the example.

    Also, I'd suggest changing your QA process so that you only ever do full deployments to it. If you build some type of production rollback process, so that prior to a deployment you reset the QA databases to what production looks like, you'll get a better test of the deployments before you go to production. That will help to eliminate a lot more errors that creep in because of people introducing changes to script outside of the process. This also would be helped by having the scripts in source control.

    Generally speaking, we do go with the full deployment in a lower environment tested against a copy of production. On occasion, we have last-minute updates come in, and we simply append them to the release because we already know the previous scripts were successful, so we only need to verify the implementation and rollback for the additional updates. Instead of trying to rebuild the full release, adding the implementation and rollback for the added changes logically works in our organization, but it may not in yours.

    Thanks for taking the time to check out the article. 🙂

  • This is also a good way to deploy scripts to multiple servers... though the best way is SSIS.

    Ashwin VP
    CSC India ...

  • Thanks for your feedback on my two remarks.

    I think we are on the same page, as you enable and then disable xp_cmdshell.

    When I said "within the line," I was referring to the fact that you can have a DOS step instead of a T-SQL step in your agent job.

    Clement

  • clementhuge (8/23/2010)


    Thanks for your feedback on my two remarks.

    I think we are on the same page, as you enable and then disable xp_cmdshell.

    When I said "within the line," I was referring to the fact that you can have a DOS step instead of a T-SQL step in your agent job.

    Clement

    Thanks! I'll take a look at that. I haven't tested that approach, but it would certainly eliminate the xp_cmdshell call, as you recommended.
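
    As a sketch of that suggestion, the sqlcmd call could run as an Operating System (CmdExec) job step, removing the need for xp_cmdshell entirely. The job name, step name, and paths below are placeholders, and the job itself is assumed to already exist:

    USE msdb;
    GO
    -- Add the sqlcmd call as a CmdExec job step instead of a T-SQL step.
    EXEC dbo.sp_add_jobstep
        @job_name  = N'Multi-DB Deployment',
        @step_name = N'Run deployment scripts',
        @subsystem = N'CmdExec',
        @command   = N'sqlcmd -S "Instance1" -E -d master -i "\\path\01_update.sql" -o "\\path\script_output.txt"';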

  • I have something similar to do, but my scenario is a little different; any suggestion would be highly appreciated. Let's say I need to deploy a stored proc; it has to be done across multiple servers, within different databases (the names are also different). For example, I need to deploy usp_test on server 1 (db1, db2, db3...), server 2 (db4, db5, db6...), and so on. I know third-party tools can do this, but I'm looking for a built-in way. By the way, we do have a control server which has linked servers to all our prod servers. Thanks

  • Thanks for taking a look at the article!

    iqtedar (8/23/2010)


    I have something similar to do, but my scenario is a little different; any suggestion would be highly appreciated. Let's say I need to deploy a stored proc; it has to be done across multiple servers, within different databases (the names are also different). For example, I need to deploy usp_test on server 1 (db1, db2, db3...), server 2 (db4, db5, db6...), and so on. I know third-party tools can do this, but I'm looking for a built-in way. By the way, we do have a control server which has linked servers to all our prod servers. Thanks

    If you look at the example SQL Server Agent Job in the attachment, you'd be able to accomplish that by making multiple calls to sqlcmd and by changing the -S parameter in each subsequent call.

    for example:

    --run against instance 1
    EXEC xp_cmdshell 'SQLCMD -S"Instance1" -E -dmaster -i"\\path\01a_update_Instance1.sql" -o"\\path\script_output_Instance1.txt"'

    --run against instance 2
    EXEC xp_cmdshell 'SQLCMD -S"Instance2" -E -dmaster -i"\\path\02a_update_Instance2.sql" -o"\\path\script_output_Instance2.txt"'
    GO

    Your 01a_update_Instance1.sql would reference the update scripts containing "USE [DB1]" at the top of them on Instance1, and 02a_update_Instance2.sql would then reference the update scripts containing "USE [DB2]" at the top of them, each set of scripts being geared toward the specific instance you are running it against.

    To use the exact same script, you could likely change the -d parameter from master to your databases on each instance and remove the USE statements at the top of each script.

    for example:

    --run against instance 1
    EXEC xp_cmdshell 'SQLCMD -S"Instance1" -E -d"DB1" -i"\\path\update_stored_proc.sql" -o"\\path\script_output_Instance1.txt"'

    --run against instance 2
    EXEC xp_cmdshell 'SQLCMD -S"Instance2" -E -d"DB2" -i"\\path\update_stored_proc.sql" -o"\\path\script_output_Instance2.txt"'
    GO

    Hope that helps!

  • Thanks, S.K., for the follow-up. I haven't looked at the scripts yet, but I want the USE [dbname] to be dynamic; something like it should get the database name by itself and use it in "USE [dbname]". Thanks

  • I am trying to test this tool; what is the Product Code Name? I would like to run DDLs and DMLs across different servers on different databases. Please advise. Thanks

