A Simple way of Automating Scripts

  • Very long discussion... Visual Studio was mentioned just once I'd say...

    Why not use the Visual Studio Database project functionality (with any version control system)?

    Use the normal build functionality, get the *.DBSchema file, and deploy it with the VSDBCMD utility?

  • For me, SSDT is one of the great things in 2012... looking forward to the release next year 🙂

  • I mentioned Visual Studio 2010 Ultimate or Premium Editions.

    It has everything you need now - don't wait for 2012 :-)

    The only problem - no multi-server deployment tool

  • Cost is the usual reason given for not using VS database projects.

  • gchornenkyy (12/30/2011)


    I mentioned Visual Studio 2010 Ultimate or Premium Editions.

    It has everything you need now - don't wait for 2012 :-)

    The only problem - no multi-server deployment tool

    Db deployment was made available in VS 2008 Database Edition.

    Multiserver deployment and Azure deployment are coming in 2012.

  • Martin Bell-290511 (12/30/2011)


    Cost is the usual reason given for not using VS database projects.

    With 2008, usually not a problem if you do it right, meaning that you keep your projects in TFS and deploy from TFS.

    The big f***up is that 2010 does not support BI and that you cannot do deployments of cubes and reports from the same version of VS.

  • Adding TFS to the equation will increase the cost!

  • Martin Bell-290511 (1/3/2012)


    Adding TFS to the equation will increase the cost!

    Not if you cost it over, say, a year, and factor in that TFS is a management tool, a key part of the test environment, and at the same time a deployment tool.

  • We use something very similar where I work. The key difference for us is that we have a table in the database to record which scripts have run. We execute a batch file in our script directory and it runs the SQL files in that directory in list order (we prefix filenames with a descending date format; multiples on one day would be 20120103a, b, and so on). Each SQL file consists of two parts: the wrapper SQL and the payload SQL. The wrapper SQL executes a stored procedure to check whether the script has already been run in the db. If it has not, the payload is executed and the script name is inserted into the database table. We also write the output of the cmd window to a text file. (A rough sketch of the wrapper pattern is at the end of this post.)

    The main benefit of the table in the database is that it essentially 'versions' the database. We have dozens of environments in development and QA, and developers will restore newer backups at their own pace. This means all these databases can be from various points in time with scripts run up to the point the DB was backed up/sanitized from production. Since we draw from a common SQL folder in each branch, this allows us to automate running the scripts after any db restore, regardless of the date of the db. After restoring a db, just get latest code and execute the batch script. We archive SQL scripts quarterly in the script folder so a developer would need to update their db at that rate, which is quite reasonable.

    It should be noted that we do not use this process for production deployment; we have a release DBA execute the scripts in order manually, although there is no reason this process could not be used in production. Although we run the batch repeatedly on restored test dbs prior to a release, things can happen in a moving target like production (moving with respect to data), and we prefer to slow things down with single script execution. We have a GUI in our application where we can monitor the executed-scripts table to verify that the DBA is running them in order.

    The one hiccup we had recently was related to updating some strings that contained some JavaScript syntax (with @ signs, I believe) that SQLCMD was treating like a variable. Since our scripts in production are run via SSMS, we could not use SQLCMD's escape mechanism, as the two are not compatible. We ended up breaking the updates into a couple of parts using concatenation of existing column values.

    There's probably a slicker way to do parts of what I have described. I'm a developer, not a DBA, though with many SQL Server installations the developer gets to wear that hat a little, and not always by choice.
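
    In rough terms the wrapper looks like this. This is only a simplified sketch: the table, column, and script names are illustrative, and we actually put the existence check behind a stored procedure rather than inlining it as shown here.

    -- Tracking table that records which scripts have already run
    IF OBJECT_ID(N'dbo.ScriptHistory') IS NULL
        CREATE TABLE dbo.ScriptHistory
        (
            ScriptName NVARCHAR(260) NOT NULL PRIMARY KEY,
            ExecutedOn DATETIME      NOT NULL DEFAULT (GETDATE())
        );
    GO

    -- Wrapper: run the payload only if this script has not been recorded yet
    IF NOT EXISTS (SELECT 1 FROM dbo.ScriptHistory
                   WHERE ScriptName = N'20120103a_AddStatusColumn.sql')
    BEGIN
        -- Payload goes here, e.g. an ALTER TABLE or a data fix

        -- Record the script so a later run of the batch skips it
        INSERT INTO dbo.ScriptHistory (ScriptName)
        VALUES (N'20120103a_AddStatusColumn.sql');
    END
    GO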

  • Would this also work for SQL Express databases?

    Kindest Regards, Rod

  • Sadiq,

    could you let me know where to put the name of the database I am running the scripts against? Also, will it work on SQL 2008?

  • I get an error on executing the script and I am not able to find it. I was looking for this kind of script and now I am not able to use it because of the error.

    Msg 245, Level 16, State 1, Line 59

    Conversion failed when converting the nvarchar value 'usp_GetListOwner_AZ00' to data type int.

    Please help me fix this.

    Thank you

  • pamalatha (3/7/2012)


    I get an error on executing the script and I am not able to find it. I was looking for this kind of script and now I am not able to use it because of the error.

    Msg 245, Level 16, State 1, Line 59

    Conversion failed when converting the nvarchar value 'usp_GetListOwner_AZ00' to data type int.

    Please help me fix this.

    Thank you

    I too came to the end of this discussion looking for the answer to the above quandary. I reread the article and found I had missed an assumption. Posting in case someone else did too (there is a rough sketch of why this causes the Msg 245 error at the end of this post).

    Assumptions and Prerequisites

    If you want your scripts to automatically execute in a pre-defined order, the easiest method is to ensure the file names reflect the order of execution. For example, if you have three script files, ScriptA.sql, ScriptB.sql and ScriptC.sql, you would want to name them as

    1.ScriptA.sql

    2.ScriptB.sql and

    3.ScriptC.sql

    Thanks for the script, Sadequl Hussain.

    Out of curiosity, about halfway through the discussion there was a post on using PoSH for this. It seems to be incomplete in that there are no parms, and the output txt files show up as 0-byte empty files. If anyone is feeling generous, I am looking for a way to do what this SQL script does in PowerShell, complete with error/success tracking. I have modified this SQL script for a sproc with parms, but I don't know PoSH yet. Currently xp_cmdshell is enabled on all servers in my environment, but that may not always be true.

    TIA
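
    Back to the Msg 245 error quoted above: presumably the article's script takes the number before the first dot in the file name and converts it to an int to work out the execution order, so a file named after the proc itself (with no numeric prefix) makes that conversion fail. Something along these lines, purely as an illustration - this is not the article's exact code:

    DECLARE @files TABLE (FileName NVARCHAR(260));
    INSERT INTO @files VALUES (N'1.ScriptA.sql'), (N'2.ScriptB.sql'), (N'3.ScriptC.sql');

    -- Works only when every file name starts with a numeric prefix;
    -- a name like 'usp_GetListOwner_AZ00.sql' would fail here with Msg 245
    SELECT FileName,
           CAST(LEFT(FileName, CHARINDEX(N'.', FileName) - 1) AS INT) AS RunOrder
    FROM   @files
    ORDER  BY RunOrder;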
