SQLServerCentral Editorial

How can we standardize SQL Server Data Deployments?


As DBAs, we know that our SQL Servers will never remain in the pristine state they were in when we painstakingly built them in our labs, pre-deployment. Inevitably, they will be assaulted by all manner of modifications, flung from afar by the developer’s trebuchet, grounded in the sandbox of staging or test environments.

As the oft-quoted "gatekeepers" of SQL Server, it is our job as DBAs to ensure that we have implemented sound change management and source control procedures that dictate code review processes, consistent rollback plans, solid naming conventions and thorough deployment instructions. With such processes in place, certain change requests, such as deploying stored procedures, functions, triggers and even indexes, become relatively easy to manage, since they can be scripted reproducibly from one environment to another. As gatekeepers, we vigilantly keep watch, only allowing entrance into the fortress when our criteria are met. Using a DROP/CREATE with an IF EXISTS check, rather than an ALTER statement, when deploying SQL Server objects like stored procedures is one example of a standard that helps ensure an error-free deployment.
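
As a minimal sketch of that pattern (the procedure name and body here are purely illustrative), the deployment script might look like this:

-- Drop the procedure if it already exists, then recreate it.
-- Unlike a bare ALTER, this runs cleanly whether or not the object is already present.
IF EXISTS (SELECT 1
           FROM sys.objects
           WHERE object_id = OBJECT_ID(N'dbo.usp_GetOrders')
             AND type = N'P')
    DROP PROCEDURE dbo.usp_GetOrders;
GO

CREATE PROCEDURE dbo.usp_GetOrders
    @CustomerID INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID;
END
GO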

One type of deployment, though, moving data from one environment to another, tends to thwart attempts at standardization, as there are just so many variables that affect how to tackle the problem in any given environment. Here are just a few questions that we as DBAs need to ask in order to narrow down our possible response to the common “Can you move TABLEA from Server1 to TABLEA on Server2?” request from developers.

  • Do you want me to move the table and the data, or just the table?

    • If just the table, please provide a CREATE script.
  • Assuming you want data, do you want all of the data or a subset?

    • If a subset, please provide the criteria via a scripted query.
  • Does the table have an identity column?
  • Does the table have any constraints or dependencies?
  • Does the table have indexes? How many?

    • If so, I presume you would like the indexes moved as well?

      • Please provide the CREATE INDEX script with the CREATE TABLE script.
  • How many records are in the table?

    • If fewer than 50,000, please provide INSERT scripts (a brief sketch follows this list)
    • If more than 50,000… then what?
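
To make the "small table" case concrete, a hypothetical INSERT script for a dbo.TABLEA that has an identity column might look something like this (the table and column names are illustrative only):

-- Preserve the original identity values while loading the rows.
SET IDENTITY_INSERT dbo.TABLEA ON;

INSERT INTO dbo.TABLEA (ID, CustomerName, CreatedDate)
VALUES (1, N'Acme Corp', '20100115');
INSERT INTO dbo.TABLEA (ID, CustomerName, CreatedDate)
VALUES (2, N'Contoso Ltd', '20100203');

SET IDENTITY_INSERT dbo.TABLEA OFF;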

The last question, in particular, introduces the shade of gray that exists between the normal black-and-white delineation of responsibilities between the DBA and the SQL developer. For small tables, a developer may use simple CREATE and INSERT scripts; for larger tables, perhaps the Export/Import wizard in SSMS. DBAs, on the other hand, may have access to tools unknown or unavailable to the developer, such as dedicated third-party tools for comparing and scripting the synchronization of two tables, or bulk copy tools. Without a defined standard in place for moving tables and data, what will most likely happen is that the DBA will be the one to decide “how” to fulfill the request, using their preferred, decidedly non-standard approach (usually the path of least resistance).
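
For the larger tables, one such path of least resistance is the bcp utility. A minimal sketch, with the server, database, table and file names purely illustrative:

:: Export the table in native format from the source server...
bcp SalesDB.dbo.TABLEA out C:\Temp\TABLEA.dat -n -S Server1 -T

:: ...then bulk load it into the target. -E keeps the source identity values,
:: and -b commits the load in 10,000-row batches.
bcp SalesDB.dbo.TABLEA in C:\Temp\TABLEA.dat -n -S Server2 -T -E -b 10000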

So the question is: if a standard for moving table data can be achieved, what is the best way to do it, and who is responsible for making the process smooth, tested and repeatable, the DBA or the developer? I believe this is a case where the DBA needs to work closely with the developer to produce that standard, whatever it may be. If it is decided that the Export/Import wizard will be employed, then it is the responsibility of the developer to document each move so that no question is left unanswered for the DBA who is deploying. I would even expect screenshots of the proposed move to be provided, as evidence that the developer completed the process successfully.

Knowing that an SSIS package is created when using the Export/Import wizard suggests an alternative solution: the developer can build an SSIS package to move the data and provide that package to the DBA. That puts a lot more work on the developer, but it does make for an easier deployment. Of course, a configuration file would need to be used to provide the server and object names. I have even seen SSIS used as a verb in some data move requests, as in “Can you please SSIS over the table and data to production?”
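
A hedged sketch of how that hand-off might look at deployment time, assuming a hypothetical MoveTableA.dtsx package and an environment-specific configuration file (both names are illustrative):

:: Run the developer-supplied package, letting the configuration file
:: supply the production server and object names.
dtexec /File "D:\Deploy\MoveTableA.dtsx" /ConfigFile "D:\Deploy\Production.dtsConfig"

The same package can then be pointed at test or staging simply by swapping the configuration file, which is what makes the deployment repeatable.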

If, on the other hand, a comparison tool becomes the standard deployment solution, then it is still the developer’s responsibility to ensure that the code generated by the tool has been tested.

A standard deployment process is critical in large, publicly traded organizations with mandated restrictions on maintaining production systems. Each change has to be accounted for so that no unscrupulous activity can go undetected. In these organizations, with large teams of developers, there is a lot of code movement. It is the DBA who is responsible for making sure that a deployment does not have an ill effect that could bring down a critical application, and for ensuring that there is always a quick and reliable way to get the environment back to the way it was prior to the change. Sometimes this is as easy as putting back a stored procedure; other times it means restoring a database. With both teams working in tandem to build a standard, the responsibilities are equally shared. If no one is responsible, then everyone is responsible (I think I read that last line somewhere).

Cheers,

Rodney Landrum (Guest Editor)
