SQLServerCentral Article

DB Change Management - An Automated Approach - Part 4


Database Change Management Best Practice: Achieving an Automated Approach and Version Control - Part 4

In part 3 Darren Fuller listed the requirements to implement an automated approach to database change management and highlighted some of the benefits.  In this fourth and final article in the series, he continues the list of benefits that can be realised by adopting an automated methodology and discusses implementation considerations.

Benefits of an Automated Approach (continued)

4. Controlled Rollback

A critical attribute of an automated process is the ability to back out an upgrade in the event of failure.  Rolling back deployments manually can become extremely complex: in effect, you have to craft a change script for the failed change script.  Things can get nasty if you need to reinstate data that was deleted by the upgrade, whether validly or invalidly.  If a critical error occurs in your deployment, automation ensures a straightforward and controlled rollback.
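For illustration only (this is a generic sketch, not the tooling described in this series, and the table and column names are invented), a change script for SQL Server 2005 or later can be wrapped in a transaction with TRY/CATCH so that any failure leaves the database in its pre-upgrade state:

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    -- Example schema change (hypothetical table)
    ALTER TABLE dbo.Customer ADD LoyaltyPoints INT NOT NULL DEFAULT (0);

    -- Example data migration; dynamic SQL is used so the batch compiles
    -- even though the new column does not exist at batch compile time
    EXEC ('UPDATE dbo.Customer SET LoyaltyPoints = 100
           WHERE AccountType = ''Premium''');

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Any error rolls the entire upgrade back in one controlled step
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    DECLARE @msg NVARCHAR(2048);
    SELECT @msg = ERROR_MESSAGE();
    RAISERROR (@msg, 16, 1);  -- re-raise so the deployment tool can report the failure
END CATCH;
```

The point is that the back-out path is defined before the change runs, rather than improvised after a failure; an automated tool generates this wrapper (and any compensating scripts for changes that cannot run inside a transaction) as part of every deployment.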

5. Benefits for ISVs

If you are an ISV (Independent Software Vendor), then you likely distribute a product to your customers that relies on an underlying database. Releasing a new version of your software may also require an update to this database.  How do you confidently distribute these changes?  How do you know with certainty what version of the database each customer has, and in what condition?  What if the customer has made unknown changes or customisations?  Creating an install package for your latest release can be daunting.

Another approach would be to include with your release package the required version of the base scripts, together with an application that interrogates the customer’s current database and determines how it differs from the new version.  Changes to databases can then be deployed automatically, “on the fly”, without having to manually track the version and state of each customer’s database.
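As a minimal sketch of this idea (the SchemaVersion table and its columns are hypothetical, not part of any specific product), the deployed database can carry its own version record, which the release package interrogates before deciding which change scripts to apply:

```sql
-- Hypothetical version-tracking table maintained by the deployment process
CREATE TABLE dbo.SchemaVersion (
    VersionNumber VARCHAR(20)  NOT NULL,
    AppliedOn     DATETIME     NOT NULL DEFAULT (GETDATE()),
    Description   VARCHAR(200) NULL
);

-- Each successful upgrade records itself on completion
INSERT INTO dbo.SchemaVersion (VersionNumber, Description)
VALUES ('2.4.0', 'Added loyalty points to Customer');

-- The installer reads the most recent version to determine which
-- of the bundled change scripts are still pending for this customer
SELECT TOP 1 VersionNumber
FROM dbo.SchemaVersion
ORDER BY AppliedOn DESC;
```

Combined with a comparison of the live schema against the shipped base scripts, this lets the installer detect customer customisations rather than blindly overwriting them.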

6. Parallel Code Development

With the pressure of tight deadlines on development projects, one strategy is to “split” the code base and assign teams to work in parallel on separate pieces of functionality.  If you have two or more code bases that developers are amending, re-integrating them can be tricky at the best of times. With an automated approach and integration with a version control system (VCS), you can label your code hourly (or daily) and test against that baseline.  With baselines in place, it is easier to analyse the code and perform a merge, knowing beforehand that all code on either side is syntactically correct.

7. Improved Productivity

The individuals responsible for the management of database change control, whether they are DBAs, developers or change controllers, have a crucial role as gatekeepers and custodians of a valuable asset. Throughout a database development project, the DBA must liaise between development and operations to ensure the smooth deployment of upgrades.  This is a delicate balance between deploying functionality swiftly (to keep a project on schedule and minimise hold-ups) and guaranteeing the integrity of the database (by verifying all proposed changes).

Introducing efficiencies to the process of database change empowers these people with greater flexibility and improved productivity. The DBA can then propagate a change far more quickly, and to a higher standard of quality, than with manual methods.  Consequently, they have more time to focus on proactive planning, integrity and performance issues.  In fact, the effect of an improved change control approach can be felt throughout the organisation and even extend to the customer.

8. Accurate Project Estimation & Planning

It is unfortunately far too common for IT projects to run severely over budget with continual postponement of completion deadlines.  Attempts to hit deadlines by allocating additional resources often exacerbate the problem of escalating project costs.  One of the main reasons for this is the myriad of complex technical issues and unforeseen critical errors that can occur.  A degree of contingency can be allocated to combat these uncertainties, but reducing development errors in the first place is a topic that is high on the project manager’s agenda.

So, another key strength of an automated approach is that it eradicates many unforeseen technical issues that manifest throughout the build and deploy phases of the application development lifecycle.  The actual task of estimating project length and appropriate resource allocation becomes a lot easier and much more accurate as many of the “unknowns” are removed.  This is because the tasks of generating, building, verifying and deploying changes are automated, thereby minimising errors and avoiding unexpected consequences of changing database objects.

Implementation Considerations

For the vast majority of organisations, the implementation of an automated database change management approach would require very little effort.  This is because most of the prerequisite components for an automated approach are already available within the organisation itself; it is merely a case of using what is already there in a different manner. In addition, such an approach should provide the following attributes, resulting in a smooth implementation:

  • low overhead to install and configure
  • quick to learn and productive within a few minutes
  • requires little or no staff training
  • no additional hardware requirements

However, depending on the organisation’s culture towards change acceptance, the following steps will need to occur in varying degrees of depth (dependent upon the requirements of the organisation).  These will help build internal recognition that an investment of time and money should be made in adopting a more efficient and effective process:

  • undertake a detailed technical evaluation of the enabling software via a pilot study or trial run, in order to equip the evaluator with sufficient evidence to eradicate misgivings about the new methodology
  • demonstrate that the current processes are outdated and inefficient in comparison
  • produce evidence that a speedy and notable return on the initial investment will be achieved
  • establish an effective communication strategy targeted towards those that are likely to benefit from and utilise the “new way of doing things” so as to ease implementation and gain acceptance

As in any business, justifying the implementation of change requires an evaluation that weighs the return on investment against the cost of continuing to practise an obsolete methodology. Only then can the “opportunity cost” be calculated, which can, in itself, be the deciding factor in moving forward with a new process.  If the decision is to “leave it for now”, that decision incurs an opportunity cost.  What is this cost?  Is it more than a monetary value?

By undertaking these exercises, evidence can be communicated and confidence generated amongst the users (and investors), paving the way for a seamless implementation of a new approach to database change management.


The complexity of database systems is on the rise, and IT departments face increasing pressure to deliver functionality and timely, accurate information to ensure an organisation’s competitive advantage.  Additional responsibilities arise from legislation such as the Sarbanes-Oxley Act. Companies must assess their current approach to database development and be receptive to a better way of doing things.  Failure to embrace improved methodologies can only waste valuable technical resources and prove extremely costly. Your IT department cannot continue to endure database chaos, as it directly affects a corporation’s bottom line. The ramifications of an inefficient methodology extend to lost business opportunities, higher operating costs, second-rate product/service quality, poor public relations, high staff turnover and legal liabilities.

This series of articles has outlined a new approach to database change management, one that is increasingly viewed as a best practice, and there is no reason why it should not become a de facto industry standard.  It is an approach that organisations are strongly encouraged to adopt, as it provides an audit trail and is both reversible and repeatable.  This is achieved through integration with a version control system, hourly compilation (and hence verification) of database objects, automated generation and propagation of changes, and automated deployment of upgrades.  Such an approach can reduce database development project phases from weeks to days, free up expensive technical resources from mundane and time-consuming tasks (so they can concentrate on deploying their knowledge and skills more effectively and efficiently) and virtually guarantee the integrity of your database code.

About the Author

Darren Fuller began his IT career in 1990.  Much of his experience has been gained by providing database consultancy on mission-critical systems and development projects within large organisations, often within heterogeneous environments using multiple technologies.  Darren has specialised in database technologies with a particular interest in troubleshooting, optimisation and tuning, especially with Microsoft SQL Server (since 1994).  He has a passion for ensuring a database system is finely tuned and the processes within a development project help to achieve this aim in an efficient and cost-effective manner. 

Darren holds a Bachelor of Business in Computing and a Microsoft MCSE certification.  He welcomes any feedback and can be contacted at darren.fuller@innovartis.co.uk

© Copyright Innovartis Ltd 2004.  All rights reserved.
