Technical Article

Version Control - Part 1 - Dealing with Code


This article is part 1 of my series on version control.

Introduction

Writing code. Building applications. Rapid Application Development. Integrated development environments. There have been many advances in the field of software engineering, especially with regard to developer tools, in the last fifteen years. When I started writing applications in college, I used emacs on a UNIX platform. When I moved on to the PC platform, specifically MS-DOS, I relied heavily on Norton Editor, still one of my favorite tools today.

SQL Server, however, hasn't progressed as far in terms of a good development environment. The one thing that saved me quite often when writing VB and C programs was the integration with a version control system. There were quite a few times when the ability to roll back a wasted day's effort of implementing some ridiculously complicated algorithm was welcome. There were also a few times when having a backup copy in Visual SourceSafe, PVCS, or VCS probably saved my job.

While I have welcomed the advances in Query Analyzer (a long way from isql), there are a few features I am surprised are not included, notably integration with a version control system. Since most people reading this article use Microsoft products, I will use Visual SourceSafe, or VSS, as my sample version control system, but the techniques presented here will work with almost any system.

How Does SQL Server Integrate with VSS?

The short answer is, it doesn't. Perhaps this is because the code and the compiled application live together on the server, or because so many people are used to editing objects in a window and compiling them on the server right away, or maybe some of you have a reason I am missing. Whatever the reasons, the fact remains that the client tools that come with SQL Server do not integrate with VSS. In my mind, this is a major shortcoming for the SQL developer.

I won't deny that developing objects in SQL Server is different from developing them for IIS using Interdev, or even from creating packages to run in MTS. I do feel, however, that all actions in Enterprise Manager and Query Analyzer should be integrated with VSS. Behind every action you perform using the GUI, there is a script that runs containing T-SQL code, not some magical Win32 API call. But enough of the soapbox.
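To illustrate the point, a GUI action such as renaming a column in Enterprise Manager ultimately executes ordinary T-SQL that could just as easily live in version control. A minimal sketch (the table and column names here are hypothetical):

```sql
-- Renaming a column through the table designer boils down to
-- a call to the system procedure sp_rename:
EXEC sp_rename 'dbo.Orders.CustName', 'CustomerName', 'COLUMN'

-- Adding a column through the GUI is just as scriptable:
ALTER TABLE dbo.Orders ADD OrderedOn datetime NULL
```

Anything the GUI can do, a script checked into VSS can do as well, which is the whole argument for keeping these scripts under version control.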

My Environment

Before I delve further, let me explain the type of environment I have used in each of my last few jobs. Basically, I have (at least) three systems set up for each application. These are usually, though not necessarily, three separate servers. These systems are defined as follows:

  • Production - The "live" system that is actually being used by users. This is the data of record, the system that runs the database, accepts orders, etc. and runs the business.
  • QA - A copy of the production system that is used for testing. This is usually the system that is most often updated (overwritten). Developers should not have access to this system; instead, access should be given exclusively to the testing or Quality Assurance group. Integration and system testing occur here.
  • Development - The "sandbox" or "playground" for the development group. This is where items are developed, tested, and verified by developers before being submitted to formal QA. This would be where unit testing and developer integration testing occurs.
  • Beta (optional) - A copy of the production system where clients can perform testing. This is exactly like the QA environment. This is a playground for clients.
  • Demo (optional) - The last couple companies have needed a "demo" system for the salespeople to use. This functions like the Beta system, except that it is rarely overwritten (from a data perspective).

Changes in the SQL environment are new and altered objects, data seeds, imports of data, etc. All of these changes are first made in the development environment. They are then applied to the QA environment, where the intended effects can be verified by our QA group before being migrated to the other environments.
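As a sketch of what one of these "data seed" changes might look like, I try to write them so they can be rerun safely in each environment as the change moves from development to QA to production. The table and values below are hypothetical:

```sql
-- Rerunnable seed script: safe to apply in dev, QA, and production,
-- because it only inserts the row if it is not already present.
IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'SHIP')
BEGIN
    INSERT INTO dbo.OrderStatus (StatusCode, Description)
    VALUES ('SHIP', 'Order shipped')
END
```

Writing seeds this way means the same script, checked into VSS, can be applied to every system without worrying about which environment has already received it.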

It is my job, along with the QA person, to manage the changes as they move from one system to another. In actual practice, I "build" a package that migrates changes from the development environment to the QA environment. The QA person is then responsible for running my "package" against the other systems to apply the changes. I will discuss packages in Part 3 of this series.

As a SQL Developer, Can You Integrate with VSS?

Yes and no. I used a third-party tool for developing some T-SQL that claimed VSS integration; I could not get it to work correctly, and neither could their tech support. I also investigated the integration in Visual Interdev v6.0, but I HIGHLY recommend against using it. Perhaps when the next version of SQL Server comes out and T-SQL becomes a .NET language, this will change.

After inheriting a mess of a system a few years ago, I decided to develop my own techniques, reaching back to the habits I had acquired in college on text-based UNIX systems. The techniques I use are fairly simple, and I adhere to them rigidly. I don't advocate that everyone duplicate my methods, but you should have some method and stick with it, religiously!

My environment is always the same, whether I am using my desktop, my laptop, or the server console. In fact, setting it up is the first thing I do when starting a new job. Well, after ensuring there is a version control server of some sort installed. This is a twelve-step process that has evolved over the last four years into a very stable system.

The first step for me is always to create a \SQL folder on my C: (or other local) drive. This folder is where I store the temporary working versions of all the code I am working with. I place it at the root so I can easily get to it from a command prompt if I ever feel the need to use Norton Editor. One thing I hate is opening a command prompt and navigating through eight levels of folders to find a file (ever looked for the template files?).

The next step is to build out a structure in VSS to hold all my objects. There are various methods of doing this, but the easiest (I have found) is the following:

  • Tables
  • Stored Procedures
  • Views
  • Triggers
  • Scripts

    • Test
    • FK
    • Table Alters
    • Indexes
  • Diagrams
  • DTS
  • DBA Tools

There are a few more items that I may include, but this covers most of the items that I feel I need to get started in every environment. At this point, most experienced people are getting bored because they have implemented something similar to this.

It is at this point that I probably depart from most people's environments. As I set up these projects in VSS, I set the working directory for all of them to the same folder, my local \SQL folder. Interesting, huh? I also set VSS to remove all local versions of files whenever it checks a new version in.

Why would I do this? After all, one of the features of VSS is to duplicate the project structure in the folder system on the local disk. To me, this becomes confusing, and I cannot quickly find what I am working on. Usually when I am working, I am using a few scripts and want to find them all easily without wasting time navigating folders. VSS provides the structure for me; my local disk is a temp area, so I treat it as such.

A few years ago, I inherited a system where all stored procedures were encrypted, and versions of these stored procedures existed in VSS, on the old DBA's local machine, and in his network home directory. To top it off, the datestamps did not match, and testing some of the latest versions (by timestamp) against the server gave different results. After straightening this out, I decided that keeping local copies once an item was checked into VSS, especially without auto-checkout from the tools, was an extremely bad idea. Instead, I ensure that if I have something checked out, no one else can use it. If they need it, I have a phone on my desk they can use to request the file.

My daily process works like one of the following:

  • New objects - I usually create these from a template, give the object a name, and then check it into VSS as soon as I am confident it works (passes unit testing).
  • New scripts - I usually create these from scratch, give them a name that corresponds to what I am doing in the script, and then check them into VSS.
  • Changed objects (scripting) - I check these out from VSS and immediately open them in Query Analyzer. I save them as I work, and when I am finished, I check them back into VSS. If I go down the wrong path, I can undo the checkout in VSS and check out the last good version again. If I reach the end of the day, I can always check a file in (to save the version) and then check it back out right away so I can continue working on it the next day.
  • Table alters - I usually make these in Enterprise Manager to be sure that I cover all dependent objects. I always save these scripts, open them in Query Analyzer, add comments, and immediately add them to the VSS Table Alters project. I then check out the original table script and make the updates there as well.
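For the "new objects from a template" step, a minimal stored procedure template might look like the following. The object name, parameters, and header fields are my own assumptions, not a prescribed format; the point is a consistent header that travels with the object into VSS:

```sql
-- Object:   dbo.usp_GetOrders          (hypothetical name)
-- Author:   <your name>
-- Created:  <date>
-- History:  <date> - initial version, checked into VSS
--
-- Drop-and-recreate pattern typical of SQL Server 2000 era scripts:
IF EXISTS (SELECT 1 FROM sysobjects
           WHERE name = 'usp_GetOrders' AND type = 'P')
    DROP PROCEDURE dbo.usp_GetOrders
GO

CREATE PROCEDURE dbo.usp_GetOrders
    @CustomerID int
AS
SELECT OrderID, OrderedOn
  FROM dbo.Orders
 WHERE CustomerID = @CustomerID
GO
```

Because the script drops and recreates the object, the same file can be run repeatedly as it is checked out, revised, and checked back in.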

Conclusions

This is my simple, yet effective, VSS integration. I suppose "integration" is not really the right word, since the integration comes from my habits rather than the tools. However, this process has ensured that I have not lost any code or had versioning problems in the last year. By adhering to it, I have maintained a stable environment. As a matter of fact, the only issues I have had have resulted from other developers making modifications directly in Enterprise Manager or Query Analyzer.

As always, I hope you learned something and I welcome feedback on this article using the "Your Opinion" button below. Please take a moment to also rate this article.

Steve Jones

©dkRanch.net November 2001
