Version Control - Dealing with Code

  • I see you advising against using Visual InterDev; my company is just looking for some VSS integration for SQL Server. Why do you advise against it? (Thanks, I guess you're saving me a lot of time.)

    regards henk meulekamp

    /edit=typos, damn why not read messages before hitting that submit button

    Edited by - henkm on 11/19/2001 02:30:50 AM


  • I also would like to know why you advise against using Visual Interdev. I've implemented and used it successfully in a couple of jobs.

    Colt 45 - the original point and click interface

  • It has been problematic for me. In VI 6, there were bugs with managing stored procedures. I heard these were ironed out, but haven't checked it since SP3.

    Also, it only handles stored procedures. If you have to use two different methods, the chances for error increase dramatically. IMHO, it's best to have a simple, clean solution, and VI doesn't provide the complete solution.

    Steve Jones

  • Steve, thanks for the article. I also don't use InterDev because it doesn't currently support SQL Server 2000 and doesn't handle UDFs.

    Steve, what I was not clear about is exactly how you check in and check out of VSS: is it just into a text file?

    In Query Analyzer, do you do File > Save As, save it to your C:\SQL folder, then go into SourceSafe and check it in? That seems like a long-winded process for each and every object in the DB (I have 3,500 objects in my DB and I don't want to do that). Is there a quicker way?

  • Yes I do it manually. Updating lots of objects is tedious, but there are a few tricks.

    1. Naming conventions. Keeping all objects named according to some rule makes things simpler.

    2. I always have VSS and QA running. They are both set to work with the c:\sql folder, so things are in one place.

    3. Use the search features of VSS. If I need all stored procedures that reference the "Products" table, I search for them and then check out all the items that are found.

    I have probably 1000 objects on my main server and I make changes very quickly. Week to week, I am working with only a dozen or so and I use the method in my new article to track changes. That way I know which objects I need to work on.
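    The VSS search trick in step 3 (find every stored procedure referencing a given table, then check them all out) can be approximated with a plain text search over the working folder. A sketch in Python for illustration only; the `c:\sql` working folder comes from the post, while the function name and extension list are assumptions:

```python
import os

def find_references(folder, keyword):
    """Return script files under `folder` whose text mentions `keyword`
    (case-insensitive), like a search over the VSS working folder."""
    hits = []
    for root, _dirs, files in os.walk(folder):
        for name in files:
            # Only look at script files (extensions assumed for the example).
            if not name.lower().endswith((".sql", ".prc", ".tab")):
                continue
            path = os.path.join(root, name)
            with open(path, "r", errors="ignore") as f:
                if keyword.lower() in f.read().lower():
                    hits.append(path)
    return sorted(hits)

# e.g. find_references(r"c:\sql", "Products") lists every script
# touching the Products table, ready to be checked out as a batch.
```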

    Steve Jones

  • Steve,

    I liked your article; I have been wondering about SQL version control and will probably adopt some or all of your methods.

    You mention that you hate navigating through lots of levels of folders from the command prompt. This is something I have found annoying in the past. One quick way round it is to put a shortcut in the 'SendTo' folder that points to cmd.exe. Once you have done this, you can right-click on any folder in Explorer and select Send To > Command Prompt; this will open a command prompt window at that folder.

    In W2K the shortcut should be:

    Target: %windir%\System32\cmd.exe /k cd

    Start in: %windir%

    I also use this technique to register and unregister DLLs.



  • Oddly enough, about two weeks ago I came up with what amounts to the same technique, including the multiple VSS subprojects and a single working folder.

    I did note an odd issue: if I check a file out of VSS (e.g. dbo.tablex.TAB), edit it, run it in Query Analyzer, and then check the file back in, the version that SQL Server holds differs from the VSS version in that the SQL Server version somehow adds extra line breaks...

    Anyway, like you, I have several developers who don't always use VSS but go straight to Enterprise Manager. So every few days I generate the SQL scripts for the entire DB to my working folder and run a recursive differences report on the entire VSS project. This allows me to catch and persecute the offenders very quickly.

    I am actually thinking of writing a job to automate the generation of the SQL scripts, use the VSS object model to create the differences report, and then email me the info. If it is cool, I will post the code.
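    The "generate everything, then diff against source control" job described above could be sketched roughly as follows. This is an illustration in Python, not the poster's actual code: it assumes the scripts have already been generated to one folder and the VSS working folder is current on disk, and it leaves out the script generation and emailing steps. The function name `drift_report` is made up for the example.

```python
import filecmp
import os

def drift_report(generated_dir, vss_dir):
    """Recursively compare freshly generated DB scripts against the
    source-control working folder.  Returns the files whose contents
    differ, plus files present on only one side -- i.e. the objects
    someone changed straight in Enterprise Manager."""
    report = {"changed": [], "only_in_db": [], "only_in_vss": []}

    def walk(cmp, prefix=""):
        report["changed"] += [prefix + f for f in cmp.diff_files]
        report["only_in_db"] += [prefix + f for f in cmp.left_only]
        report["only_in_vss"] += [prefix + f for f in cmp.right_only]
        for name, sub in cmp.subdirs.items():
            walk(sub, prefix + name + os.sep)

    walk(filecmp.dircmp(generated_dir, vss_dir))
    return report
```

    Anything in `changed` or `only_in_db` is a candidate offender to chase down.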


  • Actually Andy, I found the skeleton for such a project on MSDN. When I get back to work, I'll post the link.

    btw, how do you get to persecute the offenders? 🙂

    Porkstone, I actually have that setup, I just hate looking through 6 levels of folders in Explorer.

    Thanks for the comments.

    Steve Jones

  • Steve, thanks for the article. I have been looking for a way to track changes, and this may just work for us. However, I would like a little more info on EXACTLY what you put in each project folder and how you get that information.

    For example, under the Tables project folder, is that a script for the table itself without any other information like indexes, FKs, etc.?

    Another example: if you have a Scripts project with Table Alters as a subproject, what goes there that wouldn't go into the Tables project?

    Anyway, it would be great if you could list each project folder with a corresponding example of what you put in there and how you go about creating what you are putting in there.

    I think it would be very helpful for those of us who are newer at this.

    Thanks for your time and your article.

  • The long and short of it is that I place everything in a separate script, and it gets shared (to an "InFlight" folder) or branched (to a specific release).

    Each script is short and simple and does one thing, so a create or alter table is in its own file. A stored proc is in its own file. A FK is in its own file.

    At the end of each file, there is a select statement that verifies object creation (or not) and tells me, when I read the error file, where I am in script execution.

    The exceptions:

    - Security is placed in each file, so I set the object security inside the table, view, procedure file.

    - Defaults are in the table script.

    - Data updates are in their own files and may be shared; it depends on the update. If they are related, I may include two or three in the same file. I separate each with a GO and include "select 'something happened'" before each GO so I can tell where an error occurred.
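    The GO/marker layout described above works because the client tools submit a script batch by batch at each GO, so the last marker SELECT that appears in the output tells you which batch died. A minimal sketch of that batch splitting, in Python for illustration (the script text is invented for the example):

```python
def split_batches(script):
    """Split a T-SQL script on lines containing only GO, the way
    osql/Query Analyzer submit it batch by batch.  With a marker
    SELECT before each GO, the last marker seen in the output file
    localizes a failure to one batch."""
    batches, current = [], []
    for line in script.splitlines():
        if line.strip().upper() == "GO":  # GO is a batch separator, not T-SQL
            if current:
                batches.append("\n".join(current))
            current = []
        else:
            current.append(line)
    if current:
        batches.append("\n".join(current))
    return batches
```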

    An example:

    Last week, I did the following:

    - Altered 2 sprocs. As soon as the changes were checked into VSS, one was shared to "InFlight" and one was branched into the release project for the week. I knew one change would be released, but wasn't sure about the other.

    - I created a new table. The script was branched to the release project. This included a couple of defaults bound to the columns and the GRANT statement for the table in the script.

    - I altered a table. The alter script was shared to the InFlight project, and I manually checked out and updated the original table create script.

    - I had a script to add data to the altered table (above). Shared to InFlight.

    At this point, a developer completed work on the altered table. I then branched both scripts to the release project and deleted them from the InFlight folder.

    I created my 0Upload.cmd file for the release project and checked out all scripts. This was applied to the QA server and I verified that all items were branched.

    QA found a bug in the procedure. I deleted it from the release project, checked it out from the sprocs project, and fixed the issue. It was checked in and rebranched to the release folder, and I then manually ran it on QA.

    Once QA verifies everything, the release project is applied to the live system in the same way as it was applied to the QA system.

    If I hot-fix something, I usually follow the same procedure and make a releasedate-HOTFIX project. This is really more for tracking than anything.

    There are lots of manual steps here, but it has worked flawlessly for over two years. If a release runs on QA without errors (and this may take testing), it has ALWAYS worked on production. The important thing I learned over time was the order in which to apply things.
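    A release runner along the lines of the 0Upload.cmd file mentioned above could be generated rather than hand-written. A sketch in Python, not the author's actual file: it assumes one osql call per script, scripts applied in file-name order, and output appended to a single error file so the marker SELECTs show how far execution got. The flags used are the standard osql ones (-S server, -d database, -E trusted connection, -i input file); the function name and extension list are invented for the example.

```python
import os

def build_upload_cmd(release_dir, server, database, errfile="errors.txt"):
    """Build the text of a 0Upload.cmd-style batch file: one osql
    call per script in the release folder, in name order, with all
    output appended to one error file."""
    lines = []
    for name in sorted(os.listdir(release_dir)):
        if name.lower().endswith((".sql", ".prc", ".tab")):
            lines.append(
                "osql -S %s -d %s -E -i %s >> %s"
                % (server, database, name, errfile)
            )
    return "\r\n".join(lines)
```

    Name-order application is what makes the ordering lesson above enforceable: prefix scripts so tables sort before FKs, FKs before procs, and so on.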

    Steve Jones

  • I might not have been clear enough in my original post... I am at the point before even getting to InFlight or branching. I am still trying to figure out exactly what goes into each project folder and how you created whatever you place in each project folder.

    Let me give a basic list below of what you have in the article, what I think goes in each folder, and how I would get it. If you could comment on or correct anything I list below, I would appreciate it:

    DBA Tools = I don't know this one... What goes here? More detail please.

    Diagrams = I know what a diagram is, but how do I get a script for it? I have a couple that I have messed with but I don't see a way to create a script for it.

    DTS = I know what a DTS package is. All of my DTS's are stored as local packages. How do I get a script to store it in text format?

    Stored Procedures = Simply contains all stored procedures. Just generate the script. (Include Security)

    Tables = Just generate the script. I assume, since you track triggers, FKs, and indexes in separate folders, that I don't include them in the base table script... is that correct? The only addition would be security.

    Triggers = Just generate a script for a table, but only check triggers.

    Views = Just generate a script for a view including security.

    Scripts = Folder to hold various scripts. Why break this out into a folder called scripts if the other folders one level above the scripts folder actually contain scripts as well?

    FK = Foreign Keys. Just generate a script for the table where I only include the FK.

    Indexes = Just generate a script for a table where I only include the indexes.

    Test = Not sure what goes here? Just test stuff.

    Table Alters = I know I want to place table alters here, but where do I get the SQL script for this? Is it simply clicking the "Save Change Script" button before saving the table? Or is there another preferred method of doing this?

    And if so, do I ever go back and regenerate the table script to put in the Tables folder? Or does the Tables folder just contain the original creation of the table, with all future alterations going into the Table Alters folder?

    Sorry for such simple questions. I just want to make sure I have all the basics down first.

    Thank you again for your time and your quick response.


  • Oh... I forgot... do you include Generate Drop Object in your scripts?

  • I do include the DROP object because it has worked well for me and, other than tables, I like it better than ALTERs.

    Addressing your items:

    DBA Tools - scripts and stored procedures that I tend to carry from job to job. These are misc ones that do not fit a business purpose.

    Diagrams - I use Visio (thinking of changing) and so I store the VSD here. I also don't have a way of exporting the EM diagrams.

    DTS = I know what a DTS package is. All of my DTS's are stored as local packages. How do I get a script to store it in text format?

    Stored Procedures = As you wrote.

    Tables = yes

    Triggers = yes

    Views = yes, include drop

    Scripts = Top level (this project) holds data update scripts.

    FK = Yes, subfolder (subproject) from Scripts. No real reason, but it fits.

    Indexes = subfolder of Scripts.

    Test = misc scripts that I use to test things, named by function. Really a holding place for scripts that I may use more than once.

    Table Alters = I save the change script from EM after making the table alter. This has worked every time so far. Subproject of Scripts. I used to use an Embarcadero product that did the same thing, but it sometimes did not work right. I do usually go back and update the original table script; not because I would likely use the Table scripts for a DR, but mainly to keep them in sync in case I use a few tables in a new application.

    Steve Jones

  • You're going to hate me.

    You didn't say how to get the DTS packages into a script format. How do you get those out?

    Do you also generate scripts for all dependent objects? For tables? For views? Etc. I want to make sure I get everything I will possibly need.

    Let me give an example scenario and see if this is what you would do?

    I want to work with a stored procedure so I go to VSS and check out dbo.GetBrands.prc which contains the drop, create, and security rights for the Stored Procedure GetBrands.

    I then go to EM and start editing the stored procedure and testing it in Query Analyzer. Once I am sure it passes all my testing, I generate a SQL script of the newly saved stored procedure with the drop, create, and security. When I create the script, I create it in the SQL directory on my local drive, which overwrites the one I checked out. I then check in dbo.GetBrands.prc. Then I should be done.

    Here is a scenario of a Table.

    I already have a table and I want to add a field. I check out the table script and start working with the table in EM. Once I make my changes, I click the "Save Change Script" button and save this new alter file to the SQL folder. I could also regenerate a SQL script for the table. I would then check in the table script, and I would also add the table alter script to the project.

    If I did future alters, would I check out the table alter script previously created, copy the new alter script in place, and then check it back in?

    Does all of the above make sense?

    Is that how you go about doing it?

    You mention using Query Analyzer for Changed Objects (scripting). The only time I really use Query Analyzer is to test the results of my edited SP or View or Table. I do it in EM first and then test it in Query Analyzer.

    Can you give a little more explanation of Changed Objects (scripting)?

    Sorry if this is confusing... just trying to get a firm handle on how you are doing things.

