Change Management in database development?
Posted Friday, June 29, 2007 7:56 AM
Forum Newbie

But your solution only supports VSS, so that invalidates your arguments about using the SCM to manage the changes.
Post #377815
Posted Friday, June 29, 2007 7:59 AM
SSCertifiable

Not all of us - or even most of us - work in cross-continent scenarios, but I'm here to learn. The discussion seems valuable. Malcolm & others, you should approach Steve about writing some articles on this to take us from the chaos where we are to where we would like to end up!

Andy
SQLAndy - My Blog!
Connect with me on LinkedIn
Follow me on Twitter
Post #377816
Posted Friday, June 29, 2007 8:05 AM
Forum Newbie

But Malcomn was making an argument that the approach I was using was prone to error, and that his product solved this by using the SCM to manage the locking of the resource. If his product doesn't use the SCM that I'm using (Vault, TFS, Subversion), then his resource-locking mechanism won't work in my scenario, and thus his arguments are invalidated for my environment.
Post #377818
Posted Friday, June 29, 2007 8:14 AM
Forum Newbie

If I could offer one suggestion to any vendor developing an approach to database change management: if you can't support different SCMs, then try to leverage MSSCCI, so that at least those of us not using VSS have a chance to use/try your product through the same API that Visual Studio uses to communicate with the SCM.
Post #377823
Posted Friday, June 29, 2007 8:17 AM
SSCertifiable

I was just looking on the dbghost site and it says you can manually check the scripts into other SCMs (which I grant seems like more work), so at least it's not 100% limited to VSS. And I'm not saying he's right - I don't think I understand the problem well enough to vote for a solution. Which is why some bite-sized articles would be nice for the site.

Andy
SQLAndy - My Blog!
Connect with me on LinkedIn
Follow me on Twitter
Post #377826
Posted Friday, June 29, 2007 8:23 AM
Valued Member


Hi Andy - I mentioned cross-continent working as an example to illustrate my point about a source control system being necessary, as it doesn't care about time or distance boundaries.

Terry - I'm not specifically talking about DB Ghost here - I'm making a point about source control and how it should be used. 

However, in answer to your question: DB Ghost and its way of doing things works with ANY source control system.

This is evidenced by our major customers, who are using it with VSS, CVS, Subversion, Vault, PVCS Dimensions, Telelogic Synergy, MKS SourceIntegrity, Perforce, Microsoft TFS and Seapine Surround.  All DB Ghost actually requires is a set of CREATE scripts that describe the entire schema, on disk somewhere, to start the process off, so which SCM those scripts came from is basically irrelevant.  As the developers make their changes only to the object creation scripts, no diff scripts are necessary and there is, therefore, one single source for the schema.  It is also easy to look at the schema history in the SCM, as the list will show the names of the actual objects that were modified rather than a sequentially numbered set of diff/change scripts.
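To make that concrete: the "single source" in this style is just one CREATE script per object, checked into whichever SCM you like. A minimal sketch (the file and table names here are hypothetical, not taken from DB Ghost or any customer schema):

```sql
-- Scripts/Tables/dbo.Customer.sql : the one and only definition of this table.
-- Developers edit this file and check it in; the compare tool works out what
-- has to change on the target database.
CREATE TABLE dbo.Customer
(
    CustomerId int          NOT NULL PRIMARY KEY,
    Name       varchar(100) NOT NULL,
    Region     varchar(32)  NULL
)
GO
```

Because the SCM history is per object file, "who changed dbo.Customer and when" is a straight history query rather than a hunt through numbered diff scripts.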



Malcolm
DB Ghost - Build, compare and synchronize from source control = Database Change Management for SQL Server
www.dbghost.com
Post #377829
Posted Friday, June 29, 2007 8:35 AM
Valued Member


Terry,

I think you're doing your company a favor by having a solid process in place. Adding additional software into your process often requires your process to change without significant benefits in control or quality. Keep it simple, keep it flexible.

We use a process similar to yours, with some minor differences. The differences probably stem from our different organizational structure and methodology, but the processes are fundamentally the same.

Database developer sandbox.
Shared Application developer playground.
QA/Testing environment.
Production/Customer delivery.

In the database developer sandbox, the developer can go hog-wild here. When they have tested their solution, they push it into the next mode (or zone).
For tables and data, the scripts that they write are simply the diff scripts to get a table from one build to the next. We do not modify the CREATE TABLE script in order to generate a change script. It's not that hard to write a change script that modifies one or more tables and manages any data migration that is necessary as a result of the schema change, so I don't see the need for software to produce diff scripts.

We have also set up a naming convention for SQL files that allows any developer to know whether someone else has changed the same object among the "myriad" of changes that build up over many months or years.

Once a table has been modified using a change script, you can easily generate the CREATE TABLE script in your sleep - from SSMS, SMO, or other free utilities. We only use the CREATE TABLE script for record-keeping purposes. All new databases are created from a "clean" database backup, maintained by the DBA and created by the change scripts (after they leave the QA/Test zone).

All of this is done using batch files, osql/sqlcmd, and SQL script files - and all of the .bat and .sql files are under the same version control (we use MS VSS, but it doesn't matter which one you use).
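A hand-written change script in this style might look like the following sketch (the file name, table, column, and build numbers are all hypothetical, not from Marc's actual process):

```sql
-- 0142_dbo.Customer_AddRegion.sql : moves dbo.Customer from build 14 to
-- build 15. The object name in the file name is the naming convention that
-- lets other developers spot a clash. Guarded so a re-run is harmless.
IF NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_SCHEMA = 'dbo'
                 AND TABLE_NAME   = 'Customer'
                 AND COLUMN_NAME  = 'Region')
BEGIN
    ALTER TABLE dbo.Customer ADD Region varchar(32) NULL
END
GO

-- The data migration the schema change requires: backfill the new column,
-- then tighten it to NOT NULL.
UPDATE dbo.Customer SET Region = 'Unknown' WHERE Region IS NULL
GO
ALTER TABLE dbo.Customer ALTER COLUMN Region varchar(32) NOT NULL
GO
```

The guard clause is what makes the script safe to run from an automated batch against an environment that may already have some of the changes applied.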
For textual objects (stored procedures, views, functions) we don't bother with ALTER statements. We simply have drop-and-create statements in the SQL file that defines the object. The db developer modifies the syntax in the file and checks it back into version control, where it is ready for the next step. The only special cases are dynamically created objects, and functions used in CHECK or DEFAULT constraints.
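A drop-and-create file in this style might look like this sketch (the procedure name and role are illustrative):

```sql
-- Procs/dbo.usp_GetCustomer.sql : the whole file lives in version control;
-- editing it and re-running it is the only way the object changes.
IF OBJECT_ID('dbo.usp_GetCustomer', 'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_GetCustomer
GO

CREATE PROCEDURE dbo.usp_GetCustomer
    @CustomerId int
AS
    SELECT CustomerId, Name
    FROM dbo.Customer
    WHERE CustomerId = @CustomerId
GO

-- DROP discards permissions, so re-grant them at the bottom of the file.
GRANT EXECUTE ON dbo.usp_GetCustomer TO AppRole
GO
```

The re-grant at the end is the one wrinkle of the drop-and-create pattern: anything attached to the old object (permissions, extended properties) has to be restated in the script.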

We automate (with batch files) pushing builds into the QA/testing zone. The automation labels/marks all DB, application, and unit-testing code in source control. The labeled version is then pushed into QA/Testing. Part of the automation runs scripts to check for conformance to standards and for the inclusion of certain files or scripts. Because of the automation and version control, we have better consistency and quality. Automation also means we spend less than 2 minutes running and documenting a build! Additionally, our final push into Production/Customer delivery is the equivalent of the last QA-approved build, so there's almost no additional work involved there.

Good luck, Terry.

~Marc

Post #377838
Posted Friday, June 29, 2007 8:36 AM
Forum Newbie

I hope I'm not hijacking this thread but I feel the discussion is a good one.

Malcolm (sorry I misspelled it before) - the solution I talked about does use a "schema.sql" file as the master copy of the database schema (the same approach your product seems to take), but if you are not storing the diffs, how do I, as a remote dev, get your changes applied to my sandbox database without dropping the whole thing?

The approach I talked about uses both: the full schema file to represent the 'official' version, and diff files that can be applied by team members to update their database environments.

You bring up a good point, and one that I didn't consider: using the SCM to lock 'schema.sql', since this resource serves as the official version. If the patch/diff generation process takes an exclusive lock on the schema.sql file to ensure that only one diff is being generated at a time, then this is much better than the 'hey, I'm about to generate diff 15' approach that I mentioned in my first response.

Now, in any sandbox approach I highly recommend that devs don't get married to their copy of the database: they should periodically wipe it and recreate it from the official scripts. For any data they are using, and any tables they are working on that are not yet part of the official schema, they should have their own scripts to create the objects/data until that feature is ready to be merged into the official database schema.

I see your product does allow pre/post scripts to run, so I can see how you could batch this up with ant/nant/msbuild to use any SCM to lock the schema.sql file, even if you are not using VSS.
Post #377839
Posted Friday, June 29, 2007 8:49 AM
Valued Member


Me again...

Just wanted to be clear that I'm not against Red Gate's products. Rather, I like them.

But I've seen development groups get too excited about buying new, cool-looking software that may not be the right fit for them, or that ends up not being utilized to its fullest potential.

It's all about trying it on to see if it fits.

Sometimes, I wish that the money that was spent on certain purchases had gone into my salary instead.

~Marc

Post #377852
Posted Friday, June 29, 2007 9:36 AM
Valued Member


Hi Terry,

Don't get me wrong - your process isn't bad (sorry if I came across badly back there) - it's just not as optimal as it could be and, in fact, we aren't very far apart in most of our thinking.  Perhaps a little history of where we came from might help...

<shimmering effect> We started out writing DB Ghost because we had virtually the exact same process that you use and, although we were just five guys sitting in the same cubicle, we still managed to create problems when moving to the QA environment, such as re-using primary key values for static data, overwriting each other's logic changes, and writing new code against columns that had been removed by someone else.

We said, "We didn't have these problems with release 1.0 - why do we get them now?"  The answer, of course, is that in R1 we just worked on the create scripts and then threw away & rebuilt the database whenever we did a release.  So, we mused, all we needed was a black box that could just look at our create scripts, automatically work out what was different, and then make those changes to the target database without losing any data. We had some full-and-frank discussions over whether it was possible - some toys were thrown, harsh words were spoken and some were even sent to bed early.  However, we decided to write the "black box", and that's where DB Ghost came from (actually there was nearly a punch-up over the name as well, so maybe some counselling is in order). </shimmering effect>

The custom scripts feature (before and after sync) is there so that any data migrations etc. (i.e. things that require semantic knowledge) can be slotted in.

Generally speaking, the developers all work in their "wild west" sandbox databases and, when they're ready, they check in their changes, back up the sandbox database and then use DB Ghost to update the sandbox database from the latest set of CREATE scripts from source control.  Better yet, if the scripts are extracted from a baselined/labelled set, then you have a known point in source control from which to answer questions like "what has been checked in since I last synched my database?"  As a purist, I would suggest simply rebuilding the sandbox database from the scripts and then re-populating the data, to avoid those annoying issues you get with corrupted data giving phantom failures during testing.

Another subtle benefit is that DB Ghost not only adds new objects and updates existing ones, but also removes anything that isn't in the source.  Thus it tidies up your local database, whereupon you can do a last run through unit testing to make absolutely sure nothing was missed from the check-in...



Malcolm
DB Ghost - Build, compare and synchronize from source control = Database Change Management for SQL Server
www.dbghost.com
Post #377875