An Example of Test-Driven Development, Part 4

  • Comments posted to this topic are about the item An Example of Test-Driven Development, Part 4

    Andy Leonard, Chief Data Engineer, Enterprise Data & Analytics

Your articles have generated a lot of thought for me on the nature of test-driven development. It's kind of freaky that just this morning, on my walk into work, I was thinking about your articles in conjunction with my conclusions from a recent discussion on this site about documentation/comment headers. Then I came into work and, magically, your 4th article on test-driven database development appeared in my inbox. Even more "woo-woo", the article confirms my morning thoughts.

    I won't go into details on my actual thoughts. What I will say is: What strikes me about this article is that it is a lot about documentation. This gives me a lot of insight that I didn't have previously about why some people see value in test-driven development.

    I also appreciate, as in your previous articles, seeing how other people accomplish the same things that I do but in different ways. For example, you write up a script that to my way of thinking manually strings together all the individual scripts in the desired order. I use a mini-database and code to dynamically do something similar. (I like my method because I can group code easily, easily change the order, quickly find what I need to find, and manage versions easily.) Regardless of the method, what I liked seeing about your article was that you do -it- at all. (It: have a way to deploy a large number of scripts in a particular order.)
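For readers who haven't seen the "string the scripts together" approach, here is a minimal sketch of what such a master deployment script can look like in SSMS using SQLCMD mode. The file names and folder layout are invented for illustration; Andy's actual project will differ.

```sql
-- deploy_all.sql -- execute in SQLCMD mode (Query > SQLCMD Mode in SSMS).
-- Each :r directive includes one script file; the order listed here
-- is the deployment order, and :on error exit halts on the first failure.
:on error exit
:r .\Tables\0010_Create_Customer.sql
:r .\Tables\0020_Create_Order.sql
:r .\StoredProcedures\0030_Create_GetCustomer.sql
:r .\Tests\0040_Existence_Tests.sql
```

The numeric prefixes are one common convention for making the execution order explicit in the file system as well as in the master script.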

  • I have followed all four of Andy's articles in the "An Example of Test-Driven Development" series. I don't know if he plans more. Look under the tag "Test-Driven Development" and you'll see only a few articles so far, but Andy's series is a nice addition to this topic.

The title is apt--he offers "An Example" (nothing more, nothing less) that, to my mind, gives a clean look at how Andy approaches database TDD start to finish: setting up a project/solution in SSMS, organizing the code into the Queries folder, and deploying--with lots for us to consider and learn from.

This is the kind of material that I find valuable--a look at how other professionals do things, from someone who is willing to open up rather than keep their solutions "closely-guarded secrets", and who is willing to face the inevitable criticism that comes with that openness.

    Here are some questions I think are worth pursuing from the series thus far:

    1. Article 3 offers an example of automating the creation of scripts that test for the existence of database objects like tables and stored procedures. But the article is light on the discussion of why and to what extent one would want to do that--which is fine. That's what these discussion posts are for.

    I think in the world of RDBMS and database development, the test-driven development technique/decision of "test before coding" can be considered optional and subject to project-specific considerations.

    Andy's solution of automating the creation of test scripts gives the developer the option of using a "first create an object, then immediately after, create the existence test" technique with the understood risks taken.
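As a concrete illustration of what such an existence test might look like, here is a minimal T-SQL sketch; the table name `dbo.Customer` is invented, and this is one possible shape for the generated test rather than the exact script Andy's automation produces.

```sql
-- Hypothetical existence test for a table named dbo.Customer:
-- fail loudly if the object the create script should have made is missing.
IF NOT EXISTS (SELECT 1
               FROM INFORMATION_SCHEMA.TABLES
               WHERE TABLE_SCHEMA = 'dbo'
                 AND TABLE_NAME   = 'Customer')
    RAISERROR ('Existence test failed: table dbo.Customer not found.', 16, 1);
ELSE
    PRINT 'Existence test passed: dbo.Customer.';
```

A test like this is cheap to generate mechanically right after the create script, which is what makes the "create the object, then create the test" workflow practical.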

Before a TDD purist attacks this proposition, bear in mind that we're on the cusp of developers (and other less-technical stakeholders) using Oslo and MSchema, which express T-SQL code, putting creation first in a fluid development environment where frequent initial changes to objects will take place as entities and relationships are hammered out. So I think the road ahead favors the agility of the "create the existence tests immediately after" technique.

2. Existence tests have been discussed--which raises the question of what level of detail we should test to. Should we test for table columns with their data types, default values, nullability, and so forth? There's a good bit of complexity once we move beyond top-level object names. I think this is where the overhead of TDD really becomes an issue--and where tools and techniques are most needed.

I suspect that we need tools that support the option of evaluating our entire schema and generating existence/creation tests for all the details--so we can generate the tests as we go--and that such a tool would heavily, if not exclusively, use the INFORMATION_SCHEMA views, which are part of the ANSI SQL standard and which Joe Celko describes as SQL describing itself.
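To make the idea concrete, here is a rough sketch of how INFORMATION_SCHEMA can be used to generate one existence test per base table. This is an assumed, simplified shape of such a tool, not an existing product; a real generator would also cover columns, data types, nullability, and the other details mentioned above.

```sql
-- Emit one existence-test statement per base table, built from the
-- ANSI-standard INFORMATION_SCHEMA.TABLES view ("SQL describing itself").
SELECT 'IF NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.TABLES '
     + 'WHERE TABLE_SCHEMA = ''' + TABLE_SCHEMA + ''' '
     + 'AND TABLE_NAME = ''' + TABLE_NAME + ''') '
     + 'RAISERROR (''Missing table: ' + TABLE_SCHEMA + '.' + TABLE_NAME
     + ''', 16, 1);' AS test_script
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;
```

The resulting rows can be saved as a test script and re-run against any deployment target; extending the same pattern to INFORMATION_SCHEMA.COLUMNS would give the column-level checks discussed in point 2.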

As the tool above is not, to my knowledge, available as open source for SQL Server, I use RedGate's SQL Toolbelt tools to generate create scripts.

3. The discussion posts have mentioned that tests come in more than one category. After reading these brief comments, I still think most people will leave with the categories unclear--functional, performance, unit, existence, etc. So perhaps someone would like to clear this up, along with any needed points on which techniques lend themselves to the various categories.

    Bill Nicolich: www.SQLFave.com.
    Daily tweet of what's new and interesting: AppendNow

  • There's one more coming (at least).

    Nice job, Andy! It's gotten me to think a bit more about how to test things ahead of time, and plan for testing.

Fascinating series of articles. I have been developing a bit of an offshoot to these procedures and only stumbled upon these articles by accident.

My variation on this takes a copy of the database structures after a milestone release, places them inside a locked-down database, and time-stamps them. A familiar story, I am sure: I have been working with someone who will not adhere to team standards and will 'tinker' with table structures, SSIS packages, stored procedures, etc. without testing, and the rest of the team only becomes aware when the overnight ETL process fails.

Yes, I can hear you all shouting 'lock him down, reduce his permissions, sack him!', but it is not easy to control a senior member who has some SQL knowledge and also happens to be the company's system accountant/business process guru.

    My process will compare the latest stored version with the current structures and report the changes.
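One simple way to sketch that kind of comparison in T-SQL is a set difference over the INFORMATION_SCHEMA.COLUMNS views of the two databases. The database names `SnapshotDB` and `LiveDB` are invented here, and this is only an assumed shape of such a report, not Paul's actual process.

```sql
-- Columns recorded in the time-stamped snapshot that are missing or
-- altered in the live database. Swapping the two operands of EXCEPT
-- reports the opposite direction (columns added to the live database).
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
FROM SnapshotDB.INFORMATION_SCHEMA.COLUMNS
EXCEPT
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
FROM LiveDB.INFORMATION_SCHEMA.COLUMNS
ORDER BY TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME;
```

Because a changed data type or nullability makes the whole row differ, untested 'tinkering' with a column shows up in the report just as a dropped column would.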

If there are articles alluding to controlling development by 'loose cannons', they seem to be off the beaten track or hidden in the dark corridors of HR.

The problem is magnified by joining a project that appears to have evolved rather than being a nice, clean green-field project. By combining any new work with the TDD approach and running the comparison, I would have some control over testing. The person in question is very open to new ideas--'tinkering' being the result--and will hopefully latch onto this TDD concept.

On a different note--not being one for re-inventing the wheel--I noticed the project was made available during part 2. Could the project containing the additions from parts 3, 4, and the suggested part 5 be made available?

    Kind Regards

    Paul Grubb

  • Link to part 4 in the title points to this article only. :blink:
