Accidently Agile

  • A top notch article as usual, Mr. Poole. Where I work it's all waterfalls....

    Signature is NULL

  • Thanks as ever for your comments.

    Some of the books I've read on the subject do mention that house building is a bad analogy for agile development, but I couldn't think of a more suitable one. The whole point of an analogy is to find a common point of reference that most people can relate to and to utilise it to aid communication. An ORM layer for communication? :unsure:

    I am currently working on a project on which I have preached the agile gospel. The business people, the PMs, the programmers, the delivery managers and support staff all sat around a table having a face-to-face discussion and made a rapid choice in favour of a small, lightweight project that would be designed to be extendable. There were quite a few points and counter-points made but the face-to-face element really worked.

    My role is now as a DBA, as I have chosen to be a specialist in a particular area. Regardless of whether I were a DBA or not, I would say it is blindingly obvious that a data-driven project is going to require a good database foundation.

    If you were talking about a project to harvest information from a web site and store it then I would say that the database layer would be of lesser importance than the harvesting part of the project.

    The one thing that the DBA role encourages is not to test the depth of the water with both feet!

    The move to agile development is a paradigm shift, but having spoken with experienced XP protagonists (backed up by my own reading) you have to buy into it wholeheartedly; you can't really cherry-pick bits here and there or the whole house of cards comes down. My personal opinion is that agile development suits object-oriented programming; I believe it originated in the Java camp. Even for object-oriented programmers I believe some of the techniques came as a shock, e.g. inheritance is bad, association is good; objects should be open for extension but closed for change, etc.
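
    For anyone who hasn't met those maxims, here is a tiny, purely illustrative C# sketch (the class names are invented for the example): the calculator is closed for change but open for extension, because new behaviour is plugged in through an association (an interface) rather than by subclassing.

```csharp
// Illustrative only: "open for extension, closed for change" via association.
// New discount rules can be added without ever editing InvoiceCalculator.
public interface IDiscountRule
{
    decimal Apply(decimal amount);
}

public class NoDiscount : IDiscountRule
{
    public decimal Apply(decimal amount) { return amount; }
}

public class SeasonalDiscount : IDiscountRule
{
    public decimal Apply(decimal amount) { return amount * 0.9m; }
}

public class InvoiceCalculator
{
    private readonly IDiscountRule _rule;   // association, not inheritance

    public InvoiceCalculator(IDiscountRule rule)
    {
        _rule = rule;
    }

    public decimal Total(decimal amount)
    {
        return _rule.Apply(amount);
    }
}
```

    Adding a new rule means writing a new class that implements IDiscountRule; the calculator itself never changes.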

    As a final point, one XPer pointed out that there are two sorts of iteration. If you see the deliverable as a drawing of a man, then the first approach draws an outline of the man on the first iteration and subsequent iterations add more and more detail until you get the finished drawing.

    The second approach draws one part of the man in its entirety and each subsequent iteration adds a new body part.

    If you subscribe to this site then you have almost certainly had to deal with a database that was put in place with the "it doesn't matter now, we'll fix it later" attitude. There are 1,001 reasons why it won't be fixed later, the least of which are the technical hurdles. I am currently liaising with three diametrically opposed groups of people to get them to agree to do the investigation work to find out what effect getting rid of the small amount of data that prevents DRI being added would have on their systems.

    To the business this is a non-issue as it has zero visibility. To the DBAs, keeping the database functioning is a mind-numbing and time-consuming job. Had DRI been put in place in the first place, the DBAs could be building the better mousetrap and devoting more time to the developers rather than acting as a bottleneck while they fix legacy problems.

  • Steve Schmechel,

    Thanks for logging in just to crap all over the article; that time was not wasted. I'm sure everyone appreciates your snide comments and asinine POV. You remind me of a high schooler still caught up in what is "cool" and what is "not cool", as if that has anything to do with the right ways to develop software.

    I've been critical in this forum before, and I've even occasionally been critical of some of the things Poole has said. It's OK, his ego can take it; but for God's sake please keep things constructive!

    (To all Forum Members: I know, I know...don't feed the stinkin' trolls)

    Signature is NULL

  • A well-written article, but this is exactly the type of thinking that has brought development in America to an all-time low and is why, ultimately, businesses continue to seek external resources.

    Agile Development is definitely a methodology that allows the customer to see more frequent results. From a business perspective, that may be what your customer wants to see. However, what they want and what they need are often two very different things, and this is absolutely an area where a customer's fervor for a 'visual fix' every couple of weeks should be managed. The very best project managers try to balance the customer's excitement to see something against the need for quality and attention to detail. For those projects with no real competent technical project management and no layer in between the developer and the customer, Agile may be the only way to pacify the customer's regular need for that 'fix'. However, you are ultimately not serving that customer or yourself, and you are exchanging true development quality for short-term customer satisfaction during the development process.

    For me personally, I believe the Sales and Marketing parts of our industry have encouraged this approach because it makes more sense to them and unfortunately, we'll always have a set of 'experienced' people who buy into the hype and the cool word "Agile".

    While I'm certainly no advocate of a long, drawn-out waterfall model where you do 6 months' worth of hardcore development before anyone sees you emerge from a cube, I KNOW a far better result will be achieved for a project that is managed by clear and early identification of MILESTONES at regular short intervals. Those milestones don't necessarily need to be some new screen that a customer can see, but they are still identifiable progression points that, at the very least, a developer and technical project manager can identify, agree on and measure.

    I'll completely agree that great team communications and regular measurement points are fundamental to a project, and those aspects of Agile are points well considered for any developer. However, the precepts of Agile involving the complete end-to-end development of one small piece of the project at a time often defeat the goal of having a well-thought-out system architecture with the necessary attention to system integration that great products display. You can say that you can overcome this by having a lot of communication, but in practice it doesn't work that way.

    For a set of software development tasks that have been repeated by developers a few hundred times in their career, Agile makes a lot of sense. For the other 98% of the projects that most of us find ourselves doing, the process will ultimately work against a quality result.

    Brad Benham

    Senior Architect

    McKesson Corporation

  • I'm sorry if it appeared I came in "guns blazing". I have read other articles by David Poole and normally I think he is right on the money. This time I think he missed the mark.

    The article has more of the flavor of a blog/opinion piece than a call for professional discussion.

    Where I work we embed databases in our products. Often, before the user requirements are even fully understood, database developers and code developers (who know enough about databases to be dangerous) are busy discussing the database requirements and schema so they can "begin" their work. So maybe this idea of "As a database guy I tend to look at the database layer as the foundations for software development" rubs me the wrong way.

    If your job is to maintain a big OLTP database of financial data that software developers write against, you probably won't have easier alternatives to offer them in the near future, so you will have to figure out how to be agile within those constraints. Maybe that is the audience the author had in mind.

    I know this is SQL Server Central (database work is part of my job description and daily activity). However, I read the "Central" as a "central place to get information about SQL Server". (Which it does quite well - probably the best! 🙂 )

    For many people, SQL Server is one of many useful tools for solving problems and not "Central" to their identity. Information on how to integrate database development into an agile process is useful to those people also. Put yourself in their shoes and ask what they take away from the article "Accidently Agile". Not much I am afraid.

  • Steve Schmechel (2/14/2008)


    Where I work we embed databases in our products. Often, before the user requirements are even fully understood, database developers and code developers (who know enough about databases to be dangerous) are busy discussing the database requirements and schema so they can "begin" their work. So maybe this idea of "As a database guy I tend to look at the database layer as the foundations for software development" rubs me the wrong way.

    I'll say first that I'm often enough coming across as "all guns blazing" - so I understand how easy it is to let a little "extra spicy" slip into an answer... I was feeling a tad bit "extra spicy" writing the post a little earlier, so I'll go right ahead and say I wasn't trying to be harsh, and am hoping no feathers were ruffled.

    As to the quote above - there's definitely some truth to it, and it's a fairly common flaw in many projects I come across. A frequent poster on here phrased that problem very well: when all you've been given is a hammer, a LOT of things start to look like a nail. Walking in with a preconceived design is definitely not the greatest thing for a project, certainly not when it happens before any of the requirements exist.

    One of the common gripes I have with systems running poorly is the vast amount of stuff that doesn't belong IN a database (or doesn't belong in that particular database anymore). If you're not going to read it, update it, search it or correlate to it, why is it still here? Why was it ever here?

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?

  • Hopefully this won't kick up too much mud.

    I agree with some of what Steve Schmechel is advocating (if not necessarily his approach in advocating it). In working with our developers building new systems, on the DBA side, I try to get as much of a valid database design based on the requirements as I can. However, we've hit issues where the requirements were so ill-defined or changing so fast that trying to maintain a database would have killed the project. Instead, we created about three tables that represented the core concepts of the app and let the developers drop XML straight out of their object definitions into the database. Then the requirements gelled a bit and we were able to add on another five tables, again with XML columns since we still didn't have adequate definitions for more. We've now grown the system out to about 40 or 50 tables, but some of these still have XML columns where we haven't refined the requirements. At no point have I simply slapped junk into the database without the full knowledge that the stuff there was going to get changed as development moved forward. Now we're to the point where I can start performance and load testing with a high degree of assurance that the fundamental design isn't going to change underneath me. It works.
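
    As a rough sketch of the kind of thing Grant describes (the table, column and class names below are invented for illustration, not taken from his system), the early "core concept" tables hold little more than a key and an XML column, and the still-changing object is serialised into it:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Xml.Serialization;

public class CoreConceptStore
{
    private readonly string _connectionString;

    public CoreConceptStore(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Serialise a still-fluid domain object into an XML column on one of the
    // "core concept" tables, so the relational schema can stay tiny while the
    // requirements (and the object shape) keep changing.
    public void SaveOrder(int orderId, object order)
    {
        StringWriter buffer = new StringWriter();
        new XmlSerializer(order.GetType()).Serialize(buffer, order);

        using (SqlConnection connection = new SqlConnection(_connectionString))
        using (SqlCommand command = connection.CreateCommand())
        {
            command.CommandText =
                "UPDATE dbo.OrderCore SET OrderXml = @OrderXml WHERE OrderId = @OrderId";
            command.Parameters.Add("@OrderXml", SqlDbType.Xml).Value = buffer.ToString();
            command.Parameters.Add("@OrderId", SqlDbType.Int).Value = orderId;
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```

    The idea is that, once requirements gel, individual XML fragments can be promoted into properly typed columns and tables without the application ever having stopped working.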

    BTW, Steve, sometimes, if an application is being designed to collect, collate, store, manage and report data, the database just might, might, be the foundation for that application. That's not to say the client, the app server, and any other SOA architecture layers are not just as important to the overall application, but the app is about data and the data is kept... in a database. Sorry, dude. It happens.

    That said, the key to any Agile project is direct communication between disciplines like dev and dba. Every time we've had a problem, the core of the issue was the lack of communication (or sometimes, bad communication). If you like, that's the real foundation of a successful app dev effort.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Ok John,

    Maybe I spent too much time bashing the article and not enough being constructive.

    Let me give you some insights and resources that have helped in my journey thus far.

    Scott Ambler has a great essay on "The Design of a Robust Persistence Framework for Relational Databases".

    It is available online at http://www.ambysoft.com/essays/persistenceLayer.html

    I combined this concept with another established idea of a "pluggable data provider" that I first experienced working with DotNetNuke.

    You will find this idea also in .NET 2.0's Data Provider Factory classes (originally in the Data Access Application Block).

    Mostly, these deal with plugging in different relational databases. (DNN uses it for other parts of the WebApp also.)

    Links:

    DotNetNuke Whitepaper (unfortunately not on the DNN site anymore): http://www.websecurestores.com/LinkClick.aspx?link=DNN+Documentation%2FDotNetNuke+Data+Access.doc&tabid=85&mid=588

    More recent related post: http://love-dotnetnuke.blogspot.com/2007/10/provider-model.html
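
    For readers who haven't used them, this is roughly what the .NET 2.0 factory classes mentioned above look like in practice (the connection string and table name are placeholders): the provider is resolved by its invariant name at runtime, so switching databases becomes a configuration change rather than a code change.

```csharp
using System.Data.Common;

public class ProviderFactoryExample
{
    // Resolve an ADO.NET provider by its invariant name; swapping the name
    // (typically read from configuration) swaps the underlying database driver
    // without changing any of this code.
    public static object CountRows(string providerName, string connectionString)
    {
        DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

        using (DbConnection connection = factory.CreateConnection())
        {
            connection.ConnectionString = connectionString;
            connection.Open();

            DbCommand command = connection.CreateCommand();
            command.CommandText = "SELECT COUNT(*) FROM dbo.Customer";  // illustrative table
            return command.ExecuteScalar();
        }
    }
}
```

    Calling CountRows("System.Data.SqlClient", "...") uses SQL Server; passing a different invariant name (and connection string) uses a different registered provider with no code change.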

    While our application is not a web application, and I didn't limit myself to just relational databases, I found that these methods added to my ability as a database developer to operate in an increasingly "agile" development environment.

    How, you ask?

    By making the data provider more "robust", the developers were shielded from the relational database world.

    This had two benefits:

    1) They could concentrate on object oriented design and programming without needing to understand how things were persisted in the database.

    2) By removing any traces of the database, it was less likely to influence their design. (As I stated before, this was an existing problem.)

    By making the data provider pluggable, I could swap in a simple object serialization provider during the early stages of development, to avoid a direct dependence on the database.

    This allowed developers to:

    1) Change their object model as frequently as desired and even refactor entire branches without dependencies on the data layer.

    2) Exercise the running application with data storage enabled using the same interface that would be used in the production environment.

    They did months of initial development on a multi-tier application that will use a SQL Server database, without ever touching a database!
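
    A minimal sketch of what such a pluggable boundary might look like (the interface and class names here are illustrative, not the actual project code): domain code only ever talks to the provider interface, and an XML-serialisation implementation stands in for the database during early development.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Illustrative provider contract: the concrete provider can be swapped
// (serialization, SQL Server, an object database) without touching the model.
public interface IDataProvider
{
    void SaveAll<T>(List<T> entities) where T : class, new();
    List<T> LoadAll<T>() where T : class, new();
}

// Early-stage stand-in: serialises whole object graphs to XML files on disk
// so the application can run end-to-end before any database schema exists.
public class XmlSerializationProvider : IDataProvider
{
    private readonly string _folder;

    public XmlSerializationProvider(string folder)
    {
        _folder = folder;
    }

    public void SaveAll<T>(List<T> entities) where T : class, new()
    {
        XmlSerializer serializer = new XmlSerializer(typeof(List<T>));
        using (StreamWriter writer = new StreamWriter(Path.Combine(_folder, typeof(T).Name + ".xml")))
        {
            serializer.Serialize(writer, entities);
        }
    }

    public List<T> LoadAll<T>() where T : class, new()
    {
        string path = Path.Combine(_folder, typeof(T).Name + ".xml");
        if (!File.Exists(path))
        {
            return new List<T>();
        }

        XmlSerializer serializer = new XmlSerializer(typeof(List<T>));
        using (StreamReader reader = new StreamReader(path))
        {
            return (List<T>)serializer.Deserialize(reader);
        }
    }
}
```

    When a section of the model solidifies, a SQL provider implementing the same interface replaces this one; the plugging point (and the developers' code) stays exactly the same.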

    During this time, I have been preparing the "SQL Data Provider" to accommodate their "Domain Classes" (see Ambler article).

    Another DBA has been working with me to prepare the schema of the production database.

    By the way, we are outnumbered 10 to 1 by code developers, so you can see why having them wait on us to keep their databases in a usable state amidst their constant code changes would be a DB developer's nightmare and affect their ability to be agile.

    When a section of the application model begins to solidify and the change rate slows, that's when we begin configuring the database and the provider.

    There were some costs to going this route:

    - You cannot write the provider in T-SQL. Either a database person needs to do some coding (me in this case) or you need to get a developer to help.

    - The database person needs to understand enough code to be able to discern what the programmers are trying to persist.

    - You need a way to define what needs to be persisted. This could be attributes in the code, special storage classes (that they own, not generated from the database), or an external document. (There is a small sketch of the attribute approach after this list.)

    - Developers need to follow a consistent pattern with their object model and abide by some limitations. These conventions can be fairly general and are not too hard for them to swallow because consistency is good on their side also.

    - All business logic and most data integrity constraints need to be provided in the domain classes. Most of these constraints work better closer to the applications and users that interact with them, rather than exceptions bouncing back from the database.

    - Our resulting database schema won't win any "normalization" contests, but it is not a "big blob of data" either. You can still understand it and work with it, even though object-oriented concepts, like inheritance, can compel some denormalization.
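
    As a sketch of the "attributes in the code" option mentioned in the list above (all names here are hypothetical, not from the actual project), the provider can reflect over a marker attribute to discover what each domain class wants stored:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Hypothetical marker attribute: domain developers tag the properties
// they want persisted; everything else is ignored by the provider.
[AttributeUsage(AttributeTargets.Property)]
public class PersistedAttribute : Attribute
{
}

public class Customer
{
    private string _name;

    [Persisted]
    public string Name
    {
        get { return _name; }
        set { _name = value; }
    }

    // Not marked, so the provider never tries to store it.
    public string DisplayName
    {
        get { return "Customer: " + _name; }
    }
}

public static class PersistenceMap
{
    // The provider reflects over a type once to work out what to persist.
    public static List<PropertyInfo> PersistedProperties(Type type)
    {
        List<PropertyInfo> result = new List<PropertyInfo>();
        foreach (PropertyInfo property in type.GetProperties())
        {
            if (property.IsDefined(typeof(PersistedAttribute), true))
            {
                result.Add(property);
            }
        }
        return result;
    }
}
```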

    Some interesting side effects:

    - Communication with developers improved, in part because they saw the freedom they were given without having to "appease the DBA".

    - Instead of a flood of "change requests", DBAs were consulted more on design pattern changes and big changes in the model, before they were implemented, in order to determine whether they would affect the provider. (Before, the attitude was "It's just a database change, just make it happen." or “Here's how I think you should change the database.”)

    - Communication within the team improved in both directions. Partly by necessity (new territory) and partly because our goals seemed more in line.

    - Application behavior changed from frequent, small trips to the database to less frequent requests for more data, which the application code managed and cached for longer periods of time. This should actually reduce the load on the database.

    Other notes:

    - Unit test coverage (and better yet, test-driven development) of the data provider is CRITICAL. This is hard when you introduce a state machine like a database - lots of setup and tear-down - but it is worth every hour invested. (A small test sketch follows this list.)

    - You have less code to test compared to all the CRUD code that is generated in other architectures, but it is harder code to test and more critical. (A bug could affect multiple tables.)

    - Isolate your unit tests from the production tables and production classes that will interact with the provider. You don't want to mix developer or DBA bugs in your unit tests of the provider. Again, more work but greater peace of mind.

    - Create integration tests that help you flush out storage-related errors in the production classes and possible mistakes in the database schema.

    - If there is not an explicit way to determine what data in the domain classes needs to be stored, write tests to point out areas you might have missed.

    - Unit testing in Visual Studio (2005 Team Edition) is difficult and does not lend itself to test driven development. It's a shame the Microsoft tools haven't advanced further, but the testing is still worth the effort.

    - If you can automate the use of stored procedures, go ahead. I found that generating parameterized queries in the provider offered similar performance. (Kudos to the SQL Server Query Plan Optimizer Team for that.)
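
    Here is the promised test sketch, using the Visual Studio Team Test attributes. The SqlDataProvider class, the TestDatabase helper and the connection string are stand-ins for illustration (the Customer class and IDataProvider interface are the ones from the earlier sketches), but the shape - known state in set-up, clean slate in tear-down, round-trip assertion in the test - is the point.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class SqlDataProviderTests
{
    // Hypothetical provider under test, pointed at a dedicated test database -
    // never at production tables or the developers' own classes.
    private IDataProvider _provider;

    [TestInitialize]
    public void SetUp()
    {
        _provider = new SqlDataProvider("Server=testbox;Database=ProviderTests;Integrated Security=SSPI");
        TestDatabase.ResetTables();   // hypothetical helper that truncates the test tables
    }

    [TestCleanup]
    public void TearDown()
    {
        TestDatabase.ResetTables();   // leave nothing behind for the next test
    }

    [TestMethod]
    public void SaveAllThenLoadAllReturnsWhatWasStored()
    {
        List<Customer> customers = new List<Customer>();
        Customer customer = new Customer();
        customer.Name = "Test customer";
        customers.Add(customer);

        _provider.SaveAll(customers);
        List<Customer> reloaded = _provider.LoadAll<Customer>();

        Assert.AreEqual(1, reloaded.Count);
        Assert.AreEqual("Test customer", reloaded[0].Name);
    }
}
```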

    When this project is complete, maybe I will present an article on the experience: benefits and pitfalls.

    A few things I am sure of already:

    - The end product will meet our customers' needs better than before. Agile is not about giving them a “visual fix” as someone else commented. It is about being able to "make it right" in the limited time you are given, by not wasting time creating what you think the customer needs. (In my case, the customers are the application developers and their customers, including sales and marketing.)

    - We will be able to make changes and enhancements to our application quicker than before, with greater confidence, and better test coverage.

    - Agile comes from intentional changes to your process and how you view your work. It will never occur “by accident”.

  • Grant,

    Interesting approach also.

    I fully realize that some applications will be all about data and its reuse.

    I am only against the preconception that everything must first be thought about from the paradigm of a relational database. As Matt stated, often it is a person with a hammer thinking everything is a nail.

    RDBMSs are still the "King" of many software developments and I have a great appreciation for them and what they can do. Other options are evolving - often to keep pace with changes in development (object-oriented, service-oriented, distributed, etc.).

    I also tried other types of providers for our project:

    - DB4O - an object database for the .NET framework.

    - nHibernate - an object-relational mapper package.

    They all worked. They all had pros and cons in our environment.

    We ended up choosing SQL Server and a custom ORM for the production provider for some good reasons and some "not so good" reasons.

    It was still worth exploring the options. Maybe on a future project we will handle data in a way that reduces the need to map between object and relational.

    It's about keeping an open mind.

  • Thanks for the details Steve. I will not have time to go through your linked material until next week, but I will read it and then re-read your bullet points. Being a traditional waterfall guy and getting thrown into a development group that is agile is definitely a culture shock for me. I'm willing to learn and change my thinking, but I will understandably go into it with caution and questions.

    Please do submit an article when your project finishes. I'm sure it will spawn more good discussion.

    John Rowan

    ======================================================
    ======================================================
    Forum Etiquette: How to post data/code on a forum to get the best help - by Jeff Moden

  • Steve,

    Your previous feedback is something useful that I can do something with. The situation I am in is that I am trying to move towards agile development but I'm coming from a traditional background.

    It is far easier when you have someone to guide you through the processes and techniques. If you don't have any reference other than variable quality books and opinion then it is trial and (one hell of a lot of) error.

    Hope you get around to writing that article soon

  • Other notes:

    - Unit test coverage (and better yet Test driven development) of the data provider is CRITICAL. This is hard when you introduce a state machine like a database - lots of setup and tear down, but it is worth every hour invested.

    - You have less code to test compared to all the CRUD code that is generated in other architectures, but it is harder code to test and more critical. (A bug could affect multiple tables.)

    - Isolate your unit tests from the production tables and production classes that will interact with the provider. You don't want to mix developer or DBA bugs in your unit tests of the provider. Again, more work but greater peace of mind.

    - Create integration tests that help you flush out storage-related errors in the production classes and possible mistakes in the database schema.

    - If there is not an explicit way to determine what data in the domain classes needs to be stored, write tests to point out areas you might have missed.

    - Unit testing in Visual Studio (2005 Team Edition) is difficult and does not lend itself to test driven development. It's a shame the Microsoft tools haven't advanced further, but the testing is still worth the effort

    You might want to take a look at the DB side of VSTS

    While I mostly agree on the app code side - the DataDude (VSTS for the DB professional) projects allow for pretty darn seamless setup and tear-down, all at the click of a button: drop everything in the test database, recreate everything in the test DB, and run tests to double-check everything, in one click.

    The testing stuff is not 100% of the way there for the database side, in that it's harder than it needs to be, I think (in particular - some of the "standard tests" I would have hoped for are missing, so you need to build them). But still - once you start setting up various tests and test "templates" (reusable tests that can help search for a specific condition for you), it's not so bad (in other words - the learning curve is a B***CH, but you can often get away with reusing and building on previous tests).

    A little mouse once told me that one of the Andys (paging Mr Leonard) who post on here is writing a book on the subject (or contributing to a book on the subject). Perhaps we can get him interested in an article on the subject matter. I unfortunately am but an amateur in the field so far.

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?

  • I worked on a project using VS for database professionals and was initially very impressed. I say initially because I found that deploying the project to an existing database was an absolute car crash.

    I needed to be able to write scripts that performed tasks dependent on the server on which they were being run but the parser instantly threw an error if I tried anything but the simplest thing.

    If replication was involved using a SQL2000 box then my world became one of pain and stress.

    The things I wanted to do seemed to require me to use the pre- and post-deployment scripts, but if any errors occurred it wasn't set up to do a rollback.

    The tool was parachuted onto my desk with "we are now all going to use this" as the only instruction. I know VS can do one hell of a lot more than I know to do with it but without some sort of "How to" guide I felt I was fighting the tool rather than using it.

    The last thing you want is for a tool to screw up a 4TB LIVE database where the maximum down-time allowed at any one time is 5 minutes!

    Let us suppose that I was altering the structure of a table and making associated data changes. My manual script approach would be to do this in isolation as a unit of work, then move on to the next table that needed altering, etc.

    I wouldn't try to drop all replication, constraints, etc. up front, then alter all tables, then do the data work, as this would be too high a risk, but this is what VS seemed to want to do.
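
    For what it's worth, this is the shape of that one-table-at-a-time unit of work when driven from code (in practice the original is a hand-written T-SQL script; the table and column names below are invented): the structural change and its data change sit in one transaction, so a failure rolls the whole unit back rather than leaving the database half-altered.

```csharp
using System.Data.SqlClient;

public static class CustomerTableMigration
{
    // Illustrative only: one table's ALTER plus its data backfill as a single
    // unit of work. The connection is assumed to be open already.
    public static void Run(SqlConnection connection)
    {
        using (SqlTransaction transaction = connection.BeginTransaction())
        {
            try
            {
                SqlCommand command = connection.CreateCommand();
                command.Transaction = transaction;
                command.CommandTimeout = 300;   // allow for long-running data work

                // Structural change for this one table only.
                command.CommandText = "ALTER TABLE dbo.Customer ADD CountryCode char(2) NULL";
                command.ExecuteNonQuery();

                // Associated data change, still inside the same unit of work.
                command.CommandText = "UPDATE dbo.Customer SET CountryCode = 'GB' WHERE CountryCode IS NULL";
                command.ExecuteNonQuery();

                transaction.Commit();
            }
            catch
            {
                transaction.Rollback();
                throw;
            }
        }
    }
}
```

    Only when this unit succeeds would the next table's change be attempted, which is the opposite of dropping everything up front and hoping the big-bang deployment holds together.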

    I haven't found any good articles on VS for DB professionals yet. As this version of VS is extremely expensive I don't expect any good articles anytime soon.

  • Glad to hear you are a fan of documentation tools. Here's a shameless plug for one I wrote. 🙂

    It supports 11 different DBMSs in addition to SQL Server. It's called SqlSpec; see the link in my sig for details.

    ---------------------------------------
    elsasoft.org

  • I've seen that this article is going to be published again in the third week of September 2009. A great deal has changed in the 18 months since I wrote the original article, and while there is much in the original that I still stand by, there is also a considerable amount that needs revisiting. I intend to write an update now that I have a better (but by no means exhaustive) understanding of the subject.

    I remain enthusiastic about agile development, but in large IT organisations it will radically affect the role of the DBA. My experience is that if you want to be a production DBA keeping the wheels turning, then there is still a role for you. If you lean more to the development side then you are going to have to morph into a more general developer with a highly developed database speciality.

    I have had various reactions from developers. Most are enthusiastic about getting to do DB work and to learn more about it. My experience is that they tend to reciprocate and help you gain a broader set of development skills.

    The agile evangelists tend to be, well, less welcoming. A sizeable number seem to regard their work as a divine calling and DBAs as obsolete dinosaurs who should be grateful to be put out of their misery.

    There isn't really an awful lot of information out there to help a DBA who wishes to practise agile development. I've found that questions I feel are legitimate have not been answered and have even been met with derision and scorn. Some of those questions are fundamental to the data professional's mindset, so there needs to be a constructive discussion on the issues.

    How do you make a multi-terabyte data source agile?

    If you use an ORM tool that requires direct table access, how exactly do you secure your DB and DB server? If I am responsible for the sanctity and security of the data, I want to hear a damn sight more than "well, so-and-so uses it". I am surprised by the number of UK developers who have never heard of the Maginot Line (http://www.bunkertours.co.uk/the_maginot_line.htm), which used to be part of the history syllabus for all 13-14 year olds. It was basically a perimeter security measure that was ideal for defending against a specific attack. Bit of a chocolate fireguard if the attacker used a different method, though!
