This editorial was originally published on 28 Sep 2015. It is being republished as Steve is on holiday.
There is a bit of a rant from John Welch on testing your data manipulation that I like. I think some of the impact is lost because the end turns into an advertisement for a product that helps with this, but the points made are good. We all want to test, we think it's hard, we don't have time, and our businesses live with the issues that come from limited testing.
I'm not a fan of Test Driven Development the way John is. This is mostly because I'm rarely 100% sure of the results I want or have been given. I've often been given a request to do X, and as I get involved, I find that the requirements might be incomplete, or even wrong, and they'll change. As a result, I like to write a little code, get some idea of what I want to return or change, and then write a test that verifies what I've done is correct.
It's a subtle difference, and maybe I'm doing TDD in the wrong order, but I like to write code, test it, then think about potential issues (which I might find as I write code) and add a few tests for the things I've missed.
However, I do believe we need to test our code. We all do test our code, even if it's just with a few before/after queries. What I don't get is why we don't mock up a quick test that we can run in an automated fashion. It's not much more work, and then we can easily re-run the test later to ensure any refactoring or optimizations we make continue to work as expected.
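As a sketch of what turning a before/after check into a quick automated test might look like, here's a minimal example. The `dedupe_customers` function and the sample data are hypothetical stand-ins for whatever data manipulation you've just written, not anything from John's post; the point is only that the manual check becomes a repeatable function you can re-run after every change.

```python
# A hypothetical data-manipulation routine: remove duplicate customer
# records, keeping the first occurrence of each customer_id.
def dedupe_customers(rows):
    seen = set()
    result = []
    for row in rows:
        if row["customer_id"] not in seen:
            seen.add(row["customer_id"])
            result.append(row)
    return result

# The "before/after query" we'd otherwise run by hand, wrapped as a test
# so it can run in an automated fashion after any refactoring.
def test_dedupe_customers():
    before = [
        {"customer_id": 1, "name": "Ann"},
        {"customer_id": 2, "name": "Bob"},
        {"customer_id": 1, "name": "Ann"},  # duplicate row
    ]
    after = dedupe_customers(before)
    assert len(after) == 2, "duplicates should be removed"
    assert [r["customer_id"] for r in after] == [1, 2]

if __name__ == "__main__":
    test_dedupe_customers()
    print("all checks passed")
```

Drop a file like this next to your code and any test runner (or just `python` itself) can re-run the check after every optimization, which is the whole payoff over an ad-hoc query you'd have to remember and retype.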