• Steve Jones - Editor (11/10/2009)


    You can't performance tune without some testing of how efficient things are. I guess I was wondering as well whether people actually think about testing beyond that. Jeff addressed some of that in his first comment, but most others haven't.

    Do you look for edge cases? In other kinds of programming you might send in zeros, too much data in a parameter, and so on, but do you do that in SQL?

    For example, do you look at aggregations with and without NULLs in there? Do you examine what happens for character comparisons if you have numerics or dates in the fields? Do you look for implicit conversions that could cause issues?

    Those are more what I think of when I consider unit tests. My feeling is that most developers aren't overly concerned with more than getting back the correct result set with their test data.
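    Both of those edge cases are easy to demonstrate concretely. Here's a quick sketch using in-memory SQLite through Python as a stand-in for a real server (the table and values are made up for illustration, but the same aggregate-NULL and string-comparison behaviour applies in T-SQL): aggregates silently skip NULLs, and numerics stored in character fields compare in string order.

    ```python
    import sqlite3

    # In-memory SQLite as a quick stand-in for a real server; the
    # table and values are invented purely for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (amount INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?)", [(10,), (None,), (30,)])

    # Aggregates quietly skip NULLs: COUNT(*) and COUNT(amount)
    # disagree, and AVG divides by 2, not 3.
    total, cnt_all, cnt_amt, avg = conn.execute(
        "SELECT SUM(amount), COUNT(*), COUNT(amount), AVG(amount) FROM sales"
    ).fetchone()
    print(total, cnt_all, cnt_amt, avg)  # 40 3 2 20.0

    # Numerics held in character fields compare as strings, not numbers.
    text_cmp, num_cmp = conn.execute("SELECT '10' < '9', 10 < 9").fetchone()
    print(text_cmp, num_cmp)  # 1 0 -- '10' sorts before '9' as text
    conn.close()
    ```

    A correct result set from clean test data would never surface either of these, which is exactly the point about edge cases.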

    I often keep SQL Profiler running on the SPID that I've got for my Query Analyzer session, to give me some feedback as I develop.

    The level of testing I do depends on how much time I have available and how important the results are; some projects get pushed back so I can do more testing, some don't. I always test against full production data, since it's both larger and dirtier than most test data. In practice, when running small SQL statements against large, interlocking datasets, I rarely make up test rows; instead I deliberately hunt for existing data that causes errors.

    For anything that takes arguments, it's easy to send in blanks, NULLs, zero, values just under the minimum or just over the maximum, and other boundary conditions.
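    That boundary-value habit can be sketched in a few lines. Again this uses in-memory SQLite through Python as a stand-in, and LENGTH() is just a hypothetical placeholder for whatever procedure actually receives the argument:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Hypothetical boundary inputs: NULL, a blank, zero, a negative,
    # a large value, and an oversized string.
    boundary_values = [None, "", 0, -1, 2**31 - 1, "x" * 5000]

    results = []
    for v in boundary_values:
        try:
            # length() is just a placeholder target for the argument
            results.append(conn.execute("SELECT length(?)", (v,)).fetchone()[0])
        except sqlite3.Error as exc:
            results.append(f"error: {exc}")
    conn.close()

    # length(NULL) yields NULL (None) rather than raising an error --
    # exactly the kind of quiet behaviour these probes should surface.
    print(results)
    ```

    The interesting cases are usually the ones that don't error at all but silently return NULL or a truncated value.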

    As a general statement, SQL is just a high-level computer language. It needs no more and no less testing than any other high-level language that lacks good internal error handling.