Think like a DBA

  • My take, mostly as a coder.

    The reason coders translate nulls into something more meaningful in most cases is not that we don't like nulls, but that the languages we use are strongly typed and, for example, a CreditCardBalance needs to be represented in our business layer as something like a Decimal.  A null Decimal cannot exist.  So our choice comes down to whether we want to strongly type our variables or use some generic holder that can contain a null!
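    To make the point concrete, here is a minimal sketch (in Java rather than C#, and with assumed names like `CreditCard` and `balanceOrZero`) of translating a possibly-null database value at the boundary into a strongly typed field, rather than letting null leak into the business layer:

```java
import java.math.BigDecimal;
import java.util.Optional;

// Illustrative only: a strongly typed balance cannot hold "no value",
// so the null coming back from the database is translated at the boundary.
public class CreditCard {
    // Optional makes "balance unknown" explicit instead of carrying a raw null.
    private final Optional<BigDecimal> balance;

    public CreditCard(BigDecimal balanceFromDb) { // may be null from the DB
        this.balance = Optional.ofNullable(balanceFromDb);
    }

    // The business layer decides once what a null balance means (here: zero).
    public BigDecimal balanceOrZero() {
        return balance.orElse(BigDecimal.ZERO);
    }
}
```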

    My other take is that I will never trust the return values from a stored procedure (even if I wrote it myself), because many years of experience have taught me not to, so I'll always deal with nulls (and dates, etc.) appropriately.

    One suggestion for your coders: have one of them write a 'safe' data handler that all record sets can be passed through, so that their code is not littered with IsNull checks.

    SafeDataHandler sdh = new SafeDataHandler(whatever.GetData(...));

    strValue = sdh.GetString("field1");

    where sdh.GetString does the null checking.  You could also overload GetString to provide your own default in the case of null values:

    strValue = sdh.GetString("field1", "<unknown>");
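    The safe-data-handler idea above could be sketched like this (in Java rather than C#; the `Map` standing in for a record set and all names are assumptions for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a "safe" data handler: wrap a row and centralize the null
// checks so calling code is not littered with IsNull-style tests.
public class SafeDataHandler {
    private final Map<String, Object> row; // stand-in for a record set row

    public SafeDataHandler(Map<String, Object> row) {
        this.row = row;
    }

    // Returns "" when the field is missing or null.
    public String getString(String field) {
        return getString(field, "");
    }

    // Overload that lets the caller supply a default for null values.
    public String getString(String field, String defaultValue) {
        Object value = row.get(field);
        return value == null ? defaultValue : value.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> row = new HashMap<>();
        row.put("field1", "hello");
        row.put("field2", null);

        SafeDataHandler sdh = new SafeDataHandler(row);
        System.out.println(sdh.getString("field1"));              // hello
        System.out.println(sdh.getString("field2", "<unknown>")); // <unknown>
    }
}
```

    The point of the design is that the null policy lives in one place; if the team later decides null dates or numbers need different defaults, only the handler changes.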


  • This is an excellent post!  I'll tell you why it falls short for me as a developer.

    A lot of us work with legacy code, or code that we inherited.

    As a developer, I can't take the risk that this type of planning and preparation was done by the previous developer/DBA on the project.  So over time, the habit of validating data from the DB in code to test these conditions has been ingrained in me.

    However, I'm going to start looking into adding these features to the procs I write in the future, or suggesting them to my DBA.


  • Sometimes developers raise the objection that, since C++ runs a gazillion times faster than T-SQL, that's where the data manipulation ought to take place.  The viewpoint is that T-SQL is fine for schlepping data, period.

    Depending on the app, I can see their point.

    I work in what is pretty much a report shop, however, so we generally don't run into situations where shaving off a gazillisecond is going to affect performance greatly.

    We used to be a smaller shop that enforced lots of business rules and formatted data in the stored procs that we wrote. After a few years we got very good at T-SQL and learned the ins and outs, and eventually came up with standard ways to deal with most situations. The UI code was frequently done by interns or fairly inexperienced developers (or us), so in almost all cases we translated the values to what the application or report needed.

    Then, we grew.

    Now there is a lot of pressure to treat the database as a dumb storage box, and just submit select, insert, update, delete against it. What a pity. I can't help feeling a bit smug when the apps have performance problems and we are [unofficially] asked to solve them in the stored procs (i.e., by moving some processing back into the data layer).

    Why does "new" seem to equal "better" for some people? I say live and let live... if it ain't broke, then spend time on something better (like gathering complete and accurate requirements).

    I had to give my $.02.
