Using Interface-Based Programming Techniques in SQL Server

  • Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/ccubley/sql_interface_final.asp


    Chris Cubley
    www.queryplan.com

  • Interfaces are an interesting topic, to be sure. The advantage that "bigger" languages have is that they can enforce the interface - in VB, using the Implements statement. That keeps IntelliSense working and helps the compiler do the checking. T-SQL doesn't have Implements, so you have to enforce it by convention.

    From a client coding perspective it makes sense to do this, especially if you're manually populating the parameters collection in ADO to avoid the extra round trip. You can change the proc name and everything else still works.

    I'd be interested to hear your response about why it wouldn't be just as effective to put all the rules in one proc and just branch to the appropriate code based on a passed parameter.
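
    Just to illustrate what I mean above by enforcing the interface by convention - something along these lines, where two rule procs deliberately agree on the same parameter list and return-code meaning (the proc and parameter names are made up for the example):

    -- Shared "interface" by convention: (@OrderID int); RETURN 0 = pass, nonzero = fail.
    CREATE PROCEDURE dbo.Rule_CheckCreditLimit
        @OrderID int
    AS
    BEGIN
        -- rule-specific logic would go here
        RETURN 0;
    END
    GO

    CREATE PROCEDURE dbo.Rule_CheckShippingAddress
        @OrderID int
    AS
    BEGIN
        -- different logic, identical signature
        RETURN 0;
    END
    GO

    Nothing stops someone from quietly adding a parameter to one of them, though - that's the part the compiler can't check for you.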

    Andy

    http://www.sqlservercentral.com/columnists/awarren/

  • Good article, Chris, and this is a good way to design the app. Fewer releases and greater flexibility.

    Andy - my thought in favor of separate procs would be less regression testing. Everyone that visits this site does that, right???

    David

    @SQLTentmaker

    “He is no fool who gives what he cannot keep to gain that which he cannot lose” - Jim Elliot

  • I wish!

    You're just moving the testing from one place to another, aren't you? It's the same code; the only difference is how you get there. Or you could even just pass a param to the "main" proc and have it do this:

    IF @param = 1
        EXEC proca;
    ELSE IF @param = 2
        EXEC procb;

    Andy

    http://www.sqlservercentral.com/columnists/awarren/

  • There are several reasons why it is advantageous to separate the rules and put them in different procedures. For the most part, the reasons are the same as why it's good to use separate functions/subroutines in other programming languages. First, you reduce the amount of code in a single stored procedure, thus making maintenance of the rules much simpler. This also enables multiple developers to work on the rules at one time. If a system's primary function involves processing 50 or 100 non-trivial business rules against some set of data, the development of the rules can be a very time-consuming task. If that task can be split up among six developers, the system can be finished much more quickly. I realize that since the rules affect the state of the system, the actions of one rule can potentially affect the actions of another. However, with proper planning, requirements specs, and design documentation for each of the rules, this work can still be effectively partitioned among several developers.

    Another benefit of breaking each of the rules out into its own procedure is that it separates the control of flow logic from the actual processing logic. Thus, when developing the rules, you don't have to worry about when the rule should be called, only what it is supposed to do and whether or not it is making changes to the state of the system that would adversely affect other rules. Even the simplest control of flow mechanism should have at least two routes: a normal route and an error route. If all is well, continue processing; otherwise, call the error handling routine. Coding the control of flow logic is easier if you don't have to worry about actual rule processing in the middle of your control of flow code. This also simplifies testing of your control of flow system. As long as the control of flow system communicates with the rules themselves only by well-defined interfaces, the control of flow system should work correctly no matter what changes are made to the rules.
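
    As a rough sketch of what I mean (the procedure names here are hypothetical), the control of flow procedure knows nothing about what each rule does internally; it only knows the calling convention and checks the return code to choose between the normal route and the error route:

    CREATE PROCEDURE dbo.ProcessOrderRules
        @OrderID int
    AS
    BEGIN
        DECLARE @rc int;

        -- Normal route: run each rule through the same calling convention.
        EXEC @rc = dbo.Rule_CheckCreditLimit @OrderID;
        IF @rc <> 0 GOTO ErrorRoute;

        EXEC @rc = dbo.Rule_CheckShippingAddress @OrderID;
        IF @rc <> 0 GOTO ErrorRoute;

        RETURN 0;

    ErrorRoute:
        -- Error route: hand off to a common error handling routine.
        EXEC dbo.HandleRuleFailure @OrderID, @rc;
        RETURN 1;
    END
    GO

    Because the rules are called only through their parameter lists and return codes, any of them can be rewritten without touching this procedure.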

    The separation of the control of flow logic also enables more complex systems to be implemented more easily. For instance, a state machine could be used to control the order of execution of the rules. Since a state diagram is essentially a directed graph with a few special features, it can be represented in a database quite easily. (SQL For Smarties by Joe Celko has a good chapter on representations of graphs in relational databases.)
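
    As a minimal sketch (table and column names are purely illustrative), each row of such a table is one edge in the state diagram: given the current state and the result of the rule just executed, it tells the control of flow procedure which state comes next and which rule procedure to run there:

    CREATE TABLE dbo.RuleStateTransition
    (
        CurrentState varchar(30) NOT NULL,
        RuleResult   int         NOT NULL,  -- return code of the rule just run
        NextState    varchar(30) NOT NULL,
        NextProcName sysname     NULL,      -- NULL means a terminal state
        CONSTRAINT PK_RuleStateTransition
            PRIMARY KEY (CurrentState, RuleResult)
    );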

    I've seen quite a few newsgroup posts asking how to dynamically change databases so that different tables can be updated depending on some parameter value. This can be accomplished quite easily using interface-based programming techniques. In this case, you could have the same stored procedure in different databases. You then create a lookup table that maps whatever parameter determines which data gets updated to the fully-qualified name of the stored procedure used to update the data. The same technique works across linked servers.
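
    A bare-bones version might look like this (the table, procedure, and database names are invented for the example); note that EXEC will accept a variable holding a fully-qualified procedure name, so no dynamic SQL string needs to be built:

    CREATE TABLE dbo.RegionUpdateProc
    (
        RegionCode varchar(10)   NOT NULL PRIMARY KEY,
        ProcName   nvarchar(400) NOT NULL  -- e.g. N'SalesEast.dbo.UpdateCustomer'
    );
    GO

    CREATE PROCEDURE dbo.UpdateCustomerForRegion
        @RegionCode varchar(10),
        @CustomerID int
    AS
    BEGIN
        DECLARE @proc nvarchar(400);

        SELECT @proc = ProcName
        FROM dbo.RegionUpdateProc
        WHERE RegionCode = @RegionCode;

        -- EXEC treats the variable's value as the name of the procedure to run.
        EXEC @proc @CustomerID;
    END
    GO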

    In short, I think there are quite a few uses for this technique. Is it the end-all be-all of application design? Of course not. However, when needed, the techniques presented can provide an easy-to-implement design that is easily maintainable down the road.

    Now a question for everyone else... Based upon this article and this post, do you think a future article that focuses on specific applications of this technique (such as those I mentioned above) would be useful?

    Chris Cubley
    www.queryplan.com

  • Excellent article.

    I tend to agree with Andy, however, because I can also use a couple of tables to specify not only which business rules are needed, but also their ordering for different processes.

    For example, FICA calculation might be called first for a "normal payroll calc", but second for a "flex spending account calc". Not much different from yours, but if I keep it all in the database, there is less network and connection overhead. Also, multiple interfaces (VB, ASP, C++, etc.) can call the same stored procedure, and the logic is maintained without changing multiple clients. Since I tend to favor keeping changes in a desktop app to a minimum, this lets me make an "emergency" change on the backend without redeploying another front end. That's less important for web apps, but I've worked on apps that went out to hundreds of geographically distributed sites. Redeploying something because someone decided later that there needed to be two different processes stinks.
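
    Roughly what I have in mind (the table, column, and proc names are invented for the example) is a mapping table that spells out which rule procs run for which process and in what order, plus a driver proc that simply walks the list:

    CREATE TABLE dbo.ProcessRule
    (
        ProcessName  varchar(50) NOT NULL,  -- e.g. 'normal payroll calc'
        RunOrder     int         NOT NULL,
        RuleProcName sysname     NOT NULL,  -- e.g. 'dbo.CalcFICA'
        CONSTRAINT PK_ProcessRule PRIMARY KEY (ProcessName, RunOrder)
    );
    GO

    CREATE PROCEDURE dbo.RunProcess
        @ProcessName varchar(50),
        @EmployeeID  int
    AS
    BEGIN
        DECLARE @proc sysname;

        DECLARE rule_cursor CURSOR LOCAL FAST_FORWARD FOR
            SELECT RuleProcName
            FROM dbo.ProcessRule
            WHERE ProcessName = @ProcessName
            ORDER BY RunOrder;

        OPEN rule_cursor;
        FETCH NEXT FROM rule_cursor INTO @proc;

        WHILE @@FETCH_STATUS = 0
        BEGIN
            -- Every rule proc shares the same parameter list by convention.
            EXEC @proc @EmployeeID;
            FETCH NEXT FROM rule_cursor INTO @proc;
        END

        CLOSE rule_cursor;
        DEALLOCATE rule_cursor;
    END
    GO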

    Not sure I'm being completely clear. I think your method might handle it, and it may be more a matter of opinion on the actual architecture. However, again, a great article and something that people don't often think about.

    Steve Jones

    sjones@sqlservercentral.com

    http://www.sqlservercentral.com/columnists/sjones
