One size does not fill all

  • Comments posted to this topic are about the item One size does not fill all

  • I normally agree with (and am interested in) most of what is written in these blogs but in this instance this is not the case, to the extent that I felt I should respond.

    As an architect, as well as a developer, I see significant value in having our applications interoperate with a number of different DBMS types; indeed, it is sometimes a design imperative. In my experience, keeping the interface to them all as generic as possible greatly simplifies the ability to achieve this.

    Also, not distributing the application logic into the DBMS has two main advantages: 1) it is easier to deploy the applications, and 2) it is easier to maintain them. For point 1: we sometimes have to operate our applications in environments where our database gets slotted into an existing enterprise DBMS. Also, we sometimes deploy to locations where IT cannot or will not bend from their "rules"; often these are of the individual's making rather than the corporation's. Simplifying the database side of things can greatly aid overall deployment, both technically and politically. For point 2: I have seen situations where the extent of stored procedure use, and their side-effects, made it much harder to debug problems.

    Most of our tables never contain anything like hundreds of thousands of records, so performance is not as critical as it is for large tables. In any case, I have developed and worked with abstraction (shim) layers that isolate the application logic from the database access and provide a good level of DBMS-specific optimisation; a rough sketch of the idea follows at the end of this post.

    Not only that, but we always test our applications against the largest data installations before we release them (what a surprise). There is always the option to test specific queries and optimise them if required; so, in the end, we always achieve more than satisfactory performance. I have seen stored procedures with poor database performance...

    I do not advocate the program-oriented approach in all cases, especially if data loads are substantial or the application is being developed for a single DBMS type. I do advocate seeing application development from the development/architecture side as well as the operations/DBA side. If your experiences are only with poorly tested and poorly performing applications, then you have a skewed view of the world.
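
    As a rough illustration of the abstraction (shim) layer idea mentioned above: the application only ever sees a small interface, and each implementation is free to use DBMS-specific SQL or tuning. The names and types here are purely hypothetical.

        using System;
        using System.Collections.Generic;
        using System.Data.SqlClient;

        public class Order
        {
            public int OrderId { get; set; }
            public DateTime OrderDate { get; set; }
        }

        // The application codes against this interface only.
        public interface IOrderStore
        {
            IReadOnlyList<Order> GetOrdersForCustomer(int customerId);
        }

        // One DBMS-specific implementation; an Oracle or PostgreSQL version
        // would implement the same interface with its own SQL.
        public sealed class SqlServerOrderStore : IOrderStore
        {
            private readonly string _connectionString;

            public SqlServerOrderStore(string connectionString)
            {
                _connectionString = connectionString;
            }

            public IReadOnlyList<Order> GetOrdersForCustomer(int customerId)
            {
                var orders = new List<Order>();
                using (var conn = new SqlConnection(_connectionString))
                using (var cmd = new SqlCommand(
                    "SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @customerId",
                    conn))
                {
                    cmd.Parameters.AddWithValue("@customerId", customerId);
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            orders.Add(new Order
                            {
                                OrderId = reader.GetInt32(0),
                                OrderDate = reader.GetDateTime(1)
                            });
                        }
                    }
                }
                return orders;
            }
        }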

  • This doesn't surprise me at all. It's the same with so many aspects of IT today! Developer/user productivity improves, but usually at the expense of efficiency.

    In days long past when I started on mainframes and developed in assembler we used to calculate processor execution times and peripheral I/O times using machine code execution times and tape/disc head movement and transfer times etc.

    These days many, if not most, people are totally unaware of what is going on behind the scenes, not just in databases but in hardware as well.

    Ease of use is nearly always at the expense of software/hardware efficiency!

  • I couldn't agree more. I am currently working on a .NET/SQL Server app which uses Entity Framework/Linq (not my idea) for 98% of the data access. The SQL generated from the Linq queries is basically a massive bunch of UNION statements that returns a kind of pivoted data dump of every entity and every field. The query itself runs quickly enough, but shifting that massive result set across a slow network can be painfully slow.

    Give me SQL written by a competent developer any day. In the world of migrations using VS2012 and TFS (convoluted in a large development project), stored procedures can be easily amended in an emergency hotfix scenario. Speed of delivering a fix to customers matters as much as strict configuration management.

    The whole justification for EF seems to be:

    1) EF always writes better SQL than developers... that may be the case for bad SQL developers, but not good ones.

    2) EF makes it easy to change RDBMS. Maybe, assuming the app has good separation of concerns, but how often does anyone actually decide to change from SQL Server to Oracle, etc.?

    3) EF makes it easy to map from database entities to objects, especially if you use AutoMapper or similar. I'd agree with that.

    I am a big fan of manipulating and querying data on the database server unless there is a good reason not to.
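
    That said, the two approaches are not mutually exclusive. As a rough sketch (EF 6-style; the procedure, DTO and context names are made up), EF can materialise the results of a hand-written stored procedure, so the set-based work still happens on the server:

        using System.Collections.Generic;
        using System.Data.Entity;      // Entity Framework 6.x
        using System.Data.SqlClient;
        using System.Linq;

        public class OrderSummaryDto
        {
            public int OrderId { get; set; }
            public decimal Total { get; set; }
        }

        public class ShopContext : DbContext
        {
            // Calls a hand-written stored procedure and lets EF map the rows
            // onto the DTO by column name.
            public List<OrderSummaryDto> GetOrderSummaries(int customerId)
            {
                return Database
                    .SqlQuery<OrderSummaryDto>(
                        "EXEC dbo.GetOrderSummaries @CustomerId",
                        new SqlParameter("@CustomerId", customerId))
                    .ToList();
            }
        }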

  • I could not agree more. If anybody uses EF on a reasonably large multi-table/relationship database, such as a warehouse system, then all I can say is that they are a better developer than me. When it comes to complex data structures and transaction integrity, I feel SQL procedures are a far better bet; I am sure half the EF hype is, as usual, because it looks good on the CV.

  • Several years ago, around the time Microsoft was introducing Linq to SQL, an architect of my acquaintance was so enthralled with the technology that he immediately used Linq to SQL on an enterprise production application. Everything worked fine until the first patch came out for the framework. His application blew up the day after the patch was installed. It seems the patch converted an inner join to an outer join, resulting in several records being returned instead of one.

    My own experience goes back to DAO, RDO and ADO prior to ADO.NET. I learned way back in the '90s that you cannot trust Microsoft technologies that cross platforms. If I have any influence on a project, I will not use the controls to update the database for any reason. I want lean, not fat, code in my applications and I want to be able to trace what the code is doing.

    I was recently an architect on a project for a bank and the developers wanted to use the Entity Framework in the data access layer. They wanted the latest technology; I wanted a lightweight data access layer. As architect of the project, I won the argument. The easiest way to deal with this issue is to require a performance test of the new technology versus the use of stored procedures. In my experience, stored procedures always win.

  • I would say that "it depends". Years ago I worked in product development environments, where the software had to be installed into the customer's existing environment. So yes, the application handled the data access. But we learned after version 1 not to make a generic data access layer. Instead we made database-specific access layers that all implemented the same interface (Factory design pattern, for those keeping score; a rough sketch follows at the end of this post). Each database-specific layer was optimized as best we could for the underlying database, depending on the expertise available at the time.

    But in corporate environments where I've been the architect for internal development, I push data access into the database. I'm not worried here about having to access different databases. Companies don't change databases frequently, if at all. We are more likely to change versions. And if we have our test environment set up correctly (ha), we don't have to worry about rogue SQL in application code; all the stored procedures can be tested on new DBMS versions.
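
    To make the factory idea concrete, a stripped-down sketch might look like the following (type names invented; the real implementations would each carry their own optimized SQL):

        using System;

        // Common interface for all database-specific access layers.
        public interface ICustomerAccess
        {
            int CountCustomers();
        }

        // Each supported DBMS gets its own implementation, tuned for that
        // engine; only skeletons are shown here.
        public sealed class SqlServerCustomerAccess : ICustomerAccess
        {
            private readonly string _connectionString;
            public SqlServerCustomerAccess(string connectionString)
            {
                _connectionString = connectionString;
            }
            public int CountCustomers()
            {
                // SQL Server-specific implementation would go here.
                throw new NotImplementedException();
            }
        }

        public sealed class OracleCustomerAccess : ICustomerAccess
        {
            private readonly string _connectionString;
            public OracleCustomerAccess(string connectionString)
            {
                _connectionString = connectionString;
            }
            public int CountCustomers()
            {
                // Oracle-specific implementation would go here.
                throw new NotImplementedException();
            }
        }

        // The factory is the only place that knows which implementation to hand out.
        public static class CustomerAccessFactory
        {
            public static ICustomerAccess Create(string provider, string connectionString)
            {
                switch (provider)
                {
                    case "SqlServer": return new SqlServerCustomerAccess(connectionString);
                    case "Oracle":    return new OracleCustomerAccess(connectionString);
                    default: throw new NotSupportedException("Unknown provider: " + provider);
                }
            }
        }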

  • My belief is that software providers don't like stored procedures because they enable DBAs and developers to see what the system is doing. All our in-house developments use stored procedures - in addition to the performance benefits, they can make testing easier and they can be reused.

  • Why am I categorised as a "grasshopper"?

  • I understand that stored procedures can be encrypted, so they can still be used by suppliers/developers without revealing their code.

  • Hard-coded SQL in our apps has been my pet peeve for years. I cannot count the number of times I have been asked to put a clustered index on a table because they forgot the ORDER BY and will not do a build.

    Now we hard code Linq statements.

    I think for the most part EF can do a good job with the simple stuff as many have pointed out. But sometimes the simple things are easy to code and still not a good idea -- such as a simple mass delete. I believe EF still cannot do this without looping on the client.

    In the end the good developers will use a combination of client side DML and sprocs as needed.

  • Entity Framework can do a "mass delete" by executing an arbitrary SQL statement. It's arguable whether that's actually EF, since you're essentially "stepping out" of the framework to do it, but it can be done, and it's done through EF.
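
    For example (EF 6-style; the table, entity and context names are invented), the two approaches look roughly like this:

        using System;
        using System.Data.Entity;   // Entity Framework 6.x
        using System.Linq;

        public class Order
        {
            public int OrderId { get; set; }
            public DateTime OrderDate { get; set; }
        }

        public class OrdersContext : DbContext
        {
            public DbSet<Order> Orders { get; set; }
        }

        public static class OrderCleanup
        {
            // Staying inside EF: every doomed row is fetched, tracked and then
            // deleted individually when SaveChanges runs -- the client-side
            // looping mentioned above.
            public static void DeleteOldOrdersWithEf(OrdersContext db, DateTime cutoff)
            {
                foreach (var order in db.Orders.Where(o => o.OrderDate < cutoff).ToList())
                {
                    db.Orders.Remove(order);
                }
                db.SaveChanges();
            }

            // "Stepping out" of the framework: one set-based DELETE executed on
            // the server through EF's Database facade.
            public static void DeleteOldOrdersWithSql(OrdersContext db, DateTime cutoff)
            {
                db.Database.ExecuteSqlCommand(
                    "DELETE FROM dbo.Orders WHERE OrderDate < @p0", cutoff);
            }
        }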

  • I could not agree more with the article. Not to say that ORM frameworks are all bad, but honestly I do prefer stored procedures for handling database operations, especially in larger-scale LOB applications. On smaller apps, with no complex database operations needed and a reasonably small number of records in the database, fine... however, if it were up to me, I would never, ever use tools like Entity Framework. But when you inherit a project that already uses EF and your managers and senior devs are convinced of it... grrrrr (let's not go there).

    Just my two cents worth.

  • I agree with this observation. I have seen some pretty terrible performance-killing dynamic SQL statements generated by applications that access databases. One aspect that I feel is often ignored is that management must buy in to the value of data architecture in an organization, and encourage developers to coordinate their efforts with database architects and administrators to ensure best performance. No one should be working in a vacuum.

  • It is heartening to see a groundswell of opinion against this nonsense and know I'm not alone.

    One problem I found with EF is with many-to-many relationships. Let's take a hypothetical example such as Product and Color: a product can come in many colors and a color can be used in many products. When you define the relationship, EF will automatically create a "link table" for you, called e.g. ProductColor, with ProductId and ColorId.

    However, when you want to change, for example, the colors a product can have, in your DAL you need to ask EF to delete all the existing colors and then add the new ones again one by one. You can't just give it the new collection of Color objects and let it get on with it, which is what you would expect given all the bold claims made about it (a rough sketch of the workaround follows below).

    The first time I did this I was scratching my head for hours. It would have taken me about 15 minutes to write a stored proc.
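
    For what it's worth, the workaround ended up looking roughly like this (EF 6-style; the entity and context names are just for illustration):

        using System.Collections.Generic;
        using System.Data.Entity;   // Entity Framework 6.x
        using System.Linq;

        public class Product
        {
            public Product() { Colors = new List<Color>(); }
            public int ProductId { get; set; }
            public virtual ICollection<Color> Colors { get; set; }
        }

        public class Color
        {
            public Color() { Products = new List<Product>(); }
            public int ColorId { get; set; }
            public string Name { get; set; }
            public virtual ICollection<Product> Products { get; set; }
        }

        public class CatalogContext : DbContext
        {
            public DbSet<Product> Products { get; set; }
            public DbSet<Color> Colors { get; set; }
        }

        public static class ProductColorUpdater
        {
            // Replaces a product's colors: the existing links are cleared and
            // the new ones attached one by one, and SaveChanges then rewrites
            // the EF-generated link table.
            public static void SetColors(CatalogContext db, int productId, IList<int> colorIds)
            {
                var product = db.Products
                    .Include("Colors")
                    .Single(p => p.ProductId == productId);

                product.Colors.Clear();
                foreach (var color in db.Colors.Where(c => colorIds.Contains(c.ColorId)).ToList())
                {
                    product.Colors.Add(color);
                }

                db.SaveChanges();
            }
        }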
