RE: Why Object Databases will always be Tomorrow's Technology

  • I am running the risk of stating nonsense here, as I honestly have no real experience with object database implementations today.

    To my possibly outdated knowledge, the technical mechanism supporting object databases is virtualization of local memory. Instead of having an object in local memory, you now indirectly work with an object that is persisted elsewhere. Thus a linked list or any other structure in local memory is persisted as-is in the database...it's persistent memory. This is what I got from it in the early 90s, so correct me if I am horribly off. The idea to me is just completely illogical from a quality and manageability point of view.
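    To make that mental model concrete, here is a minimal sketch in Python of "transparent persistence" as I understand it: every mutation of a tagged object graph is written to disk as-is, linked-list shape and all. The PersistentList class and the file name are my own invention for illustration, not any real OODBMS API.

    ```python
    import pickle

    class Node:
        """One cell of an in-memory linked list."""
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    class PersistentList:
        """Toy 'persistent memory': the in-memory structure IS the stored
        structure; every mutation re-pickles the whole object graph."""
        def __init__(self, path):
            self.path = path
            self.head = None

        def push(self, value):
            self.head = Node(value, self.head)
            self._persist()  # storage layout mirrors the linked list exactly

        def _persist(self):
            with open(self.path, "wb") as f:
                pickle.dump(self.head, f)

    lst = PersistentList("objects.db")
    for v in (1, 2, 3):
        lst.push(v)
    # The file now contains the linked list verbatim: values AND pointers.
    ```

    The point is that the on-disk format is literally whatever pointer structure my algorithm happened to use.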

    For me, any persistent storage has to:

    * Be decoupled from the processes acting on it;

    * Have a state you can snapshot easily and independently verify;

    * Be accessible by many processes concurrently, handling operations in a way that will not result in a corrupt state;

    * Be based on a low-complexity model, thus supporting only a limited number of relation types between entities and data types;

    * Not require domain-specific software for accessing the data, and be maintainable by standardized software (see the sketch after this list).
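
    That last point deserves an illustration. A rough sketch (my own toy code, not any vendor's API): data pickled as an object graph is unreadable without the exact domain classes that wrote it, while the same data in a plain relational store can be inspected and verified with any standard tool.

    ```python
    import pickle
    import sqlite3

    class Order:  # the domain class you must ship to make sense of the blob
        def __init__(self, id, total):
            self.id, self.total = id, total

    # Object-style persistence: the stored bytes are opaque without Order.
    blob = pickle.dumps(Order(1, 99.50))
    # pickle.loads(blob) in another program fails unless it imports Order too.

    # Relational persistence: any generic SQL tool can read and verify this.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (id INTEGER, total REAL)")
    con.execute("INSERT INTO orders VALUES (1, 99.50)")
    print(con.execute("SELECT id, total FROM orders").fetchall())  # [(1, 99.5)]
    ```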

    Any deviation from these things will result in errors accumulating and persisting, among other problems. Say that I, as a programmer, have everything I do in memory persisted instantly for objects I tagged as needing this. Then the data and its relationships are tightly coupled to the algorithm that operates on them: if I use a linked list in memory, the data is structured that way on disk. Now suppose I make a mistake somewhere, or I need to do an optimization...that will likely result in a change to both. But wait, what if there are multiple and different clients acting on the data....oh shit....you are so horribly screwed.
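    Here is that coupling problem in miniature (again a made-up sketch, not a real product): client A persists its linked list verbatim and client B reads it; the moment A "optimizes" its internal structure to a flat array, B is broken, because the storage format was never decoupled from A's in-memory representation.

    ```python
    import json

    # Client A, version 1: persists its in-memory linked list verbatim.
    def save_v1(values, path):
        head = None
        for v in reversed(values):
            head = {"value": v, "next": head}  # nested-dict linked list
        with open(path, "w") as f:
            json.dump(head, f)

    # Client B: written against the linked-list shape of the stored data.
    def read_all(path):
        with open(path) as f:
            node = json.load(f)
        out = []
        while node is not None:
            out.append(node["value"])
            node = node["next"]
        return out

    save_v1([1, 2, 3], "shared.json")
    print(read_all("shared.json"))  # [1, 2, 3]

    # Client A, version 2: "optimizes" to a flat array for its own algorithm.
    with open("shared.json", "w") as f:
        json.dump([1, 2, 3], f)
    # read_all("shared.json") now crashes: client B inherited A's old layout.
    ```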

    On top of that, locality of reference will be totally out of whack. The algorithms programmers use are in general NOT optimized to be efficient on remote virtual memory. They often aren't optimized for locality at all, incurring penalties for doing accesses outside the processor's level 1 cache. How this would translate to memory sitting across a network, even further away from the CPU.....not good.
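    A back-of-envelope calculation shows why (my latency numbers are rough order-of-magnitude figures, not measurements): chasing a million pointers is harmless in cache, tolerable in main memory, and hopeless when every dereference is a network round trip.

    ```python
    # Rough, order-of-magnitude cost per dependent access (seconds).
    L1_CACHE   = 1e-9     # ~1 ns
    MAIN_MEM   = 100e-9   # ~100 ns
    NETWORK_RT = 0.5e-3   # ~0.5 ms round trip to a database server

    NODES = 1_000_000     # traverse a million-node linked list

    for name, cost in [("L1 cache", L1_CACHE),
                       ("main memory", MAIN_MEM),
                       ("network round trips", NETWORK_RT)]:
        print(f"{name:20s}: {NODES * cost:10.3f} s")

    # L1 cache            :      0.001 s
    # main memory         :      0.100 s
    # network round trips :    500.000 s  -- over eight minutes
    ```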

    You end up in a situation where you have to write mappers and wrappers around past mistakes, in effect persisting the problem of an inadequate design or implementation. And because everyone programs to his own view of the data, you can say goodbye to interoperability of software too. Code can't become any cleaner for any significantly complex project...it's all smoke and mirrors.

    And that a physics lab used an object database says nothing to me, and that it is the biggest database in the world doesn't either. It says nothing about scaling. Scaling is not just a function of storage size; even a flat file with a fixed record length is scalable in that sense. Scaling applies to operations in the face of accumulating data, things like concurrent access and the efficiency of retrieval and mutation (the algorithms). Take an object database with algorithms that aren't going to change: say they hash everything, and this hash is persisted as the access structure in their database. Let's assume this suits the work they do perfectly; then yes, it will be amazingly fast at retrieval, but that is a very specialized case.
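    To illustrate why that is specialized rather than general scaling (a deliberately simplified toy, not their actual system): a hash access structure gives constant-time point lookups, but the moment the workload needs an operation the structure wasn't built for, such as a range query, it degrades to scanning everything, whereas an ordered index handles both.

    ```python
    import bisect

    events = {i * 7 % 1000: f"event-{i}" for i in range(1000)}  # hash layout

    # Point lookup: exactly what the hash was built for -- effectively O(1).
    print(events[42])

    # Range query: a hash has no order, so it degrades to a full scan, O(n).
    in_range = [v for k, v in events.items() if 100 <= k <= 120]

    # An ordered structure (what a B-tree index gives you) handles both.
    keys = sorted(events)
    lo, hi = bisect.bisect_left(keys, 100), bisect.bisect_right(keys, 120)
    in_range_fast = [events[k] for k in keys[lo:hi]]  # O(log n + matches)
    ```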

    I believe object databases by their very definition are domain-specific beasts and as a result will never find broad acceptance. I saw the OO hype in the early 90s and always felt people thought more of it than it actually proved to be. Now a new generation is at it, doing exactly the same things but with improved technology at its base. I still haven't seen the fundamentals change and thus expect a similar outcome.

    Instead of learning how to do things right and understanding the data they operate on, people take any solution that "promises" to take away that pain. OOP is always perceived as carrying this "promise", but I know the hard-core OOP folks are pretty smart and put a lot more emphasis on design than the vast majority of developers do. That majority just wants the promise of not having to do any design beforehand in the first place and to start implementing quickly.

    I would classify object databases as having the promise of really rapid prototyping. But I don't think they are a good idea to adopt for actual software that is going to be used.

    Now, prove me totally wrong! 🙂