October 26, 2010 at 6:08 am
Interesting article, Steve, and certainly of great financial and personal significance to millions. But I'm not sure this is a technology or database issue -- at least not strictly.
The plaintiffs in the suit aren't challenging the coding in the db:
But critics, like North Carolina bankruptcy lawyer O. Max Gardner, say the MERS database isn't always up to date, leading to uncertainty about the lien holder's identity. "Sometimes MERS members enter the information, and sometimes they don't."
Maybe this is your (unintended?) point: the data in our DB is only as good as the people entering and updating the info. I don't get the "DBA vs. developer" arguments that entertain so many serious IT professionals. If you want accurate and up-to-date data, then the entire system -- DB, application front end, hardware, network, training, QA/QC, security, support, all of it -- needs to work.
Thanks for a great SQL community, Steve,
Rich
October 26, 2010 at 6:36 am
Interesting take on the issue. I'm trying to get staff, including IT, at my current employer to understand that there's nothing wrong with having multiple copies of data and then using a data warehouse to combine the data for reporting. I hadn't thought of the added benefit of verifying data in a data load.
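Something like the check below is what I have in mind for verifying a load, run after the warehouse tables are populated. The table and column names are just placeholders for the example:

-- Compare row counts and loan-amount totals between the staging load
-- and the warehouse fact table for one batch; raise an error if they disagree.
-- All object names here are hypothetical.
DECLARE @BatchID     INT = 20101026;
DECLARE @StageRows   INT, @FactRows INT;
DECLARE @StageAmount DECIMAL(19, 2), @FactAmount DECIMAL(19, 2);

SELECT @StageRows = COUNT(*), @StageAmount = ISNULL(SUM(LoanAmount), 0)
FROM   dbo.StagingMortgage
WHERE  BatchID = @BatchID;

SELECT @FactRows = COUNT(*), @FactAmount = ISNULL(SUM(LoanAmount), 0)
FROM   dbo.FactMortgage
WHERE  BatchID = @BatchID;

IF @StageRows <> @FactRows OR @StageAmount <> @FactAmount
    RAISERROR('Load verification failed for batch %d.', 16, 1, @BatchID);

If the counts or the totals drift apart, you know the load needs a look before anyone reports off it.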
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
October 26, 2010 at 6:51 am
Maybe this is your (unintended?) point: the data in our DB is only as good as the people entering and updating the info.
From dealing with clients and coworkers, I've found that most data has some errors in content or formatting. Now imagine a system run by people mainly interested in making a fast buck, and you have something that reflects the current state of the economy:
But as the volume of refinanced mortgages grew in the late 1990s, the mortgage industry sought to reduce its fee expenses and speed up the process of re-assigning mortgage liens as mortgages were being rapidly bought and sold.
Dealing with data dumps from mortgage companies is a royal pain, one of the least desirable things I've done in my career. 🙁
October 26, 2010 at 6:56 am
Right now there's a legal challenge to MERS's right to sign foreclosure documents, but mistakes have been made, and someone is challenging the accuracy of the data as well.
I guess there are a few issues here, which I haven't expressed very well. One is that data accuracy from data entry is important. Another is that any feeds/loads need to be accurate and the code in transforms must work correctly. The last is the legal implications if things aren't accurate.
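On the feeds/loads point, even a simple post-transform check helps. Something along these lines, with made-up object names, would flag rows where the transform lost the lien holder or produced an impossible date:

-- After a transform runs, look for rows that lost their lien holder
-- or came through with a recording date in the future.
-- All object names here are hypothetical.
SELECT  m.LoanID, m.LienHolderID, m.RecordingDate
FROM    dbo.MortgageAssignment AS m
WHERE   m.LienHolderID IS NULL
    OR  m.RecordingDate > GETDATE();

A non-empty result set means the feed or the transform needs review before the data is relied on downstream.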
October 26, 2010 at 12:55 pm
With MERS, there's also the issue that many governmental entities responsible for recording this sort of data require that every transaction be properly recorded to be legal. That means as far as the local authorities are concerned, the only entity that can actually foreclose is the entity reflected on the deed records. To straighten all that out will require that every sale transaction be recorded, with fees paid. Which brings me to the issue of having appropriate audit trails on central data that affects multiple independent entities. There will be times where a transaction must be reconstructed, and that can't be done if you don't have the history.
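If I were building that central store, the history could be as simple as an audit table fed by a trigger. The names below are hypothetical, just to sketch the idea:

-- Hypothetical audit table: one row per change to an assignment record.
CREATE TABLE dbo.MortgageAssignmentHistory
(
    AuditID      INT IDENTITY(1, 1) PRIMARY KEY,
    LoanID       INT          NOT NULL,
    LienHolderID INT          NULL,
    ChangeType   CHAR(1)      NOT NULL,  -- I, U, or D
    ChangedBy    SYSNAME      NOT NULL DEFAULT SUSER_SNAME(),
    ChangedAt    DATETIME2(0) NOT NULL DEFAULT SYSDATETIME()
);
GO

CREATE TRIGGER dbo.trg_MortgageAssignment_Audit
ON dbo.MortgageAssignment
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- New and changed rows come through the inserted pseudo-table.
    INSERT dbo.MortgageAssignmentHistory (LoanID, LienHolderID, ChangeType)
    SELECT i.LoanID, i.LienHolderID,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
    FROM   inserted AS i;

    -- Deleted rows come through the deleted pseudo-table.
    INSERT dbo.MortgageAssignmentHistory (LoanID, LienHolderID, ChangeType)
    SELECT d.LoanID, d.LienHolderID, 'D'
    FROM   deleted AS d
    WHERE  NOT EXISTS (SELECT 1 FROM inserted);
END;

With that in place, reconstructing who held the lien at any point in time is just a query against the history ordered by ChangedAt.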
October 27, 2010 at 3:10 pm
The highest quality data in any organization is Accounts Receivable: Who owes me money?
After that, the data quality goes downhill really fast.