I've just delved into an interesting-yet-troubling book, "Delete: The Virtue of Forgetting in the Digital Age" by Viktor Mayer-Schönberger, which highlights some of the more unsettling repercussions of living in a world where cheap storage and ubiquitous digital technology have eroded privacy and anonymity.
Thanks to the evolution of said technology and data storage techniques, the amount and accuracy of information which can be recorded about an individual has increased exponentially in recent history, and this information is now collected by default. More to the point, this information tends to persist long after the person in question has forgotten it ever existed. This kind of data has been compared to nuclear waste for its tenacious, dangerous nature, and its tendency to leak. Such 'weapons-grade' information can lead to people's pasts being dragged into the light, out of context, potentially costing them their jobs. Yet it can equally lead to people being less likely to commit crimes when they know they are being 'informationally' watched, and that their record will follow them, immutable, for years afterwards. To pick just one troubling example (of which Viktor has many):
Andrew Feldmar, a Canadian psychotherapist in his late 60s, tried to cross the Canada/US border in 2006 to pick up a friend from Seattle's international airport. After having his name entered into a search engine by a border guard, he was held for 4 hours, fingerprinted, and barred from further entry into the United States because he'd mentioned in a 2001 academic article that he had once taken LSD - in the 1960s. A consummate professional with no criminal record, the Andrew barred from the US in 2006 is a very different man from the Andrew who took LSD, and whilst in his youth he broke the law, there is now no scope for society to forget time-and-context-sensitive information which is irrelevant to the people we become in later life.
I'd agree that the permanent and unconditional retention of all personal information is not a good thing, but the alternatives are not immediately obvious. The solution favoured by the author seems to be a DRM-like informational erosion, similar to human forgetting: the introduction of an expiry date to each and every person's digital records, and having those records erase themselves either completely or gradually after that date. Ultimately, the goal would be to have a system for data storage and retrieval that is instilled with a sense of social conscience and, in Andrew's case, a concept of redemption.
Sounds interesting, but wouldn't it also require database systems worldwide to be redesigned and rebuilt to facilitate this kind of functionality? No small task, there. And who would be responsible? That's an easy one. In future, not only will DBAs be responsible for making sure data IS available, but they'll be the ones charged with making sure it ISN'T available, too. And we're not just talking about security and access privileges here; this is data erasure, not data censorship - an altogether more hairy prospect.
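To make the idea concrete, here's a minimal sketch of what per-record expiry might look like in practice - Python with an in-memory SQLite database, and a hypothetical "personal_data" table entirely of my own invention (the book doesn't prescribe an implementation). Each record carries an expiry timestamp chosen when the data is captured, and a scheduled purge job does the actual forgetting:

```python
import sqlite3
import time

# A sketch only: a hypothetical table of personal data where every
# record gets an expiry date at capture time, per the book's proposal.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE personal_data (
        id INTEGER PRIMARY KEY,
        subject TEXT NOT NULL,
        detail TEXT NOT NULL,
        expires_at REAL NOT NULL  -- Unix timestamp set when the record is created
    )
""")

now = time.time()
# One record already past its expiry date, one still within retention.
conn.execute(
    "INSERT INTO personal_data (subject, detail, expires_at) VALUES (?, ?, ?)",
    ("subject-1", "decades-old admission in an academic article", now - 3600),
)
conn.execute(
    "INSERT INTO personal_data (subject, detail, expires_at) VALUES (?, ?, ?)",
    ("subject-1", "current professional registration", now + 3600),
)
conn.commit()

def purge_expired(conn, as_of):
    """The 'forgetting' job: permanently delete any record whose expiry
    date has passed. In practice this would run on a schedule, under
    the DBA's care, with its own audit trail."""
    cur = conn.execute("DELETE FROM personal_data WHERE expires_at <= ?", (as_of,))
    conn.commit()
    return cur.rowcount

removed = purge_expired(conn, now)
remaining = conn.execute("SELECT COUNT(*) FROM personal_data").fetchone()[0]
print(removed, remaining)  # 1 expired record removed, 1 retained
```

Of course, this glosses over everything that makes the real problem hard: backups, replicas, logs, and every downstream copy of the data would all need to forget too, and "gradual" erosion rather than a hard delete would be harder still.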
Given that professional DBAs would be on the front line of a privacy revolution, I'd love to hear what you have to say on this topic, and how you think systematic forgetfulness might best be managed.