SQLServerCentral Editorial

Badly Encrypted Databases


I ran across a post about encrypted databases that was linked from Bruce Schneier's blog. I follow his musings and writings on security, and he recommended we read it with this sentence: "Even the summary is too much to summarize, so read it." Good enough for me, so I clicked the link and read the piece.

I like the idea of stronger encryption in databases, and I've given a few talks on the subject. There are often attendees who argue that encryption in the database doesn't do much good. They frequently dismiss TDE, since administrators can still read the data right through the encryption, and normal users aren't affected at all. Many also note that database encryption does nothing for data on the wire, which is true. Most people want to do the encryption and decryption on the client, which has other challenges and is fairly hard to do well.

I think that security is a series of layers, and as the author of the post notes, most criminals are lazy. If they can copy a backup file or data file, they'll just do that and read the data. TDE isn't perfect, but it does limit these simple attacks. Always Encrypted was developed to try to make it easy to include encryption from the client side, but in SQL Server 2016 it has lots of limitations. In SQL Server 2019 we get secure enclaves, which should help adoption somewhat, but we'll see once developers start to experiment with the feature.
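For anyone who hasn't tried it, the basic TDE setup is only a few statements. Here's a minimal sketch, where SalesDB is a hypothetical database and the certificate name and password are placeholders:

```sql
USE master;
GO
-- Placeholder password; use something strong and store it safely.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
GO
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO
-- Back up the certificate and private key immediately; without them,
-- the encrypted backups cannot be restored anywhere else.
USE SalesDB;  -- hypothetical user database
GO
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
GO
ALTER DATABASE SalesDB SET ENCRYPTION ON;
```

Once encryption is on, the data and log files and any backups taken from them are encrypted at rest, which is exactly what blunts the copy-the-file attack.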

The post talks about problems with encryption, spending quite a bit of time on approximate database reconstruction, which is essentially guessing data values by combining some known information with the observation of queries and results. It's fascinating, and also scary, but it's complex and likely requires lots of queries from a client. To me, this is an area we ought to focus on, and one where all our protocol libraries, database firewalls, and perhaps built-in limits ought to concentrate their efforts. We shouldn't be letting most clients run queries that pull all the data from a table. Really, at this point we ought to have built-in query limits that ensure users aren't exporting entire tables. I'd also like some throttle that could prevent very large result sets from being returned to clients, or at least raise an alert.
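SQL Server doesn't have such a throttle built in as far as I know, but an Extended Events session can at least flag statements that return very large row counts. A rough sketch, where the session name, file target, and 100,000-row threshold are all arbitrary choices of mine:

```sql
-- Sketch: capture statements that return unusually large result sets.
-- Session name, filename, and the row_count threshold are arbitrary.
CREATE EVENT SESSION LargeResultSets ON SERVER
ADD EVENT sqlserver.sql_statement_completed
(
    ACTION (sqlserver.sql_text, sqlserver.username, sqlserver.client_hostname)
    WHERE (row_count > 100000)
)
ADD TARGET package0.event_file (SET filename = N'LargeResultSets.xel');
GO
ALTER EVENT SESSION LargeResultSets ON SERVER STATE = START;
```

It's detection rather than prevention, but reviewing that file regularly would at least tell you who is dragging entire tables across the wire.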

At the same time, there ought to be some way to analyze incoming queries and spot an attacker "guessing" values. I won't pretend to know exactly how we might do this, but just as we detect repeated login attempts, we could raise an alert when we see xxx queries of the same type in yyy time. That might alert us to potential problems, or just to users running lots of searches.
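As a crude approximation with what we already have, sys.dm_exec_query_stats exposes a query_hash that groups queries of the same shape. A sketch, with an arbitrary one-hour window and 1,000-execution threshold standing in for yyy and xxx:

```sql
-- Sketch: flag query shapes executed suspiciously often recently.
-- Note: execution_count is cumulative since the plan was cached,
-- so this is only a rough signal, not an exact per-window count.
SELECT  qs.query_hash,
        SUM(qs.execution_count)     AS executions,
        MIN(qs.creation_time)       AS first_seen,
        MAX(qs.last_execution_time) AS last_seen
FROM    sys.dm_exec_query_stats AS qs
WHERE   qs.last_execution_time > DATEADD(HOUR, -1, SYSDATETIME())
GROUP BY qs.query_hash
HAVING  SUM(qs.execution_count) > 1000
ORDER BY executions DESC;
```

Anything this flags still needs a human to decide whether it's an attack, a chatty application, or just a busy search screen.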

I don't know the best way to protect data, but I do know that too many of us aren't doing a good job of it. We need to get better at protecting our databases and the data within them, in both production and development environments.
