• We implemented a passive log while trying to audit SQL injection attempts against our website. We couldn't even begin to consider server-based solutions because the command batch itself was potentially poisonous. I wrote an access interface for the ADO connection, then forced every web transaction to go through that interface. It gave me a chance to run basic heuristics before sending the command to SQL (and to dump obvious attack attempts before they got anywhere near the data). It also writes each transaction to a folder based on the page request. After a week it was interesting to review which folders contained the most transactions - you don't realize how much conversation the webserver has with the SQL server until you have a several-gigabyte folder to illustrate that fact for you.
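    In case it helps to picture the shape of it: below is a minimal Python sketch of that choke-point idea. The real interface wrapped an ADO connection, so every name here (sql_audit, execute, run_query) is hypothetical, and the patterns are illustrative, not our production heuristics. Basic checks run before the command ever reaches SQL, obvious attacks get dumped into a rejected folder, and everything else is logged under a folder keyed by the requesting page.

        import os
        import re
        from datetime import datetime, timezone

        LOG_ROOT = "sql_audit"  # hypothetical log root

        # A few obvious-injection patterns; real heuristics would be tuned per app.
        ATTACK_PATTERNS = [
            re.compile(r";\s*(drop|alter|truncate)\s", re.IGNORECASE),
            re.compile(r"\bunion\s+select\b", re.IGNORECASE),
            re.compile(r"--|/\*"),  # comment sequences used to truncate queries
            re.compile(r"\bxp_cmdshell\b", re.IGNORECASE),
        ]

        def looks_like_attack(sql):
            """Cheap pre-flight heuristics run before the command reaches SQL."""
            return any(p.search(sql) for p in ATTACK_PATTERNS)

        def log_transaction(page, sql, subdir="normal"):
            """Write the transaction to a folder keyed by the requesting page."""
            folder = os.path.join(LOG_ROOT, subdir, page.strip("/").replace("/", "_"))
            os.makedirs(folder, exist_ok=True)
            stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S%f")
            with open(os.path.join(folder, stamp + ".sql"), "w") as f:
                f.write(sql)

        def execute(page, sql, run_query):
            """Single choke point every web transaction is forced through.

            run_query stands in for the real data-access call (the ADO
            connection in the original setup).
            """
            if looks_like_attack(sql):
                log_transaction(page, sql, subdir="rejected")
                raise PermissionError("query rejected by injection heuristics")
            log_transaction(page, sql)
            return run_query(sql)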

    In the post-mortem of an event, we tried searching the nearly one million sub-1 KB files. The search never finished, and we never found what we were looking for. Just like backups with untested restores, it's a good idea to test your audit strategy for realistic usability.
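    One way to honor that lesson is to rehearse the search before an incident forces it on you. A rough sketch (assuming the folder layout above; the time budget is an arbitrary choice) that fails loudly when the audit trail turns out to be write-only in practice:

        import os
        import time

        def timed_search(root, needle, budget_seconds=60.0):
            """Rehearse a post-mortem search, bailing out if it blows the budget."""
            hits = []
            start = time.monotonic()
            for dirpath, _dirs, files in os.walk(root):
                for name in files:
                    if time.monotonic() - start > budget_seconds:
                        raise TimeoutError(
                            "search exceeded the budget; consolidate or index "
                            "the logs before an incident forces you to"
                        )
                    path = os.path.join(dirpath, name)
                    try:
                        with open(path, errors="replace") as f:
                            if needle in f.read():
                                hits.append(path)
                    except OSError:
                        continue  # unreadable file; skip rather than abort the drill
            return hits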

    Ultimately we turned off most of the 'normal' logging due to the extreme space requirements, but we continue to record the transactions that fail the heuristics. This has already proved useful for recovering a false-positive match caused by an overly aggressive regex.
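    Recovering from that kind of false positive is mostly a matter of replaying the rejected log against the corrected pattern. A sketch, again assuming the layout above - both regexes here are made up for illustration (say the bad pattern flagged any single quote):

        import os
        import re

        REJECTED_ROOT = os.path.join("sql_audit", "rejected")  # hypothetical

        OLD_PATTERN = re.compile(r"'")      # the overly aggressive version
        NEW_PATTERN = re.compile(r"'\s*;")  # the corrected version

        def find_false_positives():
            """Anything the old pattern caught but the new one doesn't is a
            likely false positive whose transaction can be recovered and re-run."""
            recovered = []
            for dirpath, _dirs, files in os.walk(REJECTED_ROOT):
                for name in files:
                    path = os.path.join(dirpath, name)
                    with open(path, errors="replace") as f:
                        sql = f.read()
                    if OLD_PATTERN.search(sql) and not NEW_PATTERN.search(sql):
                        recovered.append(path)
            return recovered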

    Another useful point about auditing is to know your baseline so you can more quickly assess out-of-norm behaviors.
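    The per-page folders make a crude baseline almost free: count transactions per folder over a normal week, then flag anything that strays well outside it. A sketch - the three-sigma cutoff and the baseline dict of page -> historical daily counts are assumptions for illustration, not anything from the original setup:

        import os
        from statistics import mean, stdev

        def folder_counts(root):
            """Transactions per page folder - the baseline conversation volume."""
            return {
                entry.name: sum(len(files) for _, _, files in os.walk(entry.path))
                for entry in os.scandir(root) if entry.is_dir()
            }

        def out_of_norm(today, baseline, sigmas=3.0):
            """Flag pages whose volume strays well outside the recorded baseline."""
            flagged = []
            for page, history in baseline.items():
                if len(history) < 2:
                    continue  # not enough data to estimate spread
                mu, sigma = mean(history), stdev(history)
                count = today.get(page, 0)
                if sigma and abs(count - mu) > sigmas * sigma:
                    flagged.append((page, count, mu))
            return flagged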

    </soapbox> now back to work....