Audit Trails and Logging Part I

  • Comments posted to this topic are about the item Audit Trails and Logging Part I

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • Nicely done, Gus... Can't wait for the others!

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Excellent one... nicely explained. 🙂

  • Good article.

    I use audit trails for management reporting as well as automatically generated emails for certain types of changes. I currently use procs in jobs that run every minute or so, collecting the entire changed row into a separate table. I then run another job to analyze the changed rows that have been collected. This analysis produces specific "events" that are of interest to users. Those events are stored in a separate table. I then report and send notifications from there.

    This works well so far but I'll be looking for your next round of articles for any better ideas!
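
    For illustration, a minimal sketch of a trigger-based take on that capture step, writing the entire changed row into a separate audit table (the dbo.Orders table and its columns are made up for this example; the analysis job described above would then read from dbo.Orders_Audit):

        -- Audit table mirrors the base table plus a little metadata.
        CREATE TABLE dbo.Orders_Audit (
            AuditID     INT IDENTITY(1,1) PRIMARY KEY,
            AuditDate   DATETIME NOT NULL DEFAULT GETDATE(),
            AuditAction CHAR(1)  NOT NULL,            -- 'I', 'U', or 'D'
            AuditUser   SYSNAME  NOT NULL DEFAULT SUSER_SNAME(),
            OrderID     INT      NOT NULL,
            CustomerID  INT      NULL,
            Amount      MONEY    NULL
        );
        GO

        CREATE TRIGGER dbo.trOrders_Audit
        ON dbo.Orders
        AFTER INSERT, UPDATE, DELETE
        AS
        BEGIN
            SET NOCOUNT ON;

            -- Log the new version of every inserted/updated row.
            INSERT INTO dbo.Orders_Audit (AuditAction, OrderID, CustomerID, Amount)
            SELECT CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END,
                   i.OrderID, i.CustomerID, i.Amount
            FROM inserted AS i;

            -- Log deleted rows that have no new version (true deletes).
            INSERT INTO dbo.Orders_Audit (AuditAction, OrderID, CustomerID, Amount)
            SELECT 'D', d.OrderID, d.CustomerID, d.Amount
            FROM deleted AS d
            WHERE NOT EXISTS (SELECT 1 FROM inserted AS i WHERE i.OrderID = d.OrderID);
        END;
        GO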

  • A nice summary; it's always useful to check what we're doing against what others are saying.

    One thing I would add to the downside of "passive" audits... the cost of keeping the log files around. One auditor suggested we keep our SQL log files for a rolling 15 months (a year plus a quarter, or something like that). We did some quick math on the disk requirements and were well over a couple of terabytes.

    True, disk space is becoming a rather inexpensive commodity, but it still has to be considered. And heaven help you if you need to make sure all of that is backed up as well; now you're doubling your disk requirements, cost, etc.
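
    As a rough illustration only (the daily volume below is an assumed figure, not our actual number): at 5 GB of SQL log per day, a rolling 15 months works out to roughly

        455 days x 5 GB/day ≈ 2.3 TB

    of logs kept online, and about double that once the same files are swept into backups.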

    fwiw.

  • Excellent work! I really enjoyed this one, it's timely as we've been discussing different approaches to auditing.

  • margo (6/9/2008)


    A nice summary; it's always useful to check what we're doing against what others are saying.

    One thing I would add to the downside of "passive" audits... the cost of keeping the log files around. One auditor suggested we keep our SQL log files for a rolling 15 months (a year plus a quarter, or something like that). We did some quick math on the disk requirements and were well over a couple of terabytes.

    True, disk space is becoming a rather inexpensive commodity, but it still has to be considered. And heaven help you if you need to make sure all of that is backed up as well; now you're doubling your disk requirements, cost, etc.

    fwiw.

    Active logging (next article, should be tomorrow if I'm not mistaken), with some control over what gets stored, might be a better solution in that case. Either way, it's going to take disk space if you want logging.
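
    A minimal sketch of what that kind of control might look like: a trigger that records only changes to one column of interest, storing just the before and after values rather than the whole row (the Employees table and Salary column are invented for illustration):

        CREATE TABLE dbo.Employees_SalaryAudit (
            AuditID    INT IDENTITY(1,1) PRIMARY KEY,
            EmployeeID INT      NOT NULL,
            OldSalary  MONEY    NULL,
            NewSalary  MONEY    NULL,
            ChangedBy  SYSNAME  NOT NULL,
            ChangedAt  DATETIME NOT NULL
        );
        GO

        CREATE TRIGGER dbo.trEmployees_SalaryAudit
        ON dbo.Employees
        AFTER UPDATE
        AS
        BEGIN
            SET NOCOUNT ON;

            -- Skip the work (and the disk) when the column we care about
            -- wasn't referenced by the UPDATE statement at all.
            IF UPDATE(Salary)
            BEGIN
                INSERT INTO dbo.Employees_SalaryAudit
                    (EmployeeID, OldSalary, NewSalary, ChangedBy, ChangedAt)
                SELECT d.EmployeeID, d.Salary, i.Salary, SUSER_SNAME(), GETDATE()
                FROM deleted AS d
                JOIN inserted AS i ON i.EmployeeID = d.EmployeeID
                WHERE d.Salary <> i.Salary;   -- only rows whose value actually changed
            END;
        END;
        GO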

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

    We implemented a passive log while trying to audit SQL injection attempts from our website. We couldn't even begin to consider server-based solutions because the command batch itself was potentially poisonous. I wrote an access interface for the ADO connection, then forced every web transaction to go through that interface. It gives me a chance to do basic heuristics before sending the command to SQL (and to dump obvious attack attempts before they get anywhere near the data). It also writes each transaction to a folder based on the page request. After a week it was interesting to review which folders contained the most transactions: you don't realize how much conversation the web server has with the SQL Server until you have a several-gigabyte folder to illustrate that fact for you.

    In the post-mortem of an event, we tried searching the nearly one million sub-1 KB files. The search never finished, and we never found what we were looking for. Just like backups with untested restores, it's a good idea to test your audit strategy for realistic usability.

    Ultimately we turned off most of the 'normal' logging due to the extreme space requirement, but we continue to record the transactions that fail the heuristic. This was useful for recovering from a false-positive match caused by an overly aggressive regex.

    Another useful point about auditing is to know your baseline so you can more quickly assess out-of-norm behaviors.

    </soapbox> now back to work....

  • Mike, I would actually classify what you're describing as a form of active auditing. You created a log other than the SQL server transaction log.

    Yeah, in any sort of active logging, logging too much and not being able to use it for anything is a common problem, as is the huge amount of disk it consumes.

    For what you were doing, preventing code from ever reaching the database in the first place, passive auditing definitely wouldn't do it. Neither would trigger-based SQL auditing. That's when you have to have the front-end or some other tier do the logging for you.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • GSquared (6/9/2008)


    Mike, I would actually classify what you're describing as a form of active auditing. You created a log other than the SQL server transaction log.

    yeah... I didn't want to wait for Part II, sorry. 🙂

    Very good article, but you should add trace files to your options for auditing. I'm currently using Idera's tool, which uses this method, and I've been pretty happy with the results. There are some bugs and workarounds needed, but no software is perfect.

    This process is obviously a bit more invasive than reading a log file, but from what I've seen the impact is not noticeable... especially since the actual manipulation of the trace files should happen on a separate server. One possible downside, though, is that you will not have the actual data that was manipulated, like you would with a log reader or a trigger; instead you would just have the DML statement. On the positive side, you can audit login activity and SELECT statements.
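
    For anyone who wants to roll their own, a minimal sketch of a server-side trace along these lines on SQL Server 2005 or later (the file path, events, and columns are illustrative choices on my part, not what Idera's tool actually configures):

        DECLARE @TraceID int, @maxfilesize bigint, @on bit;
        SET @maxfilesize = 50;   -- MB per rollover file
        SET @on = 1;

        -- Option 2 = file rollover; SQL Server appends .trc to the file name.
        EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\AuditTrace', @maxfilesize;

        -- Event 12 = SQL:BatchCompleted, event 14 = Audit Login.
        -- Column 1 = TextData, 11 = LoginName, 14 = StartTime.
        EXEC sp_trace_setevent @TraceID, 12, 1,  @on;
        EXEC sp_trace_setevent @TraceID, 12, 11, @on;
        EXEC sp_trace_setevent @TraceID, 12, 14, @on;
        EXEC sp_trace_setevent @TraceID, 14, 11, @on;
        EXEC sp_trace_setevent @TraceID, 14, 14, @on;

        EXEC sp_trace_setstatus @TraceID, 1;   -- 1 = start the trace

        -- Later (ideally from a separate server), pull the files in for analysis:
        -- SELECT TextData, LoginName, StartTime
        -- FROM sys.fn_trace_gettable(N'C:\Traces\AuditTrace.trc', DEFAULT);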

    David

  • The article missed fraud prevention and detection as a reason for logging. We have triggers, and management has caught people giving freebies to friends via the audit tables.

  • SQL Noob (6/9/2008)


    The article missed fraud prevention and detection as a reason for logging. We have triggers, and management has caught people giving freebies to friends via the audit tables.

    I would count that as a combination Reports/Blamethrower audit.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • David (6/9/2008)


    Very good article, but you should add trace files to your options for auditing. I'm currently using Idera's tool, which uses this method, and I've been pretty happy with the results. There are some bugs and workarounds needed, but no software is perfect.

    This process is obviously a bit more invasive than reading a log file, but from what I've seen the impact is not noticeable... especially since the actual manipulation of the trace files should happen on a separate server. One possible downside, though, is that you will not have the actual data that was manipulated, like you would with a log reader or a trigger; instead you would just have the DML statement. On the positive side, you can audit login activity and SELECT statements.

    David

    I guess I'm not sure what you mean by trace files. Are you talking about having some piece of the database/application write data to a separate file and use that for logging?

    If so, yeah, that would be another means of active logging. I didn't actually think of having the logging take place outside of the database, but something like that was brought up by another person as part of a means of preventing SQL injection attacks from getting into the database, and tracking the attempted attacks. It certainly is an option, but not having used it, I don't think I can write more on that subject myself.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • Nice article, Gus.

    [font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
    Proactive Performance Solutions, Inc.
    [/font]
    [font="Verdana"] "Performance is our middle name."[/font]
