Capture CREATE, DROP, ALTER from the default SQL trace file

  • Comments posted to this topic are about the item Capture CREATE, DROP, ALTER from the default SQL trace file

    ______________________________________________________________________________Never argue with an idiot; They'll drag you down to their level and beat you with experience

  • Thanks for posting!

    I've seen a few different change tracking systems, some using the default trace and others using DDL triggers. What are your (and others') thoughts on using DDL triggers to capture this change information versus using the default trace?
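    For comparison, a minimal DDL trigger looks like the sketch below. The audit table name `DDL_AuditLog` and its columns are my own assumptions, not something from the article:

    ```sql
    -- Sketch of a database-scoped DDL trigger; table name/columns are assumed.
    CREATE TABLE dbo.DDL_AuditLog (
        EventTime  datetime2     NOT NULL DEFAULT SYSDATETIME(),
        LoginName  sysname       NOT NULL,
        EventType  nvarchar(128) NOT NULL,
        ObjectName nvarchar(256) NULL,
        EventXml   xml           NOT NULL
    );
    GO
    CREATE TRIGGER trg_CaptureDDL
    ON DATABASE
    FOR CREATE_TABLE, ALTER_TABLE, DROP_TABLE
    AS
    BEGIN
        DECLARE @e xml = EVENTDATA();  -- full details of the DDL statement
        INSERT dbo.DDL_AuditLog (LoginName, EventType, ObjectName, EventXml)
        VALUES (ORIGINAL_LOGIN(),
                @e.value('(/EVENT_INSTANCE/EventType)[1]',  'nvarchar(128)'),
                @e.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(256)'),
                @e);
    END;
    ```

    Unlike the default trace, this captures events synchronously inside the DDL transaction, which is exactly the kind of code intrusion some shops won't allow.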

  • Thanks for posting a wonderful script for schema change monitoring.

    How would I alter the script to monitor a standalone instance only? I don't want to capture other instances through linked server features...

  • @Jonathan.hickford

    The default trace method is better where you are not allowed to add any code to the system for monitoring purposes, or if you need to catch DDL changes retrospectively. For a normal system where you want auditing, I use Service Broker, because the audit process is then done on a different thread in the background, or even delegated to another server via the queue. DDL triggers are the easiest to implement.

    See Feodor Georgiev's great series on default trace auditing:

    The default trace in SQL Server - the power of performance and security auditing

    Collecting the Information in the Default Trace

    Collect Your SQL Server Auditing and Troubleshooting Information Automatically

    Best wishes,
    Phil Factor
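    The default-trace approach described above can be sketched roughly as follows; the exact event IDs are the standard Object:Created (46), Object:Deleted (47), and Object:Altered (164) classes, while the statistics-object filter is a common refinement rather than something stated in this thread:

    ```sql
    -- Sketch: pull DDL events from the default trace.
    DECLARE @path nvarchar(260);

    -- Point at the base log.trc so fn_trace_gettable can walk the rollover files.
    SELECT @path = REVERSE(SUBSTRING(REVERSE(path), CHARINDEX(N'\', REVERSE(path)), 260)) + N'log.trc'
    FROM sys.traces
    WHERE is_default = 1;

    SELECT t.StartTime, t.LoginName, t.DatabaseName, t.ObjectName,
           CASE t.EventClass WHEN 46  THEN 'Created'
                             WHEN 47  THEN 'Deleted'
                             WHEN 164 THEN 'Altered' END AS Action
    FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
    WHERE t.EventClass IN (46, 47, 164)
      AND t.ObjectType <> 21587  -- skip auto-created statistics noise
    ORDER BY t.StartTime;
    ```

    Because this only reads files the trace has already written, it is entirely non-intrusive, which is the trade-off Phil describes.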

  • jonathan.hickford (2/2/2015)


    Thanks for posting!

    I've seen a few different change tracking systems, some using the default trace and others using DDL triggers. What are your (and others') thoughts on using DDL triggers to capture this change information versus using the default trace?

    Personally, I'm not a fan of triggers unless they are absolutely necessary... hence why I went this route!

    ______________________________________________________________________________Never argue with an idiot; They'll drag you down to their level and beat you with experience

  • Very interesting post, I enjoyed it :w00t:

    By the way, is there any method like this to track DML without using CDC or triggers?

    Thanks, Regards

  • @SQL Galaxy - you should be able to change the script to easily monitor a single server

    ______________________________________________________________________________Never argue with an idiot; They'll drag you down to their level and beat you with experience

  • Thank you for the article. I would like to know how to capture the TRUNCATE TABLE command. I ask because CREATE, DROP, and ALTER events can also be captured using DDL triggers on a database, but TRUNCATE TABLE is not. I know one can use the fn_dblog function, for example: SELECT * FROM fn_dblog(NULL, NULL) WHERE [Transaction Name] = 'TRUNCATE'. However, after a full database backup and deletion of old transaction logs, one can no longer track TRUNCATE TABLE changes that way.

  • It can be captured: check the SQL:BatchCompleted / SQL:BatchStarting trace events. :-) No trigger is needed for that.
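    A modern equivalent of watching those batch events is an Extended Events session filtered on the batch text. This is a hedged sketch of that idea, not something from the thread; the session name and target file name are assumptions:

    ```sql
    -- Sketch: capture batches containing TRUNCATE TABLE via Extended Events.
    CREATE EVENT SESSION CaptureTruncate ON SERVER
    ADD EVENT sqlserver.sql_batch_completed (
        ACTION (sqlserver.server_principal_name, sqlserver.database_name)
        WHERE sqlserver.like_i_sql_unicode_string(batch_text, N'%TRUNCATE TABLE%')
    )
    ADD TARGET package0.event_file (SET filename = N'CaptureTruncate.xel');
    GO
    ALTER EVENT SESSION CaptureTruncate ON SERVER STATE = START;
    ```

    Note that a text-based filter like this will also match batches that merely mention TRUNCATE TABLE (for example in comments), so some post-filtering of the captured events may be needed.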

  • Hi MyDoggieJessie,

    What a very interesting and efficient way of tracking changes on your server(s). Two questions, though.

    1. Can a person not capture the actual change that was made? For example, I changed a column's data type from, say, char(10) to varchar(10), but all this gives me is that the table was altered. Any reason for that?

    2. Why did you use the char data type instead of varchar? You know that if you use, for instance, char(256) and only use 10 characters, the remainder of the 256 will be padded with spaces. I changed the char data types to varchar and it works fine, with no spaces. Just wondering; you might have a perfectly good explanation. Let me know please. :-P

    Manie Verster
    Developer
    Johannesburg
    South Africa

    I can do all things through Christ who strengthens me. - Holy Bible
    I am a man of fixed and unbending principles, the first of which is to be flexible at all times. - Everett McKinley Dirksen (Well, I am trying. - Manie Verster)

  • No, using this method there is no way to track the actual change made. You would need to either use a third-party monitoring tool or enable CDC. This is more of a "who did it" process, not a "what did they do" one :hehe:

    I typically use varchar everywhere I can get away with it and have no reason as to why I chose char except "I forgot" 😀
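    The padding difference Manie describes is easy to see with a quick ad-hoc check (this demo is mine, not part of the script):

    ```sql
    -- char pads to its declared length; varchar stores only what you put in.
    DECLARE @fixed    char(256)    = 'ALTER';
    DECLARE @variable varchar(256) = 'ALTER';

    SELECT DATALENGTH(@fixed)    AS FixedBytes,     -- 256: trailing spaces fill the column
           DATALENGTH(@variable) AS VariableBytes;  -- 5: just the characters used
    ```

    For audit tables holding object names and statement text of widely varying lengths, varchar (or nvarchar) is generally the better fit.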

    ______________________________________________________________________________Never argue with an idiot; They'll drag you down to their level and beat you with experience

  • Thanks for sharing this.

    MM




  • Forum Etiquette: How to post Reporting Services problems
  • Forum Etiquette: How to post data/code on a forum to get the best help - by Jeff Moden
  • How to Post Performance Problems - by Gail Shaw

  • Thanks for sharing... I seem to have found a bug, though: we are missing some changes, and the point of failure I am looking at is that when the default trace rolls to a new trace file, you miss everything between the last time the capture job ran and when the file rolled. Since it uses 5 rolling files, I am thinking about tracking the file name each time the job runs, detecting the change, and reading both files in that case. It takes us about an hour to fill up a file and I am capturing every 30 minutes.

    I like this method over triggers... but I need to track down why I am not getting all the changes made.

  • Interesting! I have not experienced this issue.

    It could be that you're filling up your files faster than the job is able to pull them out...so it's getting overwritten before the job picks back up again.

    You could try extending the number of files, or shortening the interval at which the collection job runs.

    ______________________________________________________________________________Never argue with an idiot; They'll drag you down to their level and beat you with experience

  • MyDoggieJessie (3/26/2015)


    Interesting! I have not experienced this issue.

    It could be that you're filling up your files faster than the job is able to pull them out...so it's getting overwritten before the job picks back up again.

    You could try extending the number of files, or shortening the interval at which the collection job runs.

    I have 5 rolling files, but the code only uses the current file. So I am going to record the file name in a table each time the job runs; if the names do not match, I can read the old file first and then the current file in use. That should fix the missing DDL statements.
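    That bookkeeping could be sketched along these lines; the `TraceFileHistory` table name and the event-class filter are assumptions on my part:

    ```sql
    -- Sketch: remember the trace file seen on the last run; if it changed,
    -- read the previous rollover file as well as the current one.
    CREATE TABLE dbo.TraceFileHistory (
        CheckedAt datetime2     NOT NULL DEFAULT SYSDATETIME(),
        TracePath nvarchar(260) NOT NULL
    );
    GO
    DECLARE @current nvarchar(260), @previous nvarchar(260);

    SELECT @current = path FROM sys.traces WHERE is_default = 1;

    SELECT TOP (1) @previous = TracePath
    FROM dbo.TraceFileHistory
    ORDER BY CheckedAt DESC;

    IF @previous IS NOT NULL AND @previous <> @current
    BEGIN
        -- The trace rolled over since the last run: read the prior file first.
        SELECT StartTime, LoginName, DatabaseName, ObjectName, EventClass
        FROM sys.fn_trace_gettable(@previous, 1)   -- 1 = this file only
        WHERE EventClass IN (46, 47, 164);
    END;

    SELECT StartTime, LoginName, DatabaseName, ObjectName, EventClass
    FROM sys.fn_trace_gettable(@current, 1)
    WHERE EventClass IN (46, 47, 164);

    INSERT dbo.TraceFileHistory (TracePath) VALUES (@current);
    ```

    Passing 1 as the second argument to fn_trace_gettable restricts it to the named file, so the rolled-over file and the current file can be read separately and de-duplicated against what the job captured last time.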
