Thanks for sharing... I seem to have found a bug, though. We are missing some changes, and the point of failure I am looking at is when the default trace rolls to a new trace file: you miss everything between the last time the capture job ran and the time the trace file rolled. Since it is using 5 rolling files, I am thinking about tracking the file names each time the job runs, detecting the change, and reading both files in that case. It takes us about an hour to fill up a file, and I am capturing every 30 minutes.
I like this method over a trigger, but I need to track down why I am not getting all of the changes.
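For reference, you can see how fast the default trace is rolling straight from sys.traces; comparing [path] between two runs of the job shows whether a rollover happened in between:

-- How quickly is the default trace rolling over?
SELECT [path],           -- file currently being written to
       start_time,       -- when the trace started
       last_event_time,  -- most recent event written
       event_count,      -- events written so far
       max_size,         -- size per file, in MB
       max_files         -- rollover files kept before overwriting
FROM sys.traces
WHERE is_default = 1;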
Interesting! I have not experienced this issue.
It could be that you're filling up your files faster than the job is able to pull them out, so they're getting overwritten before the job picks back up again.
You could try extending the number of rollover files, or shortening the interval the collection job runs on; a sketch of the first option follows.
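The default trace itself is fixed at five 20 MB rollover files, so "extending the number of files" would mean running your own server-side trace that captures the same DDL events. A minimal sketch, assuming a C:\Traces\ directory and a 20-file rollover (both are placeholders, adjust for your server):

-- User-defined server-side trace: same DDL events, more rollover files
DECLARE @TraceID int, @rc int,
        @maxfilesize bigint = 20,   -- MB per file
        @filecount int = 20,        -- keep 20 files instead of 5
        @on bit = 1;

EXEC @rc = sp_trace_create
     @traceid     = @TraceID OUTPUT,
     @options     = 2,                         -- TRACE_FILE_ROLLOVER
     @tracefile   = N'C:\Traces\ddl_capture',  -- .trc is appended automatically
     @maxfilesize = @maxfilesize,
     @filecount   = @filecount;

IF @rc <> 0
BEGIN
    RAISERROR('sp_trace_create failed with code %d', 16, 1, @rc);
    RETURN;
END

-- Events: Object:Created (46), Object:Deleted (47), Object:Altered (164)
-- Columns: TextData (1), LoginName (11), StartTime (14),
--          ObjectName (34), DatabaseName (35)
EXEC sp_trace_setevent @TraceID, 46, 1,  @on;
EXEC sp_trace_setevent @TraceID, 46, 11, @on;
EXEC sp_trace_setevent @TraceID, 46, 14, @on;
EXEC sp_trace_setevent @TraceID, 46, 34, @on;
EXEC sp_trace_setevent @TraceID, 46, 35, @on;
EXEC sp_trace_setevent @TraceID, 47, 1,  @on;
EXEC sp_trace_setevent @TraceID, 47, 11, @on;
EXEC sp_trace_setevent @TraceID, 47, 14, @on;
EXEC sp_trace_setevent @TraceID, 47, 34, @on;
EXEC sp_trace_setevent @TraceID, 47, 35, @on;
EXEC sp_trace_setevent @TraceID, 164, 1,  @on;
EXEC sp_trace_setevent @TraceID, 164, 11, @on;
EXEC sp_trace_setevent @TraceID, 164, 14, @on;
EXEC sp_trace_setevent @TraceID, 164, 34, @on;
EXEC sp_trace_setevent @TraceID, 164, 35, @on;

EXEC sp_trace_setstatus @TraceID, 1;  -- 1 = start the trace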
MyDoggieJessie (3/26/2015)
Interesting! I have not experienced this issue. It could be that you're filling up your files faster than the job is able to pull them out, so they're getting overwritten before the job picks back up again.
You could try extending the number of rollover files, or shortening the interval the collection job runs on.
I have 5 rolling files, but the code only reads the current file. So I am going to write the file name to a table each time the job runs; if the stored name and the current name do not match, I can read the old file first and then the file currently in use. That should fix the missing DDL statements.
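A minimal sketch of that approach, assuming a hypothetical tracking table dbo.DefaultTraceFile (the table and column names are placeholders):

-- One-time setup:
-- CREATE TABLE dbo.DefaultTraceFile (TracePath  nvarchar(260) NOT NULL,
--                                    CapturedAt datetime      NOT NULL);

DECLARE @CurrentPath nvarchar(260), @LastPath nvarchar(260);

-- File the default trace is writing to right now
SELECT @CurrentPath = [path]
FROM sys.traces
WHERE is_default = 1;

-- File recorded by the previous run of the capture job
SELECT TOP (1) @LastPath = TracePath
FROM dbo.DefaultTraceFile
ORDER BY CapturedAt DESC;

-- If the trace rolled over between runs, read the old file first.
-- Note: sys.fn_trace_gettable raises an error if the old file has already
-- been overwritten by the 5-file rollover, so the job must run often enough.
IF @LastPath IS NOT NULL AND @LastPath <> @CurrentPath
BEGIN
    SELECT StartTime, DatabaseName, ObjectName, LoginName, EventClass
    FROM sys.fn_trace_gettable(@LastPath, 1)      -- 1 = read only this file
    WHERE EventClass IN (46, 47, 164);            -- Created/Deleted/Altered
END

-- Then read the file currently in use
SELECT StartTime, DatabaseName, ObjectName, LoginName, EventClass
FROM sys.fn_trace_gettable(@CurrentPath, 1)
WHERE EventClass IN (46, 47, 164);

-- Remember which file this run used
INSERT dbo.DefaultTraceFile (TracePath, CapturedAt)
VALUES (@CurrentPath, GETDATE());

In the real job the two SELECTs would insert into the audit table it already loads, and a StartTime filter on the last capture time would keep the two reads from overlapping.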