June 6, 2008 at 4:37 am
I ended up using Jack's code. I just created my own dbChanges table and added a datetime field, so the main part now looks like this:
INSERT INTO dbChanges
(
    databasename,
    eventtype,
    objectname,
    objecttype,
    sqlcommand,
    loginname,
    eventtime
)
VALUES
(
    @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'VARCHAR(256)'),
    @data.value('(/EVENT_INSTANCE/EventType)[1]', 'VARCHAR(50)'),
    @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'VARCHAR(256)'),
    @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'VARCHAR(25)'),
    @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'VARCHAR(MAX)'),
    @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'VARCHAR(256)'),
    @data.value('(/EVENT_INSTANCE/PostTime)[1]', 'DATETIME')
);
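For reference, here is a rough sketch of the supporting table and the database-level DDL trigger the insert sits inside. The column sizes, the id column, and the trigger name below are my own guesses, not Jack's exact code:

-- Sketch of the logging table (column sizes are guesses)
CREATE TABLE dbChanges
(
    id           INT IDENTITY(1,1) PRIMARY KEY,
    databasename VARCHAR(256),
    eventtype    VARCHAR(50),
    objectname   VARCHAR(256),
    objecttype   VARCHAR(25),
    sqlcommand   VARCHAR(MAX),
    loginname    VARCHAR(256),
    eventtime    DATETIME
);
GO

-- Sketch of the DDL trigger; the trigger name and event group are illustrative
CREATE TRIGGER trg_LogDbChanges
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    DECLARE @data XML;
    SET @data = EVENTDATA();   -- the XML document the .value() calls above read from

    -- the INSERT INTO dbChanges shown above goes here
END;
GO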
Thanks again everyone.
June 6, 2008 at 9:50 am
This is fab stuff, both the code and all the reports.
Presumably there is negligible impact on performance from the trigger-based solution?
June 6, 2008 at 2:16 pm
If all you're doing is the reporting, there's probably no noticeable impact. The DDL trigger can also be used to roll back the schema change (prevent it from happening), and that adds a little more overhead, but it shouldn't be very noticeable either. Of course, the example given here just handles stored procedures, but you could use it for any schema change. That's typically my recommendation.
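Something along these lines, for example (the trigger name and the events it covers are just illustrative, not the exact code from the article):

-- Sketch: a DDL trigger that blocks stored procedure changes by rolling them back
CREATE TRIGGER trg_BlockProcChanges
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE
AS
BEGIN
    PRINT 'Stored procedure changes are not allowed on this database.';
    ROLLBACK;   -- undoes the DDL statement that fired the trigger
END;
GO

When a change does need to go through, you can disable the trigger temporarily with DISABLE TRIGGER trg_BlockProcChanges ON DATABASE and re-enable it afterwards.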
K. Brian Kelley
@kbriankelley