Monitoring Changes in Your Database Using DDL Triggers

  • Comments posted to this topic are about the item Monitoring Changes in Your Database Using DDL Triggers

  • This is exactly why I use this site - simple, clear solutions that genuinely add value to the products I develop. Although I had to rename the objects (BlameMeTrigger and BlameMeLog); I think this will convey my intentions clearly enough ;¬).

    I look forward to the next installment.

  • Hi,

    I surely join the posts saying that this site is the best!

    Though - I have a question - has anyone investigated the DB performance impact? My concern is that if I use the solution detailed here in a widely used DB, the DDL trigger will affect the DB response time.

    What do you think?

    Uzi

  • I upgraded my SQL2000 to SQL2005 and all of the triggers were fine after the upgrade. Then my vendor provided an upgrade to the ERP which blew away a couple of the triggers on a custom table I had in the database. This has happened before. I simply re-created the triggers as I had done before when we were on SQL2000. Except now, on SQL2005, the triggers won't work. I am creating them in SSMS, logged on as 'sa'. The error message that the user gets is that she is not able to access the server. I've certainly experienced permission problems when I create SSIS packages. Could this be the same kind of problem? I find it difficult to figure out what is wrong with triggers. Any ideas about what I can do?

  • David,

    Excellent article - concise and to the point! Well done!

    Mark

  • Something to keep in mind ... the connection making a change has to have access to the audit table. Also, unless you want to create an audit table per database, it's usually best to use one central table per server.

    Check the attached for a script that will push the trigger to all databases. The trigger is more or less the same, but also catches the application. Change the @LoggingDBName to whatever you use for your central DBA/Logging/Admin database.

    As stated though, make sure your logins have write access to the changelog table.
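    If it helps, a minimal sketch of that grant (the database and table names here - DBALogging and dbo.ChangeLog - are placeholders; substitute your own central logging database and table):

    ```sql
    -- Hypothetical names: DBALogging database, dbo.ChangeLog audit table.
    USE DBALogging;
    GO
    -- Quick-and-dirty route: let every database user insert into the change log.
    -- A tighter setup would grant this to a dedicated role instead of public.
    GRANT INSERT ON dbo.ChangeLog TO public;
    GO
    ```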

  • In my environment I have 20 databases that are constantly updated 24 hours a day. The application uses one user account so my question has two parts.

    1. Can I exclude this user from the trigger?

    2. With changes being made constantly is there a way to keep the table size down?

  • Steven Mann (10/2/2008)


    In my environment I have 20 databases that are constantly updated 24 hours a day. The application uses one user account so my question has two parts.

    1. Can I exclude this user from the trigger?

    2. With changes being made constantly is there a way to keep the table size down?

    1) In your trigger you can check the login name:

    [Code]
    SET @loginname = @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'NVARCHAR(100)')
    ....
    IF @loginname != 'appuser'
    BEGIN
        -- do the work
    END
    [/Code]

    2) You can implement a job that does cleanup on a regular basis.

    We just recently implemented this in our three main databases for one of our applications, logging to a central table (yes Adam, the user has the rights =) ), just in our dev environment. We have a strict dev -> qa -> production process, so we really just need to identify/validate which objects need to be pushed for a release. We have not had an opportunity to use it much yet, but it seems to be working fine in the testing we have done.
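    For what it's worth, the cleanup job in (2) can be a scheduled job step doing a batched delete with a retention window - a sketch, assuming a PostTime column on the log table and a 90-day retention (both are placeholders to adjust):

    ```sql
    -- Hypothetical retention purge: delete audit rows older than 90 days.
    DECLARE @cutoff DATETIME
    SET @cutoff = DATEADD(DAY, -90, GETDATE())

    -- Delete in batches so a large purge doesn't hold long locks on the log table.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (5000) FROM dbo.ChangeLog WHERE PostTime < @cutoff
        IF @@ROWCOUNT = 0 BREAK
    END
    ```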

  • Nice one ... I like it, it is very simple!

    😉

    ============================================================
    SELECT YOUR PROBLEM FROM SSC.com WHERE PROBLEM DESCRIPTION =
    http://www.sqlservercentral.com/articles/Best+Practices/61537/

  • David,

    Thanks for the elegant trigger, perfect for keeping an eye on those rogue developers! (like me -- hey, wait a minute ...) 😉

  • Fantastic! I appreciate your direct answer to my questions. 🙂

  • We used a similar approach; however, most of the time we want the trigger to do as little as possible, as fast as possible, so we move the data straight into an XML field and created a view to parse the XML, which gives us better performance.
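    A sketch of that raw-XML pattern (the table, trigger, and view names are made up; the trigger just stores EVENTDATA() and the XQuery cost is paid only when the view is read):

    ```sql
    -- Store the raw event XML; keep the trigger itself as cheap as possible.
    CREATE TABLE dbo.DDLLogRaw (
        id INT IDENTITY(1,1) PRIMARY KEY,
        EventData XML NOT NULL
    )
    GO

    CREATE TRIGGER trg_ddl_raw ON DATABASE
    FOR DDL_DATABASE_LEVEL_EVENTS  -- catch-all DDL event group (SQL 2005+)
    AS
        INSERT INTO dbo.DDLLogRaw (EventData) VALUES (EVENTDATA())
    GO

    -- Parsing is deferred to read time via a view.
    CREATE VIEW dbo.DDLLogParsed
    AS
    SELECT id,
           EventData.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)') AS EventType,
           EventData.value('(/EVENT_INSTANCE/PostTime)[1]', 'datetime') AS PostTime,
           EventData.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(100)') AS LoginName,
           EventData.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(100)') AS ObjectName,
           EventData.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)') AS CommandText
    FROM dbo.DDLLogRaw
    GO
    ```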

  • I'm using it this way on SQL 2005 databases for version control. (One fix to my original script: [id] needs a default such as NEWID(), otherwise the trigger's INSERT, which never supplies [id], fails on the NOT NULL column.)

    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO

    CREATE TABLE [dbo].[DatabaseDDLLogging](
        [id] [uniqueidentifier] NOT NULL DEFAULT NEWID(),
        [EventType] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [PostTime] [datetime] NULL,
        [SPID] [int] NULL,
        [ServerName] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [LoginName] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [UserName] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [DatabaseName] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [SchemaName] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [ObjectName] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [ObjectType] [nvarchar](100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        [CommandText] [nvarchar](max) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        CONSTRAINT [PK_DatabaseLogging] PRIMARY KEY CLUSTERED ([id] ASC)
            WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    CREATE TRIGGER [trig_event_data]
    ON DATABASE
    FOR CREATE_TABLE, DROP_TABLE, ALTER_TABLE,
        CREATE_VIEW, DROP_VIEW, ALTER_VIEW,
        CREATE_INDEX, DROP_INDEX, ALTER_INDEX,
        CREATE_FUNCTION, DROP_FUNCTION, ALTER_FUNCTION,
        CREATE_PROCEDURE, DROP_PROCEDURE, ALTER_PROCEDURE
    AS
    DECLARE @data XML
    SET @data = EVENTDATA()
    INSERT INTO DatabaseDDLLogging (EventType, PostTime, SPID, ServerName,
        LoginName, UserName, DatabaseName, SchemaName,
        ObjectName, ObjectType, CommandText)
    VALUES
        ( @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/PostTime)[1]', 'datetime'),
          @data.value('(/EVENT_INSTANCE/SPID)[1]', 'int'),
          @data.value('(/EVENT_INSTANCE/ServerName)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/UserName)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'nvarchar(100)'),
          @data.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)')
        )
    GO

    SET ANSI_NULLS OFF
    GO
    SET QUOTED_IDENTIFIER OFF
    GO

    ENABLE TRIGGER [trig_event_data] ON DATABASE

  • Do I have to modify this trigger to work with my database? When I try to run it on SQL 2000 it throws syntax errors. Can anyone help?

    Sorry, I am new to the database world :hehe:

    These are my errors:

    Server: Msg 156, Level 15, State 1, Procedure Audit, Line 1
    Incorrect syntax near the keyword 'DATABASE'.

    Server: Msg 195, Level 15, State 1, Procedure Audit, Line 10
    'EVENTDATA' is not a recognized function name.

    Server: Msg 170, Level 15, State 1, Procedure Audit, Line 11
    Line 11: Incorrect syntax near '.'.

    Server: Msg 170, Level 15, State 1, Procedure Audit, Line 13
    Line 13: Incorrect syntax near '.'.

    Server: Msg 170, Level 15, State 1, Procedure Audit, Line 14
    Line 14: Incorrect syntax near '.'.

    Server: Msg 170, Level 15, State 1, Procedure Audit, Line 15
    Line 15: Incorrect syntax near '.'.

  • It only works on SQL 2005 or higher - DDL triggers and the EVENTDATA() function were introduced in SQL Server 2005, which is why SQL 2000 rejects the syntax.
