
SQL Audit 201 – Creating an Audit Solution


Welcome back, folks!

OK, so now that we’ve covered the basics of creating a SQL Audit and viewing the output, let’s put the pieces together into a complete solution. As with any implementation, the key to success lies in careful planning. Before we take off running, we need to take a moment and figure out exactly where we want to go.

My requirements

  • What do I want to audit? DDL changes, excluding specific objects and excluding index/statistics maintenance events.
  • Where do I want to run the audit? AdventureWorks, with the option to audit additional databases in the future.
  • Where do I want the output to go? All audit data should be stored in a central audit database for archiving and reporting.
  • How will the audit output be processed? I’ll be using SSRS to generate reports.

Design

Now, there are a few ways I could do this, depending on what version of SQL Server I’m using. If I’m running SQL 2008, unless I want to audit all my databases, I’m limited to creating a Database Audit Specification in AdventureWorks to track the DDL changes. I would also have no way to eliminate index maintenance events from the audit, so those would need to be processed out later. However, if I’m running SQL 2012, I have the option of creating a Server Audit Specification and using a filter on the audit to limit my auditing to AdventureWorks DDL events and exclude any index maintenance commands. Like this:

CREATE SERVER AUDIT [DDLAudit]
TO FILE
(FILEPATH = N'D:\SQL2012\Audits\'
,MAXSIZE = 100 MB
,MAX_ROLLOVER_FILES = 10
,RESERVE_DISK_SPACE = OFF
)
WITH
(QUEUE_DELAY = 0
,ON_FAILURE = CONTINUE
,AUDIT_GUID = 'd50cf1ad-2927-44c7-afd0-0c31d302ca35'
)
WHERE ([database_name]='AdventureWorks' AND NOT [statement] like 'ALTER INDEX%REBUILD%' AND NOT [statement] like 'ALTER INDEX%REORGANIZE%')
GO

To maintain backward compatibility with SQL 2008, I’m going to skip the filter for this demo.

CREATE SERVER AUDIT [DDLAudit]
TO FILE
(FILEPATH = N'D:\SQL2012\Audits\'
,MAXSIZE = 100 MB
,MAX_ROLLOVER_FILES = 10
,RESERVE_DISK_SPACE = OFF
)
WITH
(QUEUE_DELAY = 0
,ON_FAILURE = CONTINUE
,AUDIT_GUID = 'd50cf1ad-2927-44c7-afd0-0c31d302ca35'
)
GO
USE AdventureWorks
GO
CREATE DATABASE AUDIT SPECIFICATION [AdventureWorks_DB_DDLAudit]
FOR SERVER AUDIT [DDLAudit]
ADD (SCHEMA_OBJECT_CHANGE_GROUP)
WITH (STATE = ON)
GO 
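One thing to keep in mind: a server audit is created in a disabled state, so if you’re building this from scratch rather than reusing the audit from the last post, you’ll need to turn the server audit on before any events are captured (the database audit specification above is already created with STATE = ON). Something like:

ALTER SERVER AUDIT [DDLAudit]
WITH (STATE = ON)
GO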

Ok, so that takes care of my first two requirements. The next item on my wish list was to have my audit records stored in a centralized audit database. Right now, that isn’t happening; all my audit info is being written to .sqlaudit files on my D: drive. So my next order of business is to build a database repository to hold them.

I’ll create a centralized SQLAudit database. Ideally it would be isolated from whatever instance I’m auditing, but in this case it’s located on the same instance. In this audit database I’ll create a handful of tables.

  • AuditExclude – list of all objects I want excluded from my audit output.

  • AuditStage – the staging table for my audit records, before filtering; the contents of my audit file(s) will be pulled directly into this table.

  • AuditRecord – final storage for filtered/processed audit output.

  • AuditLocator – stores the current audit file and offset to be used as the starting point for the next run; by knowing where I left off, I won’t end up re-processing audit records.

  • AuditLoadLog – run log; this just keeps track of the number of records staged and retained from each run.

USE [SQLAudit]
GO
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[AuditRecord]') AND type in (N'U'))
DROP TABLE [dbo].[AuditRecord]
GO
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[AuditStage]') AND type in (N'U'))
DROP TABLE [dbo].[AuditStage]
GO
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[AuditLocator]') AND type in (N'U'))
DROP TABLE [dbo].[AuditLocator]
GO
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[AuditLoadLog]') AND type in (N'U'))
DROP TABLE [dbo].[AuditLoadLog]
GO
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[AuditExclude]') AND type in (N'U'))
DROP TABLE [dbo].[AuditExclude]
GO
CREATE TABLE [dbo].[AuditExclude](
[InstanceName] [nvarchar](128) NULL,
[DatabaseName] [varchar](50) NULL,
[SchemaName] [sysname] NOT NULL,
[ObjectName] [varchar](50) NULL,
[ObjectType] [varchar](50) NULL,
[Reason] [varchar](100) NULL
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[AuditRecord](
[audit_name] [varchar](128) NOT NULL,
[event_time] [datetime2](7) NOT NULL,
[sequence_number] [int] NOT NULL,
[action_id] [varchar](4) NULL,
[succeeded] [bit] NOT NULL,
[permission_bitmask] [bigint] NOT NULL,
[is_column_permission] [bit] NOT NULL,
[session_id] [smallint] NOT NULL,
[server_principal_id] [int] NOT NULL,
[database_principal_id] [int] NOT NULL,
[target_server_principal_id] [int] NOT NULL,
[target_database_principal_id] [int] NOT NULL,
[object_id] [int] NOT NULL,
[class_type] [varchar](2) NULL,
[session_server_principal_name] [nvarchar](128) NULL,
[server_principal_name] [nvarchar](128) NULL,
[server_principal_sid] [varbinary](85) NULL,
[database_principal_name] [nvarchar](128) NULL,
[target_server_principal_name] [nvarchar](128) NULL,
[target_server_principal_sid] [varbinary](85) NULL,
[target_database_principal_name] [nvarchar](128) NULL,
[server_instance_name] [nvarchar](128) NULL,
[database_name] [nvarchar](128) NULL,
[schema_name] [nvarchar](128) NULL,
[object_name] [nvarchar](128) NULL,
[statement] [nvarchar](4000) NULL,
[additional_information] [nvarchar](4000) NULL,
[file_name] [nvarchar](260) NOT NULL,
[audit_file_offset] [bigint] NOT NULL
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[AuditStage](
[audit_name] [varchar](128) NOT NULL,
[event_time] [datetime2](7) NOT NULL,
[sequence_number] [int] NOT NULL,
[action_id] [varchar](4) NULL,
[succeeded] [bit] NOT NULL,
[permission_bitmask] [bigint] NOT NULL,
[is_column_permission] [bit] NOT NULL,
[session_id] [smallint] NOT NULL,
[server_principal_id] [int] NOT NULL,
[database_principal_id] [int] NOT NULL,
[target_server_principal_id] [int] NOT NULL,
[target_database_principal_id] [int] NOT NULL,
[object_id] [int] NOT NULL,
[class_type] [varchar](2) NULL,
[session_server_principal_name] [nvarchar](128) NULL,
[server_principal_name] [nvarchar](128) NULL,
[server_principal_sid] [varbinary](85) NULL,
[database_principal_name] [nvarchar](128) NULL,
[target_server_principal_name] [nvarchar](128) NULL,
[target_server_principal_sid] [varbinary](85) NULL,
[target_database_principal_name] [nvarchar](128) NULL,
[server_instance_name] [nvarchar](128) NULL,
[database_name] [nvarchar](128) NULL,
[schema_name] [nvarchar](128) NULL,
[object_name] [nvarchar](128) NULL,
[statement] [nvarchar](4000) NULL,
[additional_information] [nvarchar](4000) NULL,
[file_name] [nvarchar](260) NOT NULL,
[audit_file_offset] [bigint] NOT NULL
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[AuditLocator](
[audit_name] [varchar](128) NULL,
[file_name] [nvarchar](260) NOT NULL,
[audit_file_offset] [bigint] NOT NULL,
[file_pattern] [nvarchar](260) NULL,
[locator_id] int identity(1,1) not null,
[active] char(1) default 'Y'
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[AuditLoadLog](
[audit_name] [varchar](128) NULL,
[staged_count] [int] NOT NULL,
[saved_count] [int] NOT NULL,
[run_date] datetime DEFAULT getdate()
) ON [PRIMARY]
GO 
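Just to make the AuditExclude table concrete, here’s the kind of row I might put in it. The instance and object names below are purely hypothetical placeholders; the load procedure we’ll build shortly matches exclusions on instance, database, schema, and object name.

-- Hypothetical example: ignore DDL against a scratch table that an ETL job rebuilds nightly
INSERT INTO [dbo].[AuditExclude]
(InstanceName, DatabaseName, SchemaName, ObjectName, ObjectType, Reason)
VALUES
(N'MYSERVER\SQL2012', 'AdventureWorks', 'dbo', 'DailyLoadScratch', 'TABLE', 'Rebuilt nightly by ETL; changes expected')
GO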

Now, since I already have an audit in place that I’d like to load, I’ll seed the AuditLocator table with that information.

USE master
GO
declare @initfile nvarchar(260)
-- Grab the audit's file path and base file name from the system catalog
select @initfile=log_file_path+ log_file_name from sys.server_file_audits where name = 'DDLAudit'
-- Replace the last character before the extension with a wildcard so the pattern matches all of the audit's files
set @initfile = STUFF(@initfile,len(@initfile)-charindex('.',reverse(@initfile)), 1, '*')
-- Seed the locator with the earliest audit record's file and offset, plus the file pattern
Insert into SQLAudit.dbo.AuditLocator (audit_name, file_name, audit_file_offset, file_pattern)
SELECT top 1 'DDLAudit', file_name, audit_file_offset, @initfile FROM fn_get_audit_file (@initfile, default,  default) order by event_time asc 

Now that I’ve got someplace to store my audit data permanently, I need a way to get it out of external file(s) and into my database. So I’ll also be creating a stored procedure to read my audit files, massage the data, and save it into my AuditRecord table. That would be the LoadAuditData procedure. Let’s take a look.

USE SQLAudit
GO
create procedure LoadAuditData
as
begin
declare @audit varchar(128),
@file nvarchar(260),
@offset bigint,
@pattern nvarchar(260),
@staged int,
@saved int
set nocount on
declare cAudits cursor for
select audit_name, file_name, audit_file_offset, file_pattern
from AuditLocator
where active = 'Y'
FOR UPDATE

The first thing I’m going to do is find all the active audits in my AuditLocator table and grab the audit file location, the current file, and the current offset. I decided that, if I discontinue or move an audit for some reason, rather than deleting it from the AuditLocator table, I would just mark it inactive. That way I’ll have a historical record for my reference.

open cAudits
fetch cAudits into @audit, @file, @offset, @pattern
while @@fetch_status = 0
begin
set @staged = 0
set @saved = 0
insert into AuditStage
SELECT @audit, * FROM fn_get_audit_file (@pattern, @file,  @offset)
set @staged = @@rowcount
insert into AuditRecord
SELECT * from AuditStage a
WHERE NOT EXISTS (SELECT 1 FROM dbo.AuditExclude ae WHERE
a.server_instance_name = ae.InstanceName and
a.database_name = ae.DatabaseName and
a.schema_name = ae.SchemaName and
a.object_name = ae.ObjectName)
and statement not like '%STATISTICS%'
and statement NOT LIKE 'ALTER INDEX%REBUILD%'
and statement NOT LIKE 'ALTER INDEX%REORGANIZE%'
set @saved = @@rowcount

For each of my active audits, I read in all audit records, starting with the file and offset I pulled out of AuditLocator. All audit records are loaded into AuditStage, then I pull only the records I care about into AuditRecord. In this case, I’m eliminating any statistics or index maintenance statements, and anything dealing with my excluded objects.

select top 1 @file=file_name, @offset=audit_file_offset from AuditStage order by event_time desc
update AuditLocator set file_name = @file, audit_file_offset = @offset
where current of cAudits

The next step is to grab the most current record from the AuditStage table, and that’s going to be my jumping-off point for the next run. This was really the major purpose of the staging table. I could easily have loaded the audit records directly from my audit file(s) into AuditRecord, filtering at the same time. But what if none of the records met my criteria? I wouldn’t have any way to update my AuditLocator record and I’d end up re-processing the same records next time. This way, I know I processed this set of records and I know none of them met my criteria, so I can move on to the next set.

insert into AuditLoadLog (audit_name, staged_count, saved_count) values (@audit, @staged, @saved)
DELETE AuditStage
fetch cAudits into @audit, @file, @offset, @pattern
end
close cAudits
deallocate cAudits
end 

Finally I log my record counts into the AuditLoadLog table and clear out my staging table for the next run.
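With everything in place, loading the audit data is just a matter of running the procedure, ideally on a schedule via something like a SQL Agent job. While testing, a manual run and a peek at the log table is enough to confirm it’s working:

USE SQLAudit
GO
-- Pull in any new audit records
EXEC dbo.LoadAuditData
GO
-- Check how many records were staged and kept on recent runs
SELECT TOP 5 * FROM dbo.AuditLoadLog ORDER BY run_date DESC
GO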

Reviewing my requirements

  • What do I want to audit? DDL changes, excluding specific objects and excluding index/statistics maintenance events.
    • Using the SCHEMA_OBJECT_CHANGE_GROUP I’ll record the DDL changes I’m interested in. I’m not filtering the audit, but I’m able to filter the output before it gets loaded into the AuditRecord table, using procedure logic and the AuditExclude table.
  • Where do I want to run the audit? AdventureWorks, with the option to audit additional databases in the future.
    • I’m using a database audit specification to audit only AdventureWorks right now. In the future, I could create additional audits and simply add their information to the AuditLocator table for processing. As long as my audit files are on an accessible network share, I’m good to go.
  • Where do I want the output to go? All audit data should be stored in a central audit database for archiving and reporting.
    • Got it. Right now that database is on the same local server, but it could easily be on a dedicated remote instance, away from prying eyes.
  • How will the audit output be processed? I’ll be using SSRS to generate reports.
    • Because I’m loading my data into a single database, creating SSRS reports to view audit data for a particular database, or to track changes across audits, will be a snap (see the sample query after this list).
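For example, the dataset behind a “recent DDL changes” report could be as simple as a query like this against AuditRecord (the database name and date range are just placeholders):

-- Recent DDL changes in a given database, newest first
SELECT event_time,
       session_server_principal_name,
       [database_name],
       [schema_name],
       [object_name],
       succeeded,
       [statement]
FROM SQLAudit.dbo.AuditRecord
WHERE [database_name] = 'AdventureWorks'
AND event_time >= DATEADD(day, -7, SYSDATETIME())
ORDER BY event_time DESC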

Considerations

This audit solution has two things working against it: the lack of filtering in the audit and the use of rollover files.  Let’s say I size my audit files really small and I set a low limit for my maximum rollover files, say 3 files.  I’ve made a few DDL changes and they get logged to File 1.  Then let’s say I kick off a big index maintenance or update stats job, and the audit records from that job fill up the rest of File 1, File 2, and File 3.  Once File 3 is full, File 1 will be deleted and File 4 will be created.  If I haven’t processed my audit files in a while, it’s possible that valid audit records in File 1 could be lost. So it’s important to size your files appropriately for the amount of audit data you expect, keep a good number of rollover files, and, most importantly, process your files often.
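If you want a quick sanity check on how close you are to a rollover, sys.dm_server_audit_status shows which file each audit is currently writing to and how large it is, which you can eyeball against your MAXSIZE and MAX_ROLLOVER_FILES settings:

-- Which file is each audit writing to right now, and how big is it (bytes)?
SELECT name,
       status_desc,
       audit_file_path,
       audit_file_size
FROM sys.dm_server_audit_status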

What’s next?

What if I want to deploy SQLAudit to a number of instances/databases? I could point and click my way through it, but that’s asking for a typo or configuration mistake. I could script it in T-SQL, but that’s so last year. So how about we use PowerShell? Next time we’ll go over ways to create, drop, and manage SQL Audit via PowerShell and we’ll use a master audit as a template for deploying audits to the instance/database of our choosing.
