Automate Credit Card Audits

TSQL Tuesday

Boy, has it gotten hot outside. As the heat turns up outdoors, it is a good time to get together with friends and family and relax with a little outdoor party. As luck would have it, it is now time for a fabulous party, though this one is likely being held indoors for everybody. It is time for a little TSQL Tuesday gathering!!

This party, which was started by Adam Machanic, has been going long enough that changes have happened along the way (such as Steve Jones (b | t) now managing it). For a nice long read, you can find a roundup of all TSQL Tuesdays over here.

This month, the blog party is all about finding room for improvement – err, automation. Elizabeth Noble (b | t) believes that automation can help with whatever stresses you. Ok, maybe not every stress can be automated away, but some stress can definitely be decreased through the power of automation. I am a firm believer in that power; I don’t have the time to manually touch everything that comes across my desk. You can find the invite from Elizabeth here.

This topic is so crucial to the success of the effective data professional that we have seen it come up a few times as a TSQL Tuesday topic. Every time it comes back around, I learn something more about both the topic and my personal progress with automation. If you are interested in more reading on automation, here are a couple of those past topics: Pat Wright’s TSQL Tuesday 15 roundup and Hemanth D.’s TSQL Tuesday 50. My submissions for those are here and here, respectively (maybe I should rewrite those).

Time to Automate

While considering the topic this month, I was in the middle of the nasty task of rewriting some code for a client. I seriously considered writing about that experience and how I take those opportunities to teach client data professionals how to write better code. After all, it is automation of sorts, since somebody/something else will be doing the task for you ;). Teach a DBA to write better code, and you have automation that is performance tuning your server for you.

The better topic, though, is a project I undertook for a client to help identify the extent of the cardholder data that could be exposed. We never want to see cardholder data exposed (as we see happening over and over again, like this), and sometimes the best way to understand the risk is through an audit. In other words, you have to get your hands a little dirty to find the extent of the risk. From there, you work backwards to find where the data issues are being created, by whom, and how. Hopefully, through an extensive audit, you can convince stakeholders as well as vendors of the extent of the risk. (Side note: I say hopefully because, so far, I have been unable to convince the PCI certified vendor for the client that requested this audit that they are not properly securing card data, despite showing them the data first hand.)

Let’s start with the “how” of a mass-scale search for all possible credit cards stored in a database. Luckily, there is an article that has most of the heavy lifting done on this topic: Raymund Macaalay covers it nicely here. I will cover some gaps that I encountered, along with what I did to “automate” this task.

The first augmentation I made to the script was to add some error catching. I encountered some breaking errors when trying to parse some of the tables, and in order to figure out where things were breaking, I needed a way to capture the error for the statement that was failing and to ensure the two were properly correlated. This little change allows me to store the errors.

USE DBA;
GO
IF EXISTS
(
    SELECT *
    FROM sys.objects o
    WHERE type = 'U'
          AND name = 'CCNumQueries'
          AND SCHEMA_NAME(o.schema_id) = 'AUDIT'
)
    DROP TABLE AUDIT.CCNumQueries;
GO
CREATE TABLE AUDIT.CCNumQueries
(
    ID INT IDENTITY(1, 1) NOT NULL PRIMARY KEY CLUSTERED,
    Query VARCHAR(4000) NOT NULL,
    QueryStatus TINYINT NOT NULL DEFAULT (0),
    QueryFailed TINYINT NOT NULL DEFAULT (0),
    ErrMsg VARCHAR(256) NULL,
    ErrNum VARCHAR(32) NULL
);
/* Filtered index to quickly fetch only the statements that still need to run */
CREATE INDEX IX_IDStatusFiltered
ON AUDIT.CCNumQueries (ID)
WHERE QueryStatus = 0
      AND QueryFailed = 0;
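
The statements we will store in CCNumQueries write their findings to a companion results table that comes from Macaalay’s article. As a minimal sketch, assuming only the three columns the generated statements select (the column sizes are my guess, not the article’s definition), it would look something like this:

USE DBA;
GO
/* Minimal sketch of the results table - the generated audit statements
   insert one row per suspicious value found. Column sizes are assumptions. */
CREATE TABLE [AUDIT].CCNumResults
(
    ColumnName sysname NOT NULL,
    TableName sysname NOT NULL,
    RecordWithCreditCardInformation NVARCHAR(4000) NULL
);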

Great, so how do I get the errors into the CCNumQueries table so I can see which statement(s) failed? That comes with the next augmentation.

USE [msdb];
GO
IF NOT EXISTS
(
    SELECT name
    FROM msdb.dbo.syscategories
    WHERE name = N'DBA'
          AND category_class = 1
)
BEGIN
    EXECUTE msdb.dbo.sp_add_category @class = N'JOB',
                                     @type = N'LOCAL',
                                     @name = N'DBA';
END;
GO
DECLARE @jobId BINARY(16),
        @jobName sysname;
SET @jobName = N'DBA - Audit CCN';

/* Drop and recreate the job so the script is rerunnable */
IF EXISTS (SELECT * FROM msdb.dbo.sysjobs j WHERE j.name = @jobName)
BEGIN
    EXECUTE msdb.dbo.sp_delete_job @job_name = @jobName;
END;

EXECUTE msdb.dbo.sp_add_job @job_name = @jobName,
                            @enabled = 1,
                            @notify_level_eventlog = 0,
                            @notify_level_email = 2,
                            @notify_level_page = 2,
                            @delete_level = 0,
                            @category_name = N'DBA',
                            @owner_login_name = N'sa';

EXECUTE msdb.dbo.sp_add_jobstep @job_name = @jobName,
                                @step_name = N'Audit Script',
                                @step_id = 1,
                                @cmdexec_success_code = 0,
                                @on_success_action = 1,
                                @on_fail_action = 2,
                                @retry_attempts = 0,
                                @retry_interval = 0,
                                @os_run_priority = 0,
                                @subsystem = N'TSQL',
                                @command = N'SET NOCOUNT ON;
SET QUOTED_IDENTIFIER ON;
DECLARE @SqlQuery AS VARCHAR(4000),
        @QueryID BIGINT,
        @errmsg VARCHAR(256),
        @errnum VARCHAR(32);

/* Walk the statements that have not yet run and have not failed */
DECLARE QueryCursor CURSOR LOCAL STATIC FORWARD_ONLY READ_ONLY FOR
SELECT cq.ID, cq.Query
FROM DBA.[AUDIT].CCNumQueries cq
WHERE cq.QueryStatus = 0
      AND cq.QueryFailed = 0
ORDER BY cq.ID;

OPEN QueryCursor;
FETCH NEXT FROM QueryCursor INTO @QueryID, @SqlQuery;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXECUTE (@SqlQuery);

        /* Flag the statement as successfully processed */
        UPDATE cq
        SET cq.QueryStatus = 1
        FROM DBA.[AUDIT].CCNumQueries cq
        WHERE cq.ID = @QueryID;
    END TRY
    BEGIN CATCH
        /* Capture the error details against the failing statement */
        SELECT @errmsg = ERROR_MESSAGE(),
               @errnum = ERROR_NUMBER();

        UPDATE cq
        SET cq.QueryFailed = 1,
            cq.ErrMsg = @errmsg,
            cq.ErrNum = @errnum
        FROM DBA.[AUDIT].CCNumQueries cq
        WHERE cq.ID = @QueryID;
    END CATCH;
    FETCH NEXT FROM QueryCursor INTO @QueryID, @SqlQuery;
END;
CLOSE QueryCursor;
DEALLOCATE QueryCursor;',
                                @database_name = N'VISTA',
                                @flags = 0;

EXECUTE msdb.dbo.sp_add_jobserver @job_name = @jobName;

EXECUTE msdb.dbo.sp_update_job @job_name = @jobName,
                               @enabled = 1,
                               @start_step_id = 1,
                               @notify_level_eventlog = 0,
                               @notify_level_email = 2,
                               @notify_level_page = 2,
                               @delete_level = 0,
                               @description = N'',
                               @category_name = N'DBA',
                               @owner_login_name = N'sa',
                               @notify_email_operator_name = N'SQLOps';
GO

In this particular change, I created a SQL Agent job so that if the process got killed somehow, it could be restarted after a few minutes and resume where it left off. Auditing of this nature takes a LOOOOONG time. Not only does the script create the job, but the job step also does the error handling: if a statement fails, it updates the associated query record with the error number and the error message, then sets the error status flag to let me know there is an error on that row and to prevent it from being picked up for rerun until the error is corrected.
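
The script above stops short of attaching the schedule itself. Purely as a sketch (the five-minute recurrence here is my assumption, not something prescribed by the job script), a recurring schedule could be attached like this:

USE [msdb];
GO
/* Hypothetical schedule: start the audit job every 5 minutes so it resumes
   where it left off if a previous run was killed. SQL Agent will not start
   a second instance while the job is already running. */
EXECUTE msdb.dbo.sp_add_jobschedule @job_name = N'DBA - Audit CCN',
                                    @name = N'Resume Audit',
                                    @enabled = 1,
                                    @freq_type = 4,            /* daily */
                                    @freq_interval = 1,
                                    @freq_subday_type = 4,     /* minutes */
                                    @freq_subday_interval = 5,
                                    @active_start_time = 0;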

Ok, so why go to all this trouble? As luck would have it, I was receiving some ugly errors that were killing my audit, and I needed to find where those errors originated. Having these measures in place allowed the automated routine to complete and allowed me to investigate the failures after the fact. Once investigated, I was able to discover that a third enhancement was required for the solution mentioned earlier.
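
For example, reviewing the failures is a simple query against the tracking table created earlier; every failed statement carries the error number and message that killed it:

/* Review any statements that failed, along with the captured error details */
SELECT cq.ID,
       cq.Query,
       cq.ErrNum,
       cq.ErrMsg
FROM DBA.[AUDIT].CCNumQueries cq
WHERE cq.QueryFailed = 1
ORDER BY cq.ID;

With the failures identified, here is that third enhancement: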

USE [TargetDBHere];
GO
/*
Load the CCNumQueries table with a query for each column / table combination
in the target database to be audited
*/
INSERT INTO DBA.[AUDIT].CCNumQueries (Query)
SELECT 'INSERT INTO DBA.[AUDIT].CCNumResults 
SELECT ''' + Columns.name + ''' as ColumnName, ''' + Tables.name
       + ''' as TableName, REPLACE(CONVERT(NVARCHAR,' + Columns.name
       + ',4000), ''-'', '''')  AS RecordWithCreditCardInformation 
   FROM ' + SCHEMA_NAME(Tables.schema_id) + '.' + Tables.name
       --CASE expression required to account for the following error on XML columns:
       --Msg 6358, Level 16, State 1 - 4000 is not a valid style number when converting to XML
       + CASE WHEN Types.name = 'xml'
              THEN ' WHERE LTRIM(RTRIM(CONVERT(NVARCHAR(MAX),' + Columns.name + '))) <> ''''
AND '
              ELSE ' WHERE LTRIM(RTRIM(CONVERT(NVARCHAR,' + Columns.name + ',4000))) <> ''''
AND '
         END + Columns.name + ' IS NOT NULL 
AND ' + CASE WHEN Types.name = 'xml'
             THEN 'LEN(REPLACE(CONVERT(NVARCHAR(MAX),' + Columns.name + '), ''-'', '''')) > 12 
AND '
             ELSE 'LEN(REPLACE(CONVERT(NVARCHAR,' + Columns.name + ',4000), ''-'', '''')) > 12 
AND '
        END + 'DBA.[AUDIT].CreditCardMatch(REPLACE(CONVERT(NVARCHAR,' + Columns.name + ',4000), ''-'', ''''))  IS NOT NULL'
FROM sys.tables AS Tables
    INNER JOIN sys.columns AS Columns
        ON Tables.object_id = Columns.object_id
    INNER JOIN sys.types AS Types
        ON Columns.system_type_id = Types.system_type_id
WHERE Types.name IN ( 'text', 'real', 'money', 'float', 'sql_variant', 'ntext', 'numeric', 'bigint', 'varchar', 'char',
                      'nvarchar', 'nchar', 'xml' )
      AND Tables.name NOT IN ( 'CCNumQueries', 'CCNumResults' );

On the surface, it doesn’t appear that there is much difference. Because of that, I left a note in the script to help identify what changed and why. The why comes down to the error “Msg 6358, Level 16, State 1: 4000 is not a valid style number when converting to XML”. As I dug into the problems, I found the error occurred only on XML columns in which a credit card number happened to be stored. The easy fix is to check the data type of the column before building out the “query statement” and then, based on the data type, perform one or the other of the type conversions.
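
One piece worth calling out: the DBA.[AUDIT].CreditCardMatch function used by the generated statements comes from Macaalay’s article rather than from anything shown here. Purely for illustration, a minimal compatible sketch might look like the following; the real function covers far more card patterns than this:

USE DBA;
GO
/* Hypothetical sketch of a compatible CreditCardMatch function.
   Returns the input when it appears to contain a card number, else NULL.
   The real function in the referenced article is far more thorough -
   this covers only a single sample pattern. */
CREATE FUNCTION [AUDIT].CreditCardMatch (@InputString NVARCHAR(4000))
RETURNS NVARCHAR(4000)
AS
BEGIN
    DECLARE @Result NVARCHAR(4000) = NULL;

    /* 16 consecutive digits starting with 4 (Visa) or 5 (MasterCard) */
    IF @InputString LIKE '%[45]' + REPLICATE('[0-9]', 15) + '%'
        SET @Result = @InputString;

    RETURN @Result;
END;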

As you have likely noted, there is a second level of automation here – code writing code. This greatly simplifies the task at hand and allows the data professional to move on to other tasks and priorities whilst the script is executing and doing its required job.
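
To make the code-writing-code idea concrete, here is the sort of statement the script generates for a single column. The table and column names (dbo.Payments, CardNum) are hypothetical, purely for illustration:

/* Example of one generated statement, for a varchar column CardNum
   in a hypothetical table dbo.Payments */
INSERT INTO DBA.[AUDIT].CCNumResults
SELECT 'CardNum' as ColumnName,
       'Payments' as TableName,
       REPLACE(CONVERT(NVARCHAR, CardNum, 4000), '-', '') AS RecordWithCreditCardInformation
FROM dbo.Payments
WHERE LTRIM(RTRIM(CONVERT(NVARCHAR, CardNum, 4000))) <> ''
      AND CardNum IS NOT NULL
      AND LEN(REPLACE(CONVERT(NVARCHAR, CardNum, 4000), '-', '')) > 12
      AND DBA.[AUDIT].CreditCardMatch(REPLACE(CONVERT(NVARCHAR, CardNum, 4000), '-', '')) IS NOT NULL;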

Once the enhancements to Raymund’s solution were finalized and tested, I was able to deploy this to 20+ instances for the client and let the automation just run. Most of the instances took 4-7 days to complete the first time around. Since the job is already deployed, we can run it again on each server once (and if) any of the data issues are resolved, to validate our compliance.
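
Since the job only picks up rows where both tracking flags are still 0, kicking off a fresh pass is just a matter of resetting the tracking table (clearing the captured errors at the same time is my preference, not a requirement):

/* Reset the tracking flags so the agent job picks every statement up again */
UPDATE cq
SET cq.QueryStatus = 0,
    cq.QueryFailed = 0,
    cq.ErrMsg = NULL,
    cq.ErrNum = NULL
FROM DBA.[AUDIT].CCNumQueries cq;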

Wrapping it Up

In this article, I have shown how to use the power of automation to help identify the risk of having card data stored in the database. Identifying this risk is essential to maintaining the trust of your clientele and protecting your company’s most valuable asset – the data!

Feel free to explore some of the other TSQL Tuesday posts I have written.

Are you interested in more articles showing what and how to audit? I recommend reading some of my auditing articles. For some “back to basics” related articles, feel free to read here.

The post Automate Credit Card Audits first appeared on SQL RNNR.
