Logon Triggers: a good idea, or a nerve-racking experience?
Posted Thursday, January 12, 2012 8:37 AM
In case you decide to go the Event Notification route, here's a generic sample of how to implement it. All this does is maintain a table of logins and the last time they were used. Though not done in this example, you can also configure all your servers to log to a central repository.

 
USE [master];
GO

IF EXISTS (SELECT name FROM sys.databases WHERE name = N'Audit')
    DROP DATABASE [Audit];
GO

CREATE DATABASE [Audit];
GO

ALTER DATABASE [Audit] SET ENABLE_BROKER;
GO

USE [Audit];
GO

CREATE TABLE dbo.AuditLoginsLog (
    LoginName varchar(128),
    LastLogin datetime);
GO

--CREATE SERVICE PROCEDURE
--This activation procedure drains the queue and upserts one row per login
CREATE PROCEDURE [dbo].[auditLogins_usp]
AS
BEGIN
    SET NOCOUNT ON;
    SET ARITHABORT ON;

    DECLARE @message XML,
            @messageName NVARCHAR(256),
            @dialogue UNIQUEIDENTIFIER;

    BEGIN TRY
        --Continuous loop: runs until the queue is empty
        WHILE (1 = 1)
        BEGIN
            BEGIN TRANSACTION;

            --Wait up to 2 seconds for the next conversation group
            SET @dialogue = NULL;

            WAITFOR (
                GET CONVERSATION GROUP @dialogue FROM dbo.AuditLoginsQueue
            ), TIMEOUT 2000;

            IF @dialogue IS NULL
            BEGIN
                ROLLBACK;
                BREAK;
            END;

            --Retrieve the next message from the queue
            RECEIVE TOP(1)
                @messageName = message_type_name,
                @message = message_body,
                @dialogue = conversation_handle
            FROM dbo.AuditLoginsQueue
            WHERE conversation_group_id = @dialogue;

            --Update the login's timestamp if we've seen it before; otherwise insert it
            IF EXISTS (SELECT LoginName FROM AuditLoginsLog
                       WHERE LoginName = @message.value('(/EVENT_INSTANCE/LoginName)[1]', 'VARCHAR(128)'))
                UPDATE AuditLoginsLog
                SET LastLogin = @message.value('(/EVENT_INSTANCE/PostTime)[1]', 'DATETIME')
                WHERE LoginName = @message.value('(/EVENT_INSTANCE/LoginName)[1]', 'VARCHAR(128)')
            ELSE
                INSERT INTO AuditLoginsLog VALUES (
                    @message.value('(/EVENT_INSTANCE/LoginName)[1]', 'VARCHAR(128)'),
                    @message.value('(/EVENT_INSTANCE/PostTime)[1]', 'DATETIME'))

            COMMIT;
        END
    END TRY
    BEGIN CATCH
        DECLARE @errorNumber INT, @errorMessage NVARCHAR(MAX), @errorState INT,
                @errorSeverity INT, @errorLine INT, @errorProcedure NVARCHAR(128);
        SET @errorNumber = ERROR_NUMBER();
        SET @errorMessage = ERROR_MESSAGE();
        SET @errorState = ERROR_STATE();
        SET @errorSeverity = ERROR_SEVERITY();
        SET @errorLine = ERROR_LINE();
        SET @errorProcedure = ERROR_PROCEDURE();
        IF XACT_STATE() <> 0
            ROLLBACK;
        RAISERROR('%s:%d %s (%d)', @errorSeverity, @errorState,
                  @errorProcedure, @errorLine, @errorMessage, @errorNumber) WITH LOG;
    END CATCH
END
GO

--CREATE QUEUE, with activation tied to the procedure above
CREATE QUEUE AuditLoginsQueue
WITH STATUS = ON,
ACTIVATION (
    PROCEDURE_NAME = dbo.auditLogins_usp,
    MAX_QUEUE_READERS = 2,
    EXECUTE AS SELF);
GO

--CREATE SERVICE on the queue, using the built-in event notification contract
CREATE SERVICE AuditLoginsService
ON QUEUE [AuditLoginsQueue]
([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);
GO

--CREATE ROUTE (local, since the service lives on this instance)
CREATE ROUTE AuditLoginsRoute
WITH SERVICE_NAME = 'AuditLoginsService',
ADDRESS = 'Local';
GO

--CREATE EVENT NOTIFICATION for login audit events, delivered to the service
CREATE EVENT NOTIFICATION AuditLogins_event
ON SERVER
WITH FAN_IN
FOR AUDIT_LOGIN
TO SERVICE 'AuditLoginsService', 'current database';
GO
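
Once it's all in place, you can check the results with a simple query against the log table the script creates:

```sql
-- Most recently used logins first
SELECT LoginName, LastLogin
FROM Audit.dbo.AuditLoginsLog
ORDER BY LastLogin DESC;
```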







Colleen M. Morrow
Cleveland DBA
Post #1234891
Posted Thursday, January 12, 2012 8:45 AM
Leave off the file number, and use "default" for the number of files to read, and it will start with the first existing file and read through to the end.

That can be a problem if you don't roll files over, or have huge files, in which case reading one at a time might be better. SSIS can run a Foreach Loop over the files in a directory, and can feed the file name to a variable used in a dynamic query. That might be better if you have a lot of big files and need to break the job down a bit for hardware resource reasons.

If the files are created with a standard name on every server, it should be possible to query them in such a way that you can dump them easily into a warehouse on a central server. Add @@servername as one of the columns in the query, to make sure you can tell what server the data came from.
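
For example, something like this (a sketch; the trace path is hypothetical and will vary per server):

```sql
-- "default" for the second argument starts at the first existing rollover
-- file and reads through to the end of the set
SELECT @@SERVERNAME AS ServerName, t.*
FROM sys.fn_trace_gettable(N'C:\Traces\AuditLogin.trc', default) AS t;
```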

If you're going to harden the data into a table, I suggest setting a relatively low rollover threshold for the trace, and importing to the table frequently. That way the table will grow, but the files won't. Otherwise, both will grow, and you'll have to allocate storage to files you won't really be querying directly anyway, since you'll use the table for that.

You've probably already thought of this, but I suggest importing fast into a staging table, and then loading from there into a more permanent table. You can dedupe in that process, instead of adding to the trace-query overhead by having dynamic Where clauses on that. The final table can have whatever indexes, constraints, etc., you find you need for whatever you're using the trace data for, while the staging table can be optimized for fast bulk loading.
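
A minimal sketch of that staging-then-harden step (table and column names here are illustrative, not from the scripts above):

```sql
-- Harden staged rows, skipping any the final table already has
INSERT INTO dbo.AuditLoginFinal (ServerName, LoginName, LoginTime)
SELECT s.ServerName, s.LoginName, s.LoginTime
FROM dbo.AuditLoginStaging AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.AuditLoginFinal AS f
                  WHERE f.ServerName = s.ServerName
                    AND f.LoginName  = s.LoginName
                    AND f.LoginTime  = s.LoginTime);

-- Clear the staging table for the next bulk load
TRUNCATE TABLE dbo.AuditLoginStaging;
```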


- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread

"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
Post #1234906
Posted Wednesday, January 18, 2012 1:37 AM
Thanks again for the very useful replies. I am currently implementing the Event Notification option on my management server. The reason I chose this instead of traces is that I don't need to create directories and organize access on the file systems of all the servers. (Our servers are unfortunately not standardized, so I would have had to do this manually on many of the 50 servers!)

And I thank you so much for the scripts, Colleen. They saved me a lot of googling and BOL-ing! I do have one more question about the CREATE QUEUE statement. From BOL I understand that 'EXECUTE AS SELF' means the sp will be executed with the privileges of the user who created the queue. If that's me, does it mean this will cause a problem when I leave the company and my AD account gets deleted? If so, what then is best practice, keeping in mind that some servers run under the Local System account and others under AD startup accounts?



Post #1237727
Posted Wednesday, January 18, 2012 6:57 AM
You could also use EXECUTE AS OWNER or specify another login. I use a generic "auditor" login that owns the Audit database. That allows it to create all of the Service Broker objects. To create the event notification, the login will need CREATE TRACE EVENT NOTIFICATION permission and a corresponding user in the msdb database (no special permissions needed there; it just needs to exist).
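
Something along these lines, assuming a SQL login named auditor (the name and password are illustrative):

```sql
USE [master];
CREATE LOGIN [auditor] WITH PASSWORD = 'StrongPasswordHere';  -- or FROM WINDOWS for an AD account
GRANT CREATE TRACE EVENT NOTIFICATION TO [auditor];

-- The login just needs to exist as a user in msdb; no extra permissions there
USE [msdb];
CREATE USER [auditor] FOR LOGIN [auditor];
```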



Colleen M. Morrow
Cleveland DBA
Post #1237929