Database monitoring

  • Currently, various teams access the database. For cost-allocation purposes, we need to track usage.

    Is there an efficient way to monitor user access to the database?

    Can we track which user has executed which query (SELECT, INSERT, etc.), the login time, and similar parameters?

    Thanks in advance.

  • There isn't a good way to do this. SQL Server doesn't really track resource usage by user and aggregate it. It can track some things by the resource accessed, but not by user.

    You could run a trace that captures users and just something like CPU or reads and then create some cost back calculation from that data.
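    As a rough sketch, assuming a server-side trace has been capturing LoginName, CPU, Reads, and Writes to a file (the path below is illustrative, and any cost weighting is your own invention), the back-calculation could start from something like:

    ```sql
    -- Roll up captured trace data per login. The file path is a placeholder,
    -- and turning CPU/reads/writes into a "cost" figure is an assumption.
    SELECT
        LoginName,
        SUM(CPU)    AS total_cpu_ms,
        SUM(Reads)  AS total_reads,
        SUM(Writes) AS total_writes
    FROM sys.fn_trace_gettable(N'C:\Traces\usage_trace.trc', DEFAULT)
    WHERE LoginName IS NOT NULL
    GROUP BY LoginName
    ORDER BY total_cpu_ms DESC;
    ```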

    It's an interesting idea, and if you come up with something, I'd love to see you write something up on how you implement it.

  • Yes, but it isn't cheap (at least some years back when I evaluated it for a client).

    http://www.whitesands.com

    They have some really cool and useful capabilities.

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

  • It's not terribly difficult if you modify the default trace to capture the data you want and then set up a nightly agent job to pull the trace data you require into a table for analysis. Beware that you may get a lot more data than you expect! You'll probably want the nightly job to store just roll-up statistics, to avoid creating a monster table of useless data.
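    A minimal sketch of such a nightly roll-up, assuming a hypothetical summary table dbo.TraceUsageDaily and that the trace files contain the columns you need:

    ```sql
    -- Find the current default trace file and roll up the last day's rows.
    -- dbo.TraceUsageDaily is a hypothetical summary table.
    DECLARE @path nvarchar(260) =
        (SELECT path FROM sys.traces WHERE is_default = 1);

    INSERT INTO dbo.TraceUsageDaily
        (collection_date, login_name, event_count, total_cpu_ms, total_reads)
    SELECT
        CAST(GETDATE() AS date),
        LoginName,
        COUNT(*),
        SUM(CPU),
        SUM(Reads)
    FROM sys.fn_trace_gettable(@path, DEFAULT)
    WHERE StartTime >= DATEADD(DAY, -1, GETDATE())
      AND LoginName IS NOT NULL
    GROUP BY LoginName;
    ```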

  • Since we're talking SQL Server 2012, I strongly advise against using trace for this. Extended events are much more efficient at this type of data collection. The amount of data collected is the same, so the warning about being prepared for it still applies.
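    For example, an extended events session along these lines (the session name, file name, and event/action choices below are illustrative):

    ```sql
    -- Capture completed batches and RPC calls along with the issuing login.
    CREATE EVENT SESSION [UsageTracking] ON SERVER
    ADD EVENT sqlserver.sql_batch_completed
        (ACTION (sqlserver.username, sqlserver.client_hostname)),
    ADD EVENT sqlserver.rpc_completed
        (ACTION (sqlserver.username, sqlserver.client_hostname))
    ADD TARGET package0.event_file
        (SET filename = N'UsageTracking.xel', max_file_size = (256));

    ALTER EVENT SESSION [UsageTracking] ON SERVER STATE = START;
    ```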

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • The DMV sys.dm_exec_query_stats accumulates performance statistics about cached query plans, which makes it well suited to tracking the expense of queries. That's the easy part; what's missing is an elegant way to tie this back to accounts. For example, which (or at least how many) of those executions were performed by which users, and when?

    http://sqlserverperformance.wordpress.com/2008/01/21/five-dmv-queries-that-will-make-you-a-superhero/
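    For instance, a common pattern (illustrative, not taken from the linked article) for surfacing the most expensive cached queries:

    ```sql
    -- Top cached plans by accumulated CPU time since they were cached.
    -- Note: total_worker_time is reported in microseconds.
    SELECT TOP (20)
        qs.execution_count,
        qs.total_worker_time   AS total_cpu_us,
        qs.total_logical_reads,
        st.text                AS query_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_worker_time DESC;
    ```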

    As it is now, just joining cached plans back to sessions would be a hack. You can use SQL Server Audit (which leverages extended events) to audit which users executed which queries and when, and there are some 3rd-party monitoring solutions that probably do basically the same thing, but take care of the configuration and implementation details for you.

    http://solutioncenter.apexsql.com/auditing-select-statements-on-sql-server/
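    As a sketch, a SQL Server Audit setup along these lines (the audit name, file path, and database name are all placeholders):

    ```sql
    -- Server audit writing to a file, plus a database audit specification
    -- that records SELECT statements by any login.
    USE master;
    CREATE SERVER AUDIT [UsageAudit]
        TO FILE (FILEPATH = N'C:\Audits\');
    ALTER SERVER AUDIT [UsageAudit] WITH (STATE = ON);

    USE YourDatabase;  -- placeholder database name
    CREATE DATABASE AUDIT SPECIFICATION [UsageAuditSpec]
        FOR SERVER AUDIT [UsageAudit]
        ADD (SELECT ON DATABASE::[YourDatabase] BY [public])
        WITH (STATE = ON);
    ```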

    You could poll sys.dm_exec_connections and sys.dm_exec_sessions once a minute to gather accumulated counts of reads, writes, and CPU, perhaps with a job that inserts the results into a table for reporting purposes. This provides only generalized performance counts, but it would be a relatively lightweight solution compared to active tracing or event auditing on a heavily used server.

    SELECT
        c.session_id,
        c.client_net_address,
        c.connect_time,
        c.last_read  AS connection_last_read,
        c.num_reads  AS connection_reads,
        c.last_write AS connection_last_write,
        c.num_writes AS connection_writes,
        s.session_id,
        s.login_time,
        s.last_request_start_time,
        s.login_name,
        s.host_name,
        s.nt_domain,
        s.nt_user_name,
        s.cpu_time,
        s.reads  AS session_reads,
        s.writes AS session_writes
    FROM sys.dm_exec_connections AS c
    JOIN sys.dm_exec_sessions AS s
        ON c.session_id = s.session_id
    ORDER BY (c.num_reads + c.num_writes) DESC;

    If all database access is via stored procedure calls, another approach would be to add code that inserts into an audit table the login account name and start/end times, perhaps also leveraging the sys views for session reads and writes. This would be most useful in a reporting environment where procedure calls are relatively infrequent and resource intensive. It wouldn't work well in a high-activity OLTP database, where inserting into the audit table would itself impact performance.
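    A minimal sketch of that pattern, with a hypothetical audit table and procedure name:

    ```sql
    -- dbo.ProcAudit and dbo.usp_ExampleReport are hypothetical names.
    CREATE TABLE dbo.ProcAudit (
        login_name sysname      NOT NULL,
        proc_name  sysname      NULL,
        start_time datetime2    NOT NULL,
        end_time   datetime2    NOT NULL
    );
    GO
    CREATE PROCEDURE dbo.usp_ExampleReport
    AS
    BEGIN
        DECLARE @start datetime2 = SYSDATETIME();

        -- ... the procedure's actual work goes here ...

        INSERT INTO dbo.ProcAudit (login_name, proc_name, start_time, end_time)
        VALUES (ORIGINAL_LOGIN(), OBJECT_NAME(@@PROCID), @start, SYSDATETIME());
    END;
    ```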

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
