SQLServerCentral is supported by Redgate

Database monitoring
Posted Tuesday, April 22, 2014 9:32 AM
There are currently various teams accessing the database, and for costing reasons we need to track usage.
Is there an efficient way to monitor user access to the database?
Can we track which user has executed which query (SELECT, INSERT, etc.), the login time, and similar parameters?

Thanks in advance.
Post #1563910
Posted Tuesday, April 22, 2014 9:39 AM
There isn't a good way to do this. SQL Server doesn't really track resource usage by user and aggregate it. It can track some things by the resource accessed, but not really by user.

You could run a trace that captures the user along with something like CPU or reads, and then build a cost-back calculation from that data.
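A minimal sketch of that cost-back calculation, assuming the trace output has already been loaded into a table (dbo.TraceUsage is a hypothetical name, with LoginName, CPU, Reads, and Writes columns taken from the trace):

```sql
-- Hypothetical: trace rows already loaded into dbo.TraceUsage.
SELECT
    LoginName,
    SUM(CPU)   AS total_cpu_ms,
    SUM(Reads) AS total_reads,
    -- Each login's share of total CPU, usable as a cost-allocation ratio.
    CAST(SUM(CPU) AS decimal(18, 4))
        / NULLIF(SUM(SUM(CPU)) OVER (), 0) AS cpu_share
FROM dbo.TraceUsage
GROUP BY LoginName
ORDER BY total_cpu_ms DESC;
```

The cpu_share column is the part that turns raw counters into a chargeback number: multiply it by whatever the server costs per billing period.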

It's an interesting idea, and if you come up with something, I'd love to see you write something up on how you implement it.

Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1563914
Posted Tuesday, April 22, 2014 9:48 AM
Yes, but it isn't cheap (at least some years back when I evaluated it for a client).

They have some really cool and useful capabilities.


Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru at GMail
Post #1563925
Posted Tuesday, April 22, 2014 1:58 PM
It's not terribly difficult if you modify the default trace to capture the data you want and then set up a nightly agent job to pull the trace data you require into a table for analysis. Beware that you may get a lot more data than you expect! You'll probably want the nightly job to store just roll-up statistics, to avoid creating a monster table of useless data.
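A minimal sketch of pulling trace data into a queryable form with sys.fn_trace_gettable, shown here against the default trace for simplicity; the path, events, and available columns will vary with the instance and the trace definition:

```sql
-- Locate the default trace file and read it as a table.
DECLARE @path nvarchar(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT
    t.LoginName,
    t.DatabaseName,
    t.StartTime,
    te.name AS event_name
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
JOIN sys.trace_events AS te
    ON t.EventClass = te.trace_event_id
WHERE t.LoginName IS NOT NULL
ORDER BY t.StartTime DESC;
```

A nightly job would aggregate this (e.g., GROUP BY LoginName) into a roll-up table rather than keeping every row.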
Post #1564018
Posted Wednesday, April 23, 2014 3:46 AM
Since we're talking SQL Server 2012, I strongly advise against using trace for this. Extended events are much more efficient at this type of data collection. The amount of data collected is the same, so the warning about being prepared for it still applies.
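A minimal Extended Events sketch along those lines, capturing completed batches with the user and resource columns; the session name and file target path are placeholders:

```sql
-- sql_batch_completed carries cpu_time, logical_reads, and writes by default;
-- the ACTIONs attach the user, database, and statement text to each event.
CREATE EVENT SESSION [TrackUserCost] ON SERVER
ADD EVENT sqlserver.sql_batch_completed
(
    ACTION
    (
        sqlserver.username,
        sqlserver.database_name,
        sqlserver.sql_text
    )
)
ADD TARGET package0.event_file
(
    SET filename = N'C:\Temp\TrackUserCost.xel'  -- placeholder path
);

ALTER EVENT SESSION [TrackUserCost] ON SERVER STATE = START;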

"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of:
SQL Server Query Performance Tuning
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #1564172
Posted Thursday, May 1, 2014 9:06 AM (Answer marked as solution; this worked for the OP)
The DMV sys.dm_exec_query_stats accumulates performance statistics about cached query plans, which is perfect for tracking the expense of queries. That's easy; the hard part is finding an elegant way to tie it back to accounts: for example, which (or at least how many) of those executions were performed by which users, and when?
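For reference, a standard sort of query over that DMV, ranking cached statements by CPU; note that no user column is available here, which is exactly the limitation described:

```sql
-- Top cached statements by accumulated worker time (CPU).
SELECT TOP (20)
    qs.execution_count,
    qs.total_worker_time,
    qs.total_logical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```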

As it stands, just joining cached plans back to sessions would be a hack. You can use SQL Server Audit (which leverages Extended Events) to record which users executed which queries and when, and there are third-party monitoring solutions that probably do basically the same thing but take care of the configuration and implementation details for you.

You can do something like poll sys.dm_exec_connections and sys.dm_exec_sessions once a minute to gather accumulated counts of reads, writes, and CPU, perhaps with a job that inserts the results into a table for reporting purposes. This provides only generalized performance counts, but it is a relatively lightweight solution compared to active tracing or event auditing on a heavily used server.
SELECT
    s.session_id,
    s.login_name,
    c.last_read  AS connection_last_read,
    c.num_reads  AS connection_reads,
    c.last_write AS connection_last_write,
    c.num_writes AS connection_writes,
    s.reads      AS session_reads,
    s.writes     AS session_writes
FROM sys.dm_exec_connections AS c
JOIN sys.dm_exec_sessions AS s
    ON c.session_id = s.session_id
ORDER BY (c.num_reads + c.num_writes) DESC;

If all database access is via stored procedure calls, another approach would be to add code in each procedure that inserts into an audit table the login account name and start/end times, perhaps also leveraging the session DMVs for reads and writes. This would be more useful in a reporting environment, where procedure calls are relatively infrequent and resource intensive. It wouldn't work well in a high-activity OLTP database, where inserting into the audit table would itself impact performance.
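A minimal sketch of that per-procedure audit pattern; the table and procedure names are hypothetical:

```sql
-- Hypothetical audit table for per-call logging.
CREATE TABLE dbo.ProcAudit
(
    audit_id       int IDENTITY PRIMARY KEY,
    proc_name      sysname,
    login_name     sysname,
    start_time     datetime2,
    end_time       datetime2,
    session_reads  bigint,
    session_writes bigint
);
GO
CREATE PROCEDURE dbo.usp_SomeReport
AS
BEGIN
    DECLARE @start datetime2 = SYSDATETIME();

    -- ... the procedure's real work goes here ...

    -- Log who called, when, and the session's accumulated I/O so far.
    INSERT dbo.ProcAudit (proc_name, login_name, start_time, end_time,
                          session_reads, session_writes)
    SELECT OBJECT_NAME(@@PROCID), ORIGINAL_LOGIN(), @start, SYSDATETIME(),
           s.reads, s.writes
    FROM sys.dm_exec_sessions AS s
    WHERE s.session_id = @@SPID;
END;
```

Note that sys.dm_exec_sessions reports counts accumulated over the whole session, not just this call; a per-call delta would require recording the counters at both the start and the end of the procedure.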

"The universe is complicated and for the most part beyond your control, but your life is only as complicated as you choose it to be."
Post #1566745