SQLServerCentral is supported by Red Gate Software Ltd.
 



When is the database NOT in use?
Posted Wednesday, March 26, 2014 6:17 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, July 17, 2014 1:18 PM
Points: 15, Visits: 62
I have a very large system (32 databases, 7.5 TB of data in total), growing at a rate of one database a month, with a projected 1.5 TB of additional data per year in 2014.

I have been asked to generate a weekly report (6am Monday) showing the periods of time (date : hh : min, start/end) during which there was NO activity on each database. No reads/writes/DDL/DML/backups, etc. No one touching it.

I can pull and monitor for many types of activity, pull reports when things are run, profile/trace, etc.

But what can be used (SQL Server, Perfmon, SolarWinds) at a database level?

Many of the databases interact with each other. Reports are pulled across 2, 3, or 6 databases.

Goal: find time windows for updates/DDL/DML/ETL when there is no activity, and minimize the downtime for the system, database by database.

Post #1555233
Posted Wednesday, March 26, 2014 10:47 PM
SSCrazy


Group: General Forum Members
Last Login: Yesterday @ 2:56 AM
Points: 2,603, Visits: 2,061
SQL Server Profiler is the best tool to use. You can apply the appropriate filters as per your requirement. However, just keep in mind that Profiler is resource-intensive and should be avoided on a production box.

---------------------------------------------------
"There are only 10 types of people in the world:
Those who understand binary, and those who don't."
Post #1555254
Posted Thursday, March 27, 2014 4:30 AM


SSCoach


Group: General Forum Members
Last Login: Today @ 11:27 AM
Points: 15,498, Visits: 27,884
I don't agree that Profiler is the best tool. In fact, Profiler, the GUI, is a somewhat dangerous tool to point at a production server. It can actually cause performance problems.

My own preference for something like this would be to use extended events. They're the most lightweight method you can run to capture the metrics you need. And what you need doesn't have to be all the queries being run on the system (although that could be interesting, it would be a ton of data). You can just monitor for connections to see when the systems are being accessed. But the problem with extended events in 2008 is that there is no GUI and the output is in XML, so you'd have to write queries to read it.
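[Editor's note: a minimal sketch of what such a connection-tracking session might look like. The event and target names should be verified against sys.dm_xe_objects on your build, and the file paths are placeholders.]

-- Sketch only: capture logins to a file target (SQL Server 2008-era syntax).
CREATE EVENT SESSION TrackConnections ON SERVER
ADD EVENT sqlserver.login
    (ACTION (sqlserver.client_hostname, sqlserver.database_id))
ADD TARGET package0.asynchronous_file_target
    (SET filename     = N'C:\XEvents\TrackConnections.xel',
         metadatafile = N'C:\XEvents\TrackConnections.xem');

ALTER EVENT SESSION TrackConnections ON SERVER STATE = START;

-- The output is XML, so reading it back means shredding it yourself:
SELECT CAST(event_data AS XML) AS event_xml
FROM sys.fn_xe_file_target_read_file
     (N'C:\XEvents\TrackConnections*.xel', N'C:\XEvents\TrackConnections*.xem', NULL, NULL);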

That brings up trace events and the server-side trace. You can generate the T-SQL script to define this using the Profiler GUI, without connecting it to production. Same thing: you can monitor each query, or you can monitor connections, or both. I'd start with connections to see if that gets you what you need (least harm, least impact as a starting point). The nice thing about using trace events is that you can use the Profiler GUI to look at them. It's easier than extended events, but it puts more of a load on the system.
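[Editor's note: a sketch of the kind of server-side trace script the Profiler GUI would export, reduced to just the Audit Login event. The event and column ids (14 = Audit Login; 11 = LoginName, 14 = StartTime, 35 = DatabaseName) are the documented trace ids; the file path is a placeholder, and SQL Server appends the .trc extension itself.]

DECLARE @TraceID int;
DECLARE @maxfilesize bigint;
DECLARE @on bit;
SET @maxfilesize = 50;
SET @on = 1;

-- Create the trace (stopped); 0 = no rollover options set.
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\Connections', @maxfilesize, NULL;

-- Event 14 = Audit Login, with LoginName, StartTime, and DatabaseName columns.
EXEC sp_trace_setevent @TraceID, 14, 11, @on;
EXEC sp_trace_setevent @TraceID, 14, 14, @on;
EXEC sp_trace_setevent @TraceID, 14, 35, @on;

-- Start the trace.
EXEC sp_trace_setstatus @TraceID, 1;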


----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #1555345
Posted Thursday, March 27, 2014 5:16 AM
SSCrazy


Group: General Forum Members
Last Login: Yesterday @ 2:56 AM
Points: 2,603, Visits: 2,061
Yes, I missed extended events. Thank you Grant.

---------------------------------------------------
"There are only 10 types of people in the world:
Those who understand binary, and those who don't."
Post #1555363
Posted Thursday, March 27, 2014 5:21 AM
Grasshopper


Group: General Forum Members
Last Login: Thursday, July 17, 2014 1:18 PM
Points: 15, Visits: 62
I am looking at various DMVs to see when there is activity. Pointing out when there is activity is easy; punching holes in the data is the hard part. I can see when connections are made and when a query is executed, but I haven't found a good way to see the absence of activity. I can see connections that are sleeping, but how long have they been sleeping?

I did some testing: I started 20 connections over a 3-hour period and made them run for various lengths of time. I can see when they run, but not always how long, and I had one I knew would run 5 hours (a 1-billion-row table).

I am also looking into Perfmon data elements, to see if there is something that can help identify lack of activity.

Part of the issue is that 1 connection running for 2 hours looks the same to me as 500 connections over a 10-minute period. I am looking for 0 connections/connectivity, trying to find windows of 30 minutes or more on individual databases. I am thinking of a job running every "x" minutes, looking at the connections and their state, to see if I can come up with an algorithm based on that data.
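[Editor's note: a sketch of the per-database snapshot such a job could take; table and column names are illustrative. On 2008, sys.sysprocesses is used here because sys.dm_exec_sessions does not expose a database id until SQL Server 2012.]

-- One-time setup: an audit table for the samples.
CREATE TABLE dbo.ConnectionSnapshot
(
    SampleTime   datetime NOT NULL DEFAULT GETDATE(),
    DatabaseName sysname  NOT NULL,
    SessionCount int      NOT NULL,
    RunningCount int      NOT NULL
);

-- The body of the Agent job, run every "x" minutes: count user sessions per database.
INSERT INTO dbo.ConnectionSnapshot (DatabaseName, SessionCount, RunningCount)
SELECT  DB_NAME(p.dbid),
        COUNT(*),
        SUM(CASE WHEN p.status IN ('running', 'runnable') THEN 1 ELSE 0 END)
FROM    master.sys.sysprocesses AS p
WHERE   p.spid > 50                     -- skip system sessions
GROUP BY DB_NAME(p.dbid);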

Thank you for the suggestions. I will investigate further.
Post #1555366
Posted Thursday, March 27, 2014 5:53 AM


SSChampion


Group: General Forum Members
Last Login: Today @ 11:08 AM
Points: 12,876, Visits: 31,786
Aaron Bertrand has a post on last-accessed tracking that covers a couple of different approaches, including a script that I've modified a tiny bit in the past:

http://sqlblog.com/blogs/aaron_bertrand/archive/2008/05/06/when-was-my-database-table-last-accessed.aspx

With that script, if you ran it in a job and stuffed the results into an audit table you'd create, running that job say every five minutes, you could find, within five-minute increments, when a database is not accessed.

Since it's hitting a DMV, it only covers activity since the server last restarted.

This is my modification, which tracks when each table was last read or updated:
;WITH myCTE AS
(
    SELECT
        DB_NAME(database_id)                AS TheDatabase,
        OBJECT_NAME(object_id, database_id) AS TheTableName,
        last_user_seek,
        last_user_scan,
        last_user_lookup,
        last_user_update
    FROM sys.dm_db_index_usage_stats
)
SELECT
    ServerRestartedDate = (SELECT create_date FROM sys.databases WHERE name = 'tempdb'),
    x.TheDatabase,
    x.TheTableName,
    MAX(x.last_read)  AS last_read,
    MAX(x.last_write) AS last_write
FROM
(
    SELECT TheDatabase, TheTableName, last_user_seek AS last_read, NULL AS last_write FROM myCTE
    UNION ALL
    SELECT TheDatabase, TheTableName, last_user_scan,   NULL FROM myCTE
    UNION ALL
    SELECT TheDatabase, TheTableName, last_user_lookup, NULL FROM myCTE
    UNION ALL
    SELECT TheDatabase, TheTableName, NULL, last_user_update FROM myCTE
) AS x
GROUP BY x.TheDatabase, x.TheTableName
ORDER BY x.TheDatabase, x.TheTableName;




Lowell

--There is no spoon, and there's no default ORDER BY in sql server either.
Actually, Common Sense is so rare, it should be considered a Superpower. --my son
Post #1555374
Posted Thursday, March 27, 2014 6:13 AM


SSChampion


Group: General Forum Members
Last Login: Today @ 11:08 AM
Points: 12,876, Visits: 31,786
OK, I did a little bit more: I'm joining against all databases, so I can find the ones that were never accessed. On my test server I see a lot of databases that fit that description; on production, there's just one database that hasn't been touched since the last reboot, and all the others show activity more frequently.

Hope this helps:
;WITH myCTE AS
(
    SELECT
        DB_NAME(database_id)                AS TheDatabase,
        OBJECT_NAME(object_id, database_id) AS TheTableName,
        last_user_seek,
        last_user_scan,
        last_user_lookup,
        last_user_update
    FROM sys.dm_db_index_usage_stats
),
AllDbs AS
(
    SELECT name AS TheDatabase FROM master.sys.databases
)
SELECT
    ServerRestartedDate = (SELECT create_date FROM sys.databases WHERE name = 'tempdb'),
    db.TheDatabase,
    MAX(x.last_read)  AS last_read,
    MAX(x.last_write) AS last_write
FROM AllDbs AS db
LEFT OUTER JOIN
(
    SELECT TheDatabase, TheTableName, last_user_seek AS last_read, NULL AS last_write FROM myCTE
    UNION ALL
    SELECT TheDatabase, TheTableName, last_user_scan,   NULL FROM myCTE
    UNION ALL
    SELECT TheDatabase, TheTableName, last_user_lookup, NULL FROM myCTE
    UNION ALL
    SELECT TheDatabase, TheTableName, NULL, last_user_update FROM myCTE
) AS x
    ON db.TheDatabase = x.TheDatabase
GROUP BY db.TheDatabase
ORDER BY db.TheDatabase;




Lowell

--There is no spoon, and there's no default ORDER BY in sql server either.
Actually, Common Sense is so rare, it should be considered a Superpower. --my son
Post #1555379
Posted Thursday, March 27, 2014 6:20 AM


SSCoach


Group: General Forum Members
Last Login: Today @ 11:27 AM
Points: 15,498, Visits: 27,884
If you're going to use sys.dm_exec_requests to monitor, you'll have to capture it over and over again, and then write a query that shows the changes over time. As far as the sleeping connections go, that's part of connection pooling that you have to account for. I'd still prefer extended/trace events as the mechanism.
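[Editor's note: a sketch of the "changes over time" query, assuming an audit table dbo.ConnectionSnapshot(DatabaseName, SampleTime, ...) populated every few minutes by a job, with a row written for a database only when it had sessions; the names are illustrative. A long gap between consecutive rows for a database is then a quiet window.]

;WITH Ordered AS
(
    SELECT  DatabaseName,
            SampleTime,
            ROW_NUMBER() OVER (PARTITION BY DatabaseName ORDER BY SampleTime) AS rn
    FROM    dbo.ConnectionSnapshot
)
SELECT  a.DatabaseName,
        a.SampleTime AS last_seen_active,
        b.SampleTime AS next_seen_active,
        DATEDIFF(minute, a.SampleTime, b.SampleTime) AS quiet_minutes
FROM    Ordered AS a
JOIN    Ordered AS b
    ON  b.DatabaseName = a.DatabaseName
    AND b.rn = a.rn + 1
WHERE   DATEDIFF(minute, a.SampleTime, b.SampleTime) >= 30   -- windows of 30 minutes or more
ORDER BY a.DatabaseName, a.SampleTime;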

----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #1555383
Posted Thursday, March 27, 2014 7:00 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, July 17, 2014 1:18 PM
Points: 15, Visits: 62
Thank you very much. I will be reviewing and testing this over the next few days.

I will post updates and my final solution.

Joe
Post #1555728
Posted Thursday, March 27, 2014 10:12 PM
SSC Veteran


Group: General Forum Members
Last Login: Yesterday @ 4:34 PM
Points: 211, Visits: 819
Grant Fritchey (3/27/2014)
It can actually cause performance problems.


From what I've seen where people actually tested this out, the overhead is no more than 10%. I guess there are environments where 10% is unacceptable, but for the vast majority (according to Microsoft, who say we're all on over-provisioned hardware, hence the core licensing) it's a drop in the ocean.

Do you disagree?
Post #1555753