DBs with log files larger than data files

  • Hi all,

Are there any scripts to identify databases whose log files are larger than their associated data files?  I've of course had a Google around, but no dice...

    Thanks,

    JC

  • Start with sys.master_files.  It should be a simple matter to query that to find out where the log file is the largest file, or whether all the log files add up to more than all the data files.  Beware, though - sys.master_files may not be accurate up to the minute.  If that's important to you, make sure you take it into consideration.

    John

Here's a basic query. It produces two rows per database: one with the total ROWS (data) size and one with the total LOG size.
    SELECT d.name AS databasename,
       f.type_desc,
       SUM(size) AS totalsize
    FROM sys.master_files f
      INNER JOIN sys.databases d
       ON d.database_id = f.database_id
    GROUP BY d.name,
       f.type_desc;

One way to build on this is to split it into two CTEs and compare the LOG total to the ROWS total for each database.

    A quick question, why do you want this? Is there some business reason?
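A sketch of that two-CTE comparison, following the query above (untested; note that `size` in sys.master_files is in 8 KB pages, so `* 8 / 1024` converts it to MB):

```sql
-- Compare total data size to total log size per database and
-- return only databases where the log is larger than the data.
WITH DataSize AS (
    SELECT database_id, SUM(size) AS data_pages
    FROM sys.master_files
    WHERE type_desc = 'ROWS'
    GROUP BY database_id
),
LogSize AS (
    SELECT database_id, SUM(size) AS log_pages
    FROM sys.master_files
    WHERE type_desc = 'LOG'
    GROUP BY database_id
)
SELECT d.name AS databasename,
       ds.data_pages * 8 / 1024 AS data_mb,   -- size is stored in 8 KB pages
       ls.log_pages * 8 / 1024 AS log_mb
FROM sys.databases d
    INNER JOIN DataSize ds ON ds.database_id = d.database_id
    INNER JOIN LogSize ls ON ls.database_id = d.database_id
WHERE ls.log_pages > ds.data_pages;
```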

  • Steve Jones - SSC Editor - Wednesday, October 10, 2018 3:26 PM


Hi Steve, yeah, I've now remembered this check comes bundled in sp_Blitz.  We're auditing our QA environment, and as part of that drive we wanted to highlight DBs whose logs were larger than the data.  Not that we're carrying out a witch hunt of any kind for neglectful app owners, mind; we just want to flag potential issues and add them to the case we're building for capacity planning.

I'd dig through any that have a larger log to be sure that log backups are running; verify this and audit to determine why. I've had a few with larger logs that were high-workload and mostly updates (few inserts).
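One way to check that log backups are actually running is to look at the backup history in msdb; a sketch (untested, and note msdb history can be purged, so an empty result isn't proof no backup ever ran):

```sql
-- Last full ('D') and log ('L') backup per database, from msdb history.
SELECT d.name AS databasename,
       MAX(CASE WHEN b.type = 'D' THEN b.backup_finish_date END) AS last_full_backup,
       MAX(CASE WHEN b.type = 'L' THEN b.backup_finish_date END) AS last_log_backup
FROM sys.databases d
    LEFT JOIN msdb.dbo.backupset b
       ON b.database_name = d.name
GROUP BY d.name
ORDER BY d.name;
```

A database in the FULL recovery model with no recent `last_log_backup` is a likely culprit for a log that has outgrown the data.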

Good luck. I'd be interested in your process here, if you'd want to write a two-to-three-page article on the why and the process for us to publish.

