Reducing Log File Size from 50GB

  • I have a database that is currently using 50GB: a 25GB data file and a 25GB log file.

    The DB is set to the simple recovery model and doesn't need to be changed, and I do a full backup once a night.

    However, when I run the Disk Usage report (DB > Reports > Standard Reports > Disk Usage), it always seems to show 99% unused file space.

    I do run a lot of jobs in the middle of the night (reports that use tempdb, etc.), but I have no idea where to find the "max used log file space" so that I can reduce the size of the log file appropriately. It must be using it during transactions, so I need a report that shows the maximum size reached, if possible, as it obviously doesn't need 25GB.

    Also, I am considering moving the system, which is currently on a Standard edition database, to the free Express edition on a cloud-based hosting service.

    I can handle not having SQL Server Agent, as I only require it for a .NET Windows Service (bot) that runs on the server and stores data in a DB, but the problem is the 10GB size limit. I could split the system up into multiple DBs, set them to TRUSTWORTHY and link them all together in the stored procs if I had to, but I would rather find the cause of the massive log file. If at some stage during my nightly jobs it really does use a lot of space, I need to find out about it.

    Does anyone have a script to show the sizes over time? I suppose I could use a timed job and log the space to a table every minute, but I was wondering if there was a DMV combination that would show me the same thing without having to write my own code.

    Thanks for any help in advance.

  • Sorry, I just realised this is currently a SQL 2005 DB, not SQL 2008, but I am using the 2008 Management Studio to connect to it from my PC.

    Also, I am hoping to use SQL 2012 Express when I move to a cloud-based service.

  • Rob Reid-246754 (7/30/2012)


    Does anyone have a script to show the sizes over time? I suppose I could use a timed job and log the space to a table every minute, but I was wondering if there was a DMV combination that would show me the same thing without having to write my own code.

    No DMV; the code you suggest is about the best. The % free comes from DBCC SQLPERF(LOGSPACE).
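
    If you just want to see where things stand right now, you can run it on its own; it returns one row per database showing the log size in MB and the percentage of the log space currently in use:

    DBCC SQLPERF(LOGSPACE);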

    You'll probably find that index rebuilds are when the log file gets closest to full.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • So are you suggesting I run a SQL Agent job to fire that DBCC command every minute around the time of the backup and store the results in a database table to look at later?

  • That works. Maybe every 5 min all day long.
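
    Something along these lines would do as the job step (untested; the table and column names are just examples):

    create table dbo.LogSpaceHistory
    (
        CaptureTime datetime not null default getdate(),
        DatabaseName sysname not null,
        LogSizeMB float not null,
        LogSpaceUsedPct float not null,
        Status int not null
    );

    -- run this from the Agent job on the schedule; CaptureTime defaults to the time of the insert
    insert into dbo.LogSpaceHistory (DatabaseName, LogSizeMB, LogSpaceUsedPct, Status)
    exec ('DBCC SQLPERF(LOGSPACE)');

    -- after a day or two, look for the peak per database
    select DatabaseName, max(LogSpaceUsedPct) as MaxLogSpaceUsedPct, max(LogSizeMB) as MaxLogSizeMB
    from dbo.LogSpaceHistory
    group by DatabaseName;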

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • You can also check the values from the performance counters:

    select *
    from sys.dm_os_performance_counters
    where object_name = 'SQLServer:Databases'
    and counter_name in ('Percent Log Used', 'Log File(s) Size (KB)')
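
    If you only want the one database rather than everything on the instance, instance_name holds the database name, so something like this (untested; swap in your actual database name for 'YourDatabase') narrows it down:

    select instance_name as database_name,
           counter_name,
           cntr_value
    from sys.dm_os_performance_counters
    where object_name = 'SQLServer:Databases'
    and counter_name in ('Percent Log Used', 'Log File(s) Size (KB)')
    and instance_name = 'YourDatabase'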

  • Gazareth (7/30/2012)


    You can also check the values from the performance counters:

    select *
    from sys.dm_os_performance_counters
    where object_name = 'SQLServer:Databases'
    and counter_name in ('Percent Log Used', 'Log File(s) Size (KB)')

    That's actually way easier than what I suggested. Use perfmon and set that counter up and let the perfmon trace run for a day or two.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • GilaMonster (7/30/2012)


    Gazareth (7/30/2012)


    You can also check the values from the performance counters:

    select *
    from sys.dm_os_performance_counters
    where object_name = 'SQLServer:Databases'
    and counter_name in ('Percent Log Used', 'Log File(s) Size (KB)')

    That's actually way easier than what I suggested. Use perfmon and set that counter up and let the perfmon trace run for a day or two.

    Ha, I didn't make the next logical step from there, as you've suggested, Gail - just use perfmon instead 🙂

  • Ok, looks like a timed log check then.

    Thanks for your help

  • Err, no. See the last suggestion I made re perfmon. Much easier than a scheduled job.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
