Log file R drive - what is consuming it?

  • Good Morning Experts.

    We have a SQL Server instance. All the user database log files are on the R drive. We have been getting alerts that the R drive is 99% or 100% full. I would like to know what is consuming the R drive. Could you please advise? Is there a script to check what is consuming the R drive?

  • coolchaitu - Thursday, November 16, 2017 10:01 PM

    How big is the drive, and is anything else using it? How often are you taking log file backups? What are the initial size and growth settings? And finally, do you have any form of log shipping or replication running?

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden - Thursday, November 16, 2017 10:16 PM

    Nothing else is using the drive. It is dedicated to the user database log files. We take log backups every 30 minutes. No log shipping or replication. This issue has been happening only for the last 2 days. Is there a way to find out the spid/transaction that is eating up the drive?

  • Have you established which database's log file is growing?  Has anything changed in the last two days?  Maybe you've imported (or deleted or updated) a lot of data in one go.  Maybe you've started doing index maintenance.  Or perhaps you have an overnight batch process?  If you look at the date of the transaction log file on disk, that will tell you when it last grew - was anything happening at that time?

    What do you get if you run this?
    SELECT name, log_reuse_wait_desc
    FROM sys.databases

    John

  • coolchaitu - Friday, November 17, 2017 12:07 AM

    Is there a way to find out the spid/transaction that is eating up the drive?

    It won't be a single session. The log grows if it can't be marked reusable by a log backup. First thing to do is see which DB's log file is large, and check why it can't be marked reusable. See John's post for details.
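
    A quick way to see that is something along these lines (a standard DBCC command, shown here just as a sketch):

    -- Shows log size (MB) and percentage of the log currently in use for every database
    DBCC SQLPERF(LOGSPACE);

    The database whose log is both large and close to 100% used is the one to investigate.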

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • coolchaitu - Friday, November 17, 2017 12:07 AM

    Is there a way to find out the spid/transaction that is eating up the drive?

    Yes, but we need the answers to my other questions to help guide us to the most likely cause. What are the initial size and growth settings of the offending log file (hopefully you've narrowed it down to one... check sys.master_files if you haven't), and how big is the drive?

    Also, if this only changed in the last 2 days, find out very carefully whether someone added some new code or, as the others have said, whether you've recently started something like index maintenance.

    We need answers to those questions to be able to proceed.
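
    As a starting point, a query along these lines against sys.master_files (just a sketch; adjust to taste) will list every log file with its current size, maximum size, and growth setting:

    -- Current size, max size, and growth settings for every log file on the instance
    SELECT  DB_NAME(database_id) AS database_name,
            name                 AS logical_name,
            physical_name,
            size * 8 / 1024      AS size_mb,
            CASE WHEN max_size = -1 THEN 'Unlimited'
                 ELSE CAST(max_size * 8 / 1024 AS VARCHAR(20)) + ' MB' END AS max_size,
            CASE WHEN is_percent_growth = 1 THEN CAST(growth AS VARCHAR(10)) + '%'
                 ELSE CAST(growth * 8 / 1024 AS VARCHAR(20)) + ' MB' END   AS growth_setting
    FROM    sys.master_files
    WHERE   type_desc = 'LOG'
    ORDER BY size_mb DESC;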

    --Jeff Moden



  • If transaction log backups have not been taken for a long time after a full backup, the log keeps growing. That can be another reason, in addition to what the experts have said above.
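
    For what it's worth, a query along these lines against the standard msdb backup history (again, just a sketch) shows the last full and last log backup for each database that is not in SIMPLE recovery, so a missing log backup is easy to spot:

    -- Last full and last log backup per database not in SIMPLE recovery
    SELECT  d.name AS database_name,
            MAX(CASE WHEN b.type = 'D' THEN b.backup_finish_date END) AS last_full_backup,
            MAX(CASE WHEN b.type = 'L' THEN b.backup_finish_date END) AS last_log_backup
    FROM    sys.databases AS d
    LEFT JOIN msdb.dbo.backupset AS b
           ON b.database_name = d.name
    WHERE   d.recovery_model_desc <> 'SIMPLE'
    GROUP BY d.name
    ORDER BY last_log_backup;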
  • @coolchaitu,

    I see you've flagged two posts as the answer, so I'm curious...  What did you actually come up with?  What was it that was causing your R: drive to fill up?

    --Jeff Moden



  • Jeff Moden - Monday, November 20, 2017 4:00 PM

    It was a huge ETL load that was causing it.

  • coolchaitu - Tuesday, November 21, 2017 10:45 PM

    Thanks for the feedback. That still begs the question of how big the R drive is, because even a "huge" ETL load should not cause extreme log file usage. Unless your R drive is only a couple of gigabytes in size, that ETL process may have some serious problems.
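
    If you want to confirm when and by how much that log grew during the load, the default trace (enabled by default on most instances) records autogrowth events. A rough sketch:

    -- Recent auto-grow events from the default trace (93 = Log File Auto Grow, 92 = Data File Auto Grow)
    DECLARE @TracePath NVARCHAR(260);

    SELECT  @TracePath = REVERSE(SUBSTRING(REVERSE(path), CHARINDEX('\', REVERSE(path)), 260)) + 'log.trc'
    FROM    sys.traces
    WHERE   is_default = 1;

    SELECT  DatabaseName,
            [FileName],
            StartTime,
            IntegerData * 8 / 1024 AS growth_mb,  -- IntegerData = growth in 8-KB pages
            Duration / 1000        AS duration_ms
    FROM    sys.fn_trace_gettable(@TracePath, DEFAULT)
    WHERE   EventClass IN (92, 93)
    ORDER BY StartTime DESC;

    The StartTime values should line up with the ETL window if that load really is the culprit.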

    --Jeff Moden



  • Jeff Moden - Wednesday, November 22, 2017 6:14 AM

    R drive is 100 GB

  • Setting the R drive aside, how is your log file set up? With a maximum size, or with autogrow?

    And how much of that 100 GB is being taken up by the log file for this one database versus the regular database files and other databases' log files?

    What about the system DBs? Are they on the same drive? If so, have you looked at tempdb to see how big it is compared to the others?
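
    Something along these lines (a sketch; change the drive letter if your path is different) will give you those numbers per database:

    -- Space used on the R: drive per database, split by data files and log files
    SELECT  DB_NAME(database_id) AS database_name,
            type_desc,           -- ROWS = data file, LOG = log file
            SUM(size) * 8 / 1024 AS size_mb
    FROM    sys.master_files
    WHERE   physical_name LIKE 'R:\%'
    GROUP BY DB_NAME(database_id), type_desc
    ORDER BY size_mb DESC;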

    Brandie Tarvin, MCITP Database Administrator
    LiveJournal Blog: http://brandietarvin.livejournal.com/
    On LinkedIn!, Google+, and Twitter.
    Freelance Writer: Shadowrun
    Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • Brandie Tarvin - Friday, December 8, 2017 5:37 AM

    It is set up to autogrow. The system DBs are on a different drive. How do I find out what consumed the log drive?

  • coolchaitu - Friday, December 8, 2017 6:24 AM

    You haven't answered the rest of my questions.

    Brandie Tarvin, MCITP Database Administrator

  • Brandie Tarvin - Friday, December 8, 2017 6:32 AM

    The log file is taking up the full 100 GB.
