Files with larger disk consumption

  • Comments posted to this topic are about the item Files with larger disk consumption

  • So.... how do you find out what's causing the problem?

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden - Thursday, January 10, 2019 9:11 AM

    So.... how do you find out what's causing the problem?

    Hi Jeff,

    Are you referring to slowness in reading the data?
  • Junior Galvão - MVP - Thursday, January 10, 2019 3:43 PM


    No.  I'm talking about finding high values in the things you measured.  How do you find what is causing those high values?  The article doesn't really provide a clue as to how to determine whether it's code, hardware, or simply a temporary "data storm".  I realize that you're "just" providing a script, but it seems to me that at least mentioning that it could be one of those three would make the article more valuable, because being able to identify a problem doesn't really help if you don't know what the actual problem is.

    --Jeff Moden


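
    [Editor's note: as a concrete starting point for the "code, hardware, or data storm?" question above, a common first check (not part of the article's script) is to look at the server's aggregated wait statistics. The query below is a minimal sketch; the exclusion list of benign waits is illustrative, not exhaustive.]

    ```sql
    -- Top waits since the last restart (or last stats clear).
    -- Mostly PAGEIOLATCH_* waits point at disk reads; SOS_SCHEDULER_YIELD
    -- points at CPU pressure; LCK_* waits point at blocking, i.e. code.
    SELECT TOP (10)
           wait_type,
           wait_time_ms / 1000.0 AS wait_time_s,
           waiting_tasks_count,
           wait_time_ms * 1.0 / NULLIF(waiting_tasks_count, 0) AS avg_wait_ms
    FROM   sys.dm_os_wait_stats
    WHERE  wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                             N'XE_TIMER_EVENT', N'BROKER_TO_FLUSH',
                             N'SQLTRACE_INCREMENTAL_FLUSH_SLEEP')
    ORDER BY wait_time_ms DESC;
    ```

    Comparing a snapshot taken during the slow period against a baseline helps separate a persistent problem from a temporary "data storm".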

  • Jeff Moden - Thursday, January 10, 2019 10:21 PM


    Jeff,
    I simply shared the script that I use regularly in my consulting work here in Brazil; I never set out to write an article that illustrates or explains how these values come about. Perhaps I will do that. In the meantime, my humble blog, pedrogalvaojunior.wordpress.com, gives a sense of what I do and a little of what I aim to do.
    There are several possible causes for these values; it depends very much on what is being executed or processed by the instance, server, or hardware being analyzed.
    When this script is run in an environment and the values presented are close to the thresholds I highlighted, we then have to start analyzing the possible causes. When faced with a supposedly slow read process, the main causes are typically:
    - The hard disk performing poorly when retrieving data;
    - Fragmentation of data in tables and indexes;
    - Missing indexes on tables;
    - WHERE clauses on columns that cannot be used to retrieve the data efficiently; and
    - The query being executed itself.
    These are the usual suspects when data that lives on disk is slow to read.
    But, to reinforce the point: this script is meant to alert you to the values present at the moment of its execution; the causes or reasons will vary from scenario to scenario.
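
    [Editor's note: to put the first cause on the list (disk performance) on a measurable footing, a per-file read-stall query such as the sketch below is a common follow-up. It is not the author's script, just an illustration using the standard DMV.]

    ```sql
    -- Average read stall per database file since the last restart.
    -- Sustained averages well above roughly 20-30 ms per read usually
    -- implicate the storage subsystem; low stalls with very high read
    -- counts point back at the queries and indexing instead.
    SELECT DB_NAME(vfs.database_id)  AS database_name,
           mf.physical_name,
           vfs.num_of_reads,
           vfs.io_stall_read_ms,
           vfs.io_stall_read_ms * 1.0
               / NULLIF(vfs.num_of_reads, 0) AS avg_read_stall_ms
    FROM   sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
    JOIN   sys.master_files AS mf
           ON  mf.database_id = vfs.database_id
           AND mf.file_id     = vfs.file_id
    ORDER BY avg_read_stall_ms DESC;
    ```

    Fragmentation (the second cause) can then be confirmed separately with sys.dm_db_index_physical_stats before deciding between a hardware fix and an indexing fix.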
  • Junior Galvão - MVP - Friday, January 11, 2019 6:49 AM


    I get and very much appreciate all of that, Junior.  If you had stated those very things in the preamble of your script, perhaps as "Here are the things that can go wrong with your system...", followed by the rest of what you wrote as a first measure to see whether you even have a problem (which you did very well!), people would see even more value in your script.

    It's meant as a suggestion, because too many people out there write scripts without emphasizing why the script is valuable.

    --Jeff Moden



  • Jeff Moden - Friday, January 11, 2019 8:00 AM


    Jeff, thanks for understanding. As much as possible, I will try to share my experiences and knowledge.

  • Having read through this quickly, I thought it was valuable. Performance tools are always welcome!

    I wondered if the title 

    Files with larger disk consumption

    would be better as

    Files with slower disk consumption

