• Gazareth (2/22/2013)


    ben.brugman (2/22/2013)


    The average time a page remains in the cache is two times the time it has been in the cache at any moment.

    Is it?

    Sorry, but by "any moment" I actually meant a random moment, and by "average" I meant the average over a large number of tries.

    If you pick a totally random point in time and look at a totally random page in the cache, then statistically the time the page has already been in the cache and the time it will still remain in the cache are the same.

    But maybe I did not write it clearly enough. Suppose a page remains in the cache for one hour: 3600 seconds.

    If you check the cache at random moments over a long period and you find that specific page in the cache, then statistically that page will on average have been there for 1800 seconds.
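
    To illustrate, here is a minimal sketch in Python. It is purely illustrative: the constant arrival rate and the fixed one-hour residence time are the assumptions of the stable situation I describe, not measured values.

        import random

        RESIDENCE = 3600     # assumption: every page stays exactly one hour in the cache
        SAMPLES = 100_000    # number of random inspection moments

        elapsed_total = 0.0
        remaining_total = 0.0

        for _ in range(SAMPLES):
            # In a stable situation pages enter the cache at a constant rate and
            # each one stays exactly RESIDENCE seconds.  Picking a random moment
            # and a random page that is in the cache at that moment means the
            # page's entry time lies uniformly somewhere in the last RESIDENCE seconds.
            time_in_cache = random.uniform(0, RESIDENCE)   # time already spent in cache
            time_to_go = RESIDENCE - time_in_cache         # time it will still remain

            elapsed_total += time_in_cache
            remaining_total += time_to_go

        print("avg time already in cache:", round(elapsed_total / SAMPLES))   # ~1800
        print("avg time still remaining :", round(remaining_total / SAMPLES)) # ~1800
        print("full residence time      :", RESIDENCE)                        # 3600 = 2 x 1800

    Under these assumptions both averages come out around 1800 seconds, and the full residence time (3600 seconds) is twice the average age of a randomly inspected page, which is the statement above.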

    I hope this explanation is a bit clearer.

    If not, I am prepared to try to give a more comprehensive illustration.

    Gazareth (2/22/2013)

    Edit: actually, I see how you got that. Not sure I agree though.

    In real life, no. But to understand the definition of PLE, I am assuming a stable situation where the number of new pages read from disk is totally constant, and pages enter AND exit the cache in the same order. Once I understand this about the PLE, I want to understand how the unfavored pages should be seen in this context. Statistically, though, the assumption is correct.

    Compare it to a shop that is always open: if at random times you measure how long a person has been in the shop, and at (other) random times you ask an exiting person how long he/she has been in the shop, the exiting persons will on average have been there twice as long as the people in the first measurement.

    When using random times, this principle holds true for many situations.

    ben

    (Sorry, by now I am noticing that my English is not correct in all my sentences, sorry)