Differential Backups Internals

  • Comments posted to this topic are about the item Differential Backups Internals

    ken kaufman
    Lead DBA, Zillow.com

  • I wish I had read this before doing the QotD, which implies I got it wrong

    SQLServerNewbieMCITP: Database Administrator SQL Server 2005
  • Although the knowledge is good to have, asking this question in an interview will make candidates think you are crazy. I would not want to work for you if you asked me that.

  • Actually it's an interesting question for an interview. If you have a lot of simple mode databases, then differentials could conceivably save you lots of resources on backups.

    It's not crazy, and I'm not sure I'd penalize you in an interview, but I'd be curious to see how you thought through the logic if you didn't know the answer.
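For what it's worth, the scenario above can be sketched in T-SQL (the database name and backup paths are made up for illustration). In SIMPLE recovery you cannot take log backups, but differentials still work and only capture extents changed since the last full:

```sql
-- Hypothetical database name and paths.
ALTER DATABASE SalesDB SET RECOVERY SIMPLE;

-- Baseline full backup; the differential bitmap (DCM) resets here.
BACKUP DATABASE SalesDB
    TO DISK = N'D:\Backups\SalesDB_full.bak'
    WITH INIT;

-- Later: back up only the extents changed since that full backup.
BACKUP DATABASE SalesDB
    TO DISK = N'D:\Backups\SalesDB_diff.bak'
    WITH DIFFERENTIAL, INIT;
```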

  • I got it right because this is exactly what I am doing hourly for one of our databases, in lieu of transaction log backups (the DB is 3GB in size and generates 3GB per hour of transaction log traffic; it's the backend for an off-the-shelf product, and the vendor/creator is of little use).

    What I have noticed and cannot explain is the varying size of the differential backups. Theoretically, the differential backup only backs up extents changed since the last full backup. That would imply the differential backup file should only get bigger and bigger. However, I am noticing more of an up-and-down pattern: 18, 10, 10, 10, 18, 10, 10, 20, 10, 10, 26, 70, 72, 77, 92, 84, 85, 89, 93, and so on. Generally up (especially once people get in and use the system), but not consistently so.

    It's of no concern whatsoever; I'm just curious as to why it happens. Possibly an extent being updated more than once, with a subsequent update having less data in it?

    Scott Duncan

    MARCUS. Why dost thou laugh? It fits not with this hour.
    TITUS. Why, I have not another tear to shed;
    --Titus Andronicus, William Shakespeare
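On newer versions of SQL Server, the changed-extent bookkeeping behind this can be watched directly. A minimal sketch, assuming SQL Server 2016 SP2 or later (where sys.dm_db_file_space_usage exposes modified_extent_page_count):

```sql
-- Run in the database of interest; requires SQL Server 2016 SP2 or later.
-- modified_extent_page_count counts pages in extents flagged in the DCM,
-- i.e., roughly what the next differential backup will contain.
SELECT
    file_id,
    modified_extent_page_count,                          -- pages marked changed
    modified_extent_page_count * 8 / 1024 AS approx_diff_mb  -- 8 KB pages
FROM sys.dm_db_file_space_usage;
```

Note that the DCM resets whenever a full backup completes, so this count (and the differential size) drops back to a baseline after each full backup, which would produce exactly the sawtooth pattern described above.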

  • As for the first part, on being able to do a tran log backup in simple mode: this is a pretty entry-level question; in fact, it was one of the first basics covered when I taught the MCDBA classes. As for the follow-up, it's intended to separate out levels of expertise. When I'm deciding on a candidate, I need to be able to distinguish whether they are entry, mid, or senior level. Answering this is one of ten questions that qualify as senior level. Obviously I don't expect everyone to get them all right, but if they can land over 50%, they're in the senior category. At Zillow we're very strong on understanding the internals of the applications you work with. My theory is that if you're paying your mortgage with SQL, you should know how it works under the hood, and I specifically look for people with the same attitude. 🙂

    Ken Kaufman

    Lead DBA, Zillow.com


  • All right, guys, I didn't mean to offend; apologies if I did. Actually, I love the posts here and benefit a lot. As a DBA dealing with SQL Server, Oracle, and Sybase, the posts keep me up to speed with SQL Server...

    I agree with you that some apps require a DBA to baby-sit them. I've been there, and I am still supporting one Oracle app on which we spent 70 million dollars and still cannot see the light. We have to answer questions like what the server was running from 12:00pm to 12:05pm, what the DB locks were at that moment, etc., etc.

    I hate the app so much because it was not well designed and costs a lot of DBA time, even though it is paying my mortgage. 😉 But I would rather spend my time doing other things, because being able to answer those questions does not help the vendor improve the app; we are still receiving patches every day.

    I am telling you my story to explain why I mentioned "crazy" and "don't want to work for you".

    Have a nice day, guys. :cool:

  • Good information, thanks.

    I agree with Ken: when interviewing candidates, it's important to gauge their level of knowledge of the product, and you can't do that without asking in-depth questions.

  • I ran across this article through a Google search. I am already familiar with DCM, and I thought this would be the perfect tool to use to determine the rate of change (ROC) in bytes/day for my production databases. You might ask why I'm doing this. We have a new NetApp SAN, and it has the ability to perform SQL Server snapshots for amazingly fast short-term backup and recovery. In order to implement this, however, I need to size the volumes appropriately and not waste a lot of space. The formula I have been provided is this:

    Space for Snapshot copies = ROC in bytes per day * number of Snapshot copies

    Does anybody have any ideas as to how I might calculate ROC?
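One way to approximate ROC is from the differential backup history already recorded in msdb, since each differential measures the extents changed since the last full backup. A sketch, assuming you take regular differentials (the database name is made up; msdb.dbo.backupset.backup_size reports bytes):

```sql
-- Largest differential backup per day for a hypothetical database.
-- backupset.type: 'D' = full, 'I' = differential, 'L' = log.
SELECT
    CAST(bs.backup_start_date AS date) AS backup_day,
    MAX(bs.backup_size)                AS diff_bytes
FROM msdb.dbo.backupset AS bs
WHERE bs.database_name = N'SalesDB'
  AND bs.type = 'I'
GROUP BY CAST(bs.backup_start_date AS date)
ORDER BY backup_day;
```

The day-over-day growth between consecutive differentials approximates the daily rate of change; the raw differential size measures everything changed since the last full backup, so by itself it overestimates a single day's churn.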



  • Will this manual replication plan work?

    1. To be more specific, at Date1, detach the US DB, copy it, and attach it to PacRim.

    2. On Date1, also turn on differential backup / transaction log backups on the PacRim DB.

    3. Set the identity to odd numbers with +2 increments for the PacRim DB.

    4. Set the identity to even numbers with +2 increments for the US DB.

    5. After pilot user testing on the PacRim DB goes well, we then want to repeat this: a few weeks after Date1, detach the existing PacRim DB.

    6. Then detach the US DB (with Date1 + a few weeks of new data), copy it, and attach it to PacRim.

    7. Then restore the PacRim DB transaction log from Date1 onto the second-time-attached PacRim DB, which is the US DB (Date1 + a few weeks of new data).

    The ultimate goal is: PacRim DB = (US DB + US few weeks of new data) + PacRim few weeks of new data.

    Will the restore in task 7 produce the expected result of the PacRim DB having both odd and even # identities?

    For illustration, the user inserts 2 records with transaction log backups turned on.

    "TableX from PacRim" (identity ID seeded at 1, increment 2 => odd # IDs):

    ID, Name
    1, A
    3, C

    "TableX from US" (identity ID seeded at 0, increment 2 => even # IDs):

    ID, Name
    2, B
    4, D

    What will be the result after applying the transaction log (insert A & C) from "TableX from PacRim" to "TableX from US"?

    Expected result, "TableX from US":

    ID, Name
    2, B
    4, D
    1, A
    3, C

    Unexpected result, "TableX from US":

    ID, Name
    2, B
    4, D
    6, A
    8, C

    Many Thanks,
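For illustration, the odd/even seeding in steps 3 and 4 can be written as follows (table and column definitions are made up), along with a caveat on step 7:

```sql
-- Hypothetical tables showing the odd/even identity scheme.
-- PacRim copy: seed 1, increment 2 => 1, 3, 5, ...
CREATE TABLE TableX_PacRim (ID int IDENTITY(1, 2) PRIMARY KEY, Name varchar(50));
-- US copy: seed 2, increment 2 => 2, 4, 6, ...
CREATE TABLE TableX_US     (ID int IDENTITY(2, 2) PRIMARY KEY, Name varchar(50));

-- Caveat on step 7: a transaction log restore physically replays the logged
-- rows, including their original identity values; it does not re-run the
-- inserts through the target table's identity generator. A log restore also
-- requires an unbroken backup chain from a full (or differential) backup of
-- the same database, so RESTORE LOG will reject a log from the old PacRim DB
-- against a freshly re-attached copy of the US DB.
```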

