• sgmunson - Tuesday, April 24, 2018 2:19 PM

    ScottPletcher - Tuesday, April 24, 2018 12:12 PM

    sgmunson - Tuesday, April 24, 2018 8:14 AM

    ScottPletcher - Tuesday, April 24, 2018 7:53 AM

    sgmunson - Tuesday, April 24, 2018 6:15 AM

    DBA Live - Monday, April 23, 2018 4:30 PM

    [attached image: 31068894_144334073075083_2256829684245331968_n.png]

    1. Does the above table have a partial dependency between Title and Author name?
    2. If yes, would there then be a transitive dependency between Title and Publisher, or not? Please help.
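    For readers following along: the attached image is not reproduced here, so the column names below are inferred from the rest of the thread (ISBN, Title, Author, Publisher, Year/Date). With that assumption, the two terms in the question can be sketched like this:

```sql
-- Assumed columns (inferred from the thread, not from the image):
--   Books(ISBN, Title, Author, Publisher, Year)
--
-- A partial dependency can only exist when the key is composite: if the key
-- were (ISBN, Author), then ISBN -> Title would be a partial dependency,
-- because Title depends on just part of the key (a 2NF violation).  If ISBN
-- alone is the key, "partial dependency" does not apply.
--
-- A transitive dependency would be a chain through a non-key attribute, e.g.
-- ISBN -> Title -> Publisher.  That requires Title to determine Publisher,
-- which is not generally true (the same title can be reissued by a different
-- publisher), so a "yes" to question 1 does not imply one for question 2.
```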

    I gave up trying to be so formal about the nature of data.   ISBNs are unique, so while a fully normalized table would contain pointers to Author and Publisher tables, calling something a "partial dependency" or a "transitive dependency" just confuses most people.   Every ISBN has an Author and a Publisher, even if those entities turn out to be one and the same, so I fail to see anything "partial" here, from my point of view.   However, my point of view doesn't always agree with the alleged science.   Practicality and performance are just too important to ignore for the sake of normalization.  Things that can complicate matters:

    1) the potential for multiple authors for a title (e.g. for royalty purposes);
    2) the need to link multiple editions together for search purposes;
    3) Date, as shown, is only a year, while most books state both year and month somewhere within the first couple of physical pages, and the same ISBN can have multiple printings or re-printings.  That speaks potentially to Edition, but not necessarily, so one might need a column for the print run number, which could take you back to having to expand the primary key beyond ISBN to include which printing it is; and
    4) what about much older books that, for whatever reason, break this model in some way?
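    Point 3 above can be made concrete. This is a hypothetical T-SQL sketch (table and column names are mine, not from the original post) of what widening the key beyond ISBN to cover print runs might look like:

```sql
-- Hypothetical sketch: if the same ISBN can have multiple printings,
-- ISBN alone no longer identifies a row, so the key must widen.
CREATE TABLE dbo.BookPrinting
(
    ISBN         varchar(17) NOT NULL,  -- ISBN-13 with hyphens
    PrintRun     smallint    NOT NULL,  -- 1 = first printing, 2 = second, ...
    PrintedYear  smallint    NOT NULL,
    PrintedMonth tinyint     NULL,      -- month is not always stated on older books
    CONSTRAINT PK_BookPrinting PRIMARY KEY (ISBN, PrintRun)
);
```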

    Plenty to think about there, so what may be far more important is how such a table will be used, and how often, and by how many users.   Such answers are far more useful than some simple but mostly meaningless phrase that speaks to a type of dependency for certain of the data.   Data modeling is often too much science and not enough practical experience.  Over-normalization is usually the result, which often leads to poor performance and is often wasteful of resources - especially disk space.   Such models are also often considerably less flexible, and changes can be hard to make.   Consider your options carefully....

    You're conflating logical design and physical design.  Normalization is a logical process.  Performance doesn't exist at this stage. 
    Rather, the most important thing is to model the data accurately and completely.

    Author can't go in the same table because of 1NF -- a book can have many authors, and repeating values are not allowed in any normalized form.
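    A minimal sketch of the decomposition this describes (hypothetical names, standard many-to-many pattern; not from the original post): Author moves out of the book table into its own table, linked through a junction table.

```sql
-- Hypothetical decomposition: a book can have many authors (and an author
-- many books), so the relationship gets its own junction table.
CREATE TABLE dbo.Book
(
    ISBN      varchar(17)   NOT NULL,
    Title     nvarchar(200) NOT NULL,
    Publisher nvarchar(100) NOT NULL,
    PubYear   smallint      NOT NULL,
    CONSTRAINT PK_Book PRIMARY KEY (ISBN)
);

CREATE TABLE dbo.Author
(
    AuthorID   int           IDENTITY NOT NULL,
    AuthorName nvarchar(100) NOT NULL,
    CONSTRAINT PK_Author PRIMARY KEY (AuthorID)
);

CREATE TABLE dbo.BookAuthor  -- one row per (book, author) pair
(
    ISBN     varchar(17) NOT NULL,
    AuthorID int         NOT NULL,
    CONSTRAINT PK_BookAuthor PRIMARY KEY (ISBN, AuthorID),
    CONSTRAINT FK_BookAuthor_Book   FOREIGN KEY (ISBN)     REFERENCES dbo.Book (ISBN),
    CONSTRAINT FK_BookAuthor_Author FOREIGN KEY (AuthorID) REFERENCES dbo.Author (AuthorID)
);
```

In the logical model this also removes the 1NF problem: no repeating Author values, and Title/Publisher depend only on the whole key (ISBN).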

    During the physical design stage, denormalization is done if/as necessary.

    Yep.  And I'll continue to do so until someone can demonstrate a clear benefit to changing my perspective.   I haven't had a database design I've created need changing at any time in the last 25 years, so I'll keep doing what works.   Formalizing and separating logical and physical design is, for me, a waste of my valuable time.   I have to live in the real world, and given most of the designs I keep coming across that seem to focus a tad too intently on normalization, despite there being ample evidence to support a slight deviation, I continue to be amazed at just how silly they can get, as I end up having to either seriously change the design or jump through query hoops left, right, and sideways.  Both are usually costly and time-consuming.   And yes, I do have an attitude.  It didn't take all that long to develop, either, and in the last quarter century, I have yet to see ANY evidence supporting a better alternative.   I'll continue to focus solely on the solution.

    Yes, Dr. Codd and CJ Date and all the experts who've thought about this for the last 55 years or so -- and those who follow them -- are completely stupid and wrong, and you, and you alone, are the only one who's right.  They've just (deliberately?) wasted our time by making us do a logical design.

    After all, as long as the db performs OK when it is first implemented, then it must be a good "design", right?  Just slap the data quickly into tables and it will work out just fine.  In the real world, that caused massive data issues and rework, which is why it was changed to think a little before leaping.  But, obviously, that's not for everyone.

    Yeah, fan the flames, dude... that's productive...   Do you for one second believe that I haven't considered logical design at all?   I just don't take the extra time to make it a separate step.  I just don't stop and consider whether or not something is partially dependent or transitively dependent before I create a physical design.   My logical design is usually inherent in my physical one.  Note that what I call out as wasteful is the resulting crap designs I keep running into.  Clearly, despite someone taking the extra time to allegedly think it through, the net result has repeatedly shown that thinking was probably not on the list of tasks actually performed.   So you'll just have to pardon my jaded point of view that somehow allocating time for a task that is then not actually done makes any sense.   If those teachings of Messrs. Codd and Date had actually been followed often enough, this might be an entirely different conversation.  I just cut out the formality, do the thinking, and make the physical design as I think.   I write queries to test that the design will work without forcing queries to jump through hoops, because even with zero data, I can see the execution plans and see where potential hot spots might occur.   I also validate the design with test data and see what performance looks like before committing to a final design.  So I don't just leap to a design without thinking it through first.  I've made a career out of fixing problems created by a lack of thinking, and in that business, you simply can't afford to be off the mark or you're going to be out of work in a hurry.   I'm just saying that my methods are different because I have yet to see much success coming out of all the "extra thinking", which just makes me wonder what was thought about, or whether the requirements were actually fully understood, or whether future consequences were taken into consideration.

    I'm sure there are folks out there who successfully use such processes.   I just haven't met very many of them, because those folks won't be out looking for someone to come in and fix things, or for new people -- their work solves those kinds of problems, so the need for more people just isn't there...

    It's inherently absurd to call a single design both a "logical" design and a "physical" design.

    If after 25 years you've never seen a success come from doing a logical design -- which it really sounds like you've never actually done anyway -- then the designers you've encountered are the worst I've ever heard of or seen.  Even a modest amount of genuine thought beforehand can help resolve lots of potential problems down the road.  [SAP became one of the largest software companies in the world mostly by actually doing a full logical design first.]

    SQL DBA,SQL Server MVP(07, 08, 09) A socialist is someone who will give you the shirt off *someone else's* back.