• sgmunson - Wednesday, April 25, 2018 1:59 PM

    ScottPletcher - Wednesday, April 25, 2018 9:49 AM

    sgmunson - Wednesday, April 25, 2018 8:03 AM

    ScottPletcher - Tuesday, April 24, 2018 2:57 PM

    sgmunson - Tuesday, April 24, 2018 2:19 PM

    ScottPletcher - Tuesday, April 24, 2018 12:12 PM

    sgmunson - Tuesday, April 24, 2018 8:14 AM

    ScottPletcher - Tuesday, April 24, 2018 7:53 AM

    sgmunson - Tuesday, April 24, 2018 6:15 AM

    DBA Live - Monday, April 23, 2018 4:30 PM

    [attached image: 31068894_144334073075083_2256829684245331968_n.png]

    1. Does the above table have a partial dependency between Title and Author Name?
    2. If yes, would there then also be a transitive dependency between Title and Publisher? Please help.
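For intuition: a functional dependency X → Y holds when no two rows agree on X but differ on Y, and a *partial* dependency only arises when a non-key attribute depends on part of a *composite* key. A minimal sketch for checking a dependency over sample rows (column names and data are assumed here, since the attached table image isn't reproduced in the thread):

```python
def fd_holds(rows, lhs, rhs):
    """Return True if the functional dependency lhs -> rhs holds in rows."""
    seen = {}
    for row in rows:
        key = tuple(row[c] for c in lhs)
        val = tuple(row[c] for c in rhs)
        # setdefault stores the first value seen for this key; any later
        # row with the same key but a different value violates the FD.
        if seen.setdefault(key, val) != val:
            return False
    return True

# Illustrative sample rows (hypothetical data; the posted image is not shown).
rows = [
    {"ISBN": "1111", "Title": "T1", "Author": "A1", "Publisher": "P1"},
    {"ISBN": "2222", "Title": "T1", "Author": "A2", "Publisher": "P1"},
    {"ISBN": "3333", "Title": "T2", "Author": "A1", "Publisher": "P2"},
]
print(fd_holds(rows, ["ISBN"], ["Title"]))    # True: ISBN determines Title
print(fd_holds(rows, ["Title"], ["Author"]))  # False: T1 maps to both A1 and A2
```

With ISBN alone as the key, there is no composite key to be "partially" dependent on; the interesting question is which non-key attributes determine other non-key attributes (the transitive case).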

    I gave up trying to be so formal about the nature of data.  ISBNs are unique, so while a fully normalized design would contain pointers to Author and Publisher tables, calling something a "partial dependency" or a "transitive dependency" just confuses most people.  Every ISBN has an Author and a Publisher, even if those entities turn out to be one and the same, so I fail to see anything "partial" here, from my point of view.  However, my point of view doesn't always agree with the alleged science.  Practicality and performance are just too important to ignore for the sake of normalization.  Things that can complicate matters:

    1) the potential for multiple authors for a title (e.g., for royalty purposes);
    2) the need to link multiple editions together for search purposes;
    3) Date, as shown, is only a year, yet most books carry both year and month within the first couple of physical pages, and the same ISBN can have multiple printings or re-printings.  That speaks potentially to Edition, but not necessarily, so you might need a column designating which print run it was, which could force you to expand the primary key beyond ISBN to include the printing; and
    4) much older books that, for whatever reason, break this model in some way...
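The print-run point above can be made concrete.  A hedged sketch (table and column names are assumed, not from the posted image): once the same ISBN can appear once per printing, the primary key has to widen to (ISBN, PrintRun).

```python
import sqlite3

# Hypothetical schema illustrating point 3: the same ISBN can have
# multiple printings, so the key becomes the composite (ISBN, PrintRun).
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE BookPrinting (
        ISBN      TEXT    NOT NULL,
        PrintRun  INTEGER NOT NULL,
        PrintYear INTEGER,
        PRIMARY KEY (ISBN, PrintRun)
    )
""")
con.execute("INSERT INTO BookPrinting VALUES ('978-0000000000', 1, 1979)")
con.execute("INSERT INTO BookPrinting VALUES ('978-0000000000', 2, 1999)")

# ISBN alone is no longer unique; the pair (ISBN, PrintRun) is.
count = con.execute(
    "SELECT COUNT(*) FROM BookPrinting WHERE ISBN = '978-0000000000'"
).fetchone()[0]
print(count)  # 2
```

Any non-key attribute that depends on ISBN alone (Title, say) would then be *partially* dependent on that composite key, which is exactly the 2NF situation the original question is circling around.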

    Plenty to think about there, so what may be far more important is how such a table will be used, how often, and by how many users.  Those answers are far more useful than some simple but mostly meaningless phrase that speaks to a type of dependency for certain of the data.  Data modeling is often too much science and not enough practical experience.  Over-normalization is usually the result, which often leads to poor performance and is often wasteful of resources - especially disk space.  Such models are also often considerably less flexible, and changes can be hard to make.  Consider your options carefully....

    You're conflating logical design and physical design.  Normalization is a logical process.  Performance doesn't exist at this stage. 
    Rather, the most important thing is to model the data accurately and completely.

    Author can't go in the same table because of 1NF -- a book can have many authors, and repeating values are not allowed in any normalized form.

    During the physical design stage, denormalization is done if/as necessary.
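The 1NF point above can be sketched; assuming illustrative table and column names (nothing here comes from the posted image), the repeating Author value moves out to a junction table with one row per (book, author) pair, and Publisher gets its own table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Hypothetical normalized decomposition; all names are illustrative.
    CREATE TABLE Publisher (
        PublisherID INTEGER PRIMARY KEY,
        Name        TEXT NOT NULL
    );
    CREATE TABLE Book (
        ISBN        TEXT PRIMARY KEY,
        Title       TEXT NOT NULL,
        PublisherID INTEGER REFERENCES Publisher(PublisherID),
        PubYear     INTEGER
    );
    CREATE TABLE Author (
        AuthorID INTEGER PRIMARY KEY,
        Name     TEXT NOT NULL
    );
    -- Junction table: one row per (book, author) pair, so a book can
    -- have any number of authors with no repeating group in Book.
    -- (SQLite leaves FK enforcement off by default; shown for shape.)
    CREATE TABLE BookAuthor (
        ISBN     TEXT    REFERENCES Book(ISBN),
        AuthorID INTEGER REFERENCES Author(AuthorID),
        PRIMARY KEY (ISBN, AuthorID)
    );
""")
con.execute("INSERT INTO Publisher VALUES (1, 'Example Press')")
con.execute("INSERT INTO Book VALUES ('978-0000000000', 'An Example Title', 1, 2018)")
for author_id, name in [(1, 'Author A'), (2, 'Author B'), (3, 'Author C')]:
    con.execute("INSERT INTO Author VALUES (?, ?)", (author_id, name))
    con.execute("INSERT INTO BookAuthor VALUES ('978-0000000000', ?)", (author_id,))

n_authors = con.execute(
    "SELECT COUNT(*) FROM BookAuthor WHERE ISBN = '978-0000000000'"
).fetchone()[0]
print(n_authors)  # 3
```

The denormalization step mentioned above would then be a deliberate physical decision - e.g., caching a concatenated author list on Book - rather than the starting point.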

    Yep.  And I'll continue to do so until someone can demonstrate a clear benefit to changing my perspective.  I haven't had a database design I've created need changing at any time in the last 25 years, so I'll keep doing what works.  Formalizing and separating logical and physical design is, for me, a waste of my valuable time.  I have to live in the real world, and given most of the designs I keep coming across that seem to focus a tad too intently on normalization, despite there being ample evidence to support a slight deviation, I continue to be amazed at just how silly they can get, as I end up having to either seriously change the design or jump through query hoops left, right, and sideways.  Both are usually costly and time-consuming.  And yes, I do have an attitude.  It didn't take all that long to develop, either, and in the last quarter century I have yet to see ANY evidence supporting a better alternative.  I'll continue to focus solely on the solution.

    Yes, Dr. Codd and C. J. Date and all the experts who've thought about this for the last 55 years or so -- and those that follow them -- are completely stupid and wrong, and you, and you alone, are the only one that's right.  They've just (deliberately?) wasted our time by making us do a logical design.

    After all, as long as the db performs OK when it is first implemented, then it must be a good "design", right?  Just slap the data quickly into tables and it will work out just fine.  In the real world, that caused massive data issues and rework, which is why it was changed to think a little before leaping.  But, obviously, that's not for everyone.

    Yeah, fan the flames, dude... that's productive...  Do you for one second believe that I haven't considered logical design at all?  I just don't take the extra time to make it a separate step.  I don't stop and consider whether or not something is partially dependent or transitively dependent before I create a physical design.  My logical design is usually inherent in my physical one.  Note that what I call out as wasteful is the resulting crap designs I keep running into.  Clearly, despite someone taking the extra time to allegedly think it through, the net result has repeatedly shown that thinking was probably not on the list of tasks actually performed.  So you'll just have to pardon my jaded view that allocating time for a task that then isn't actually done makes any sense.  If those teachings of Messrs. Codd and Date had actually been followed often enough, this might be an entirely different conversation.

    I just cut out the formality, do the thinking, and build the physical design as I think.  I write queries to test that the design will work without forcing queries to jump through hoops, because even with zero data I can see the execution plans and where potential hot spots might occur.  I also validate the design with test data and see what performance looks like before committing to a final design.  So I don't just leap to a design without thinking it through first.  I've made a career out of fixing problems created by a lack of thinking, and in that business you simply can't afford to be off the mark, or you'll be out of work in a hurry.  I'm just saying that my methods are different, because I have yet to see much success coming out of all the "extra thinking", which makes me wonder what was thought about, whether the requirements were actually fully understood, or whether future consequences were taken into consideration.

    I'm sure there are folks out there who successfully use such processes.  I just haven't met very many of them, because those folks won't be out looking for someone to come in and fix things, or for new people - their work solves those kinds of problems, so the need for more people just isn't there...

    It's inherently absurd to call a single design both a "logical" design and a "physical" design.

    If after 25 years you've never seen a success from doing a logical design -- and it really sounds like you've never actually done one anyway -- then the designers you've encountered are the worst I've ever heard of or seen.  Even modest amounts of genuine thought beforehand can help resolve lots of potential problems down the road.  [SAP became one of the largest software companies in the world mostly by actually doing a full logical design first.]

    Your willingness to twist words continues...  At no point did I actually call my design both physical and logical.  I just merge both steps to produce the physical design.  If you choose to label it both, feel free...  I just don't see the point in attaching so many labels to things when relatively few people are clear on which is which.  Given that the designs I've come across are usually the reason I'm employed, it shouldn't be that surprising that I don't see a lot of success...  I mostly get the calls from the failures.  The people who designed the data structures are either long gone or have been promoted to their level of incompetence...  and are effectively unreachable via the perhaps infamous "Jeff Moden pork chop".  Those folks were also always referred to as "data modelers".  If what those folks did as data modelers was supposed to be "thinking it through", then I'd say they were largely unqualified to think, period.

    In most cases, some seriously obvious future-proofing not only wasn't done, it clearly wasn't even thought about, and many of the designs had pretty serious flaws in terms of either over-normalization (to the point of actually creating more pain than pleasure) or a considerable lack of it.  The latter was often easier to fix than the former, and some pieces were so messed up that the only real fix would have been to start from scratch.  When that's what you deal with day in and day out, you do become a tad jaded.  I also don't see how data modeling is going to solve the problem of assigning people to that role who don't know enough about the business to create a useful design in the first place, who aren't willing to ask a LOT of questions, or who aren't willing to consider a large number of alternatives.  Again, I get the calls from places that are struggling.

    So when I do have to create a new design, I consider everything it needs to handle, and create it to do that and more, with sufficient future-proofing to last a good decade without need for much change beyond minor report changes.  I may not be SAP, but I do know how to design a database.

    Obviously you don't; you've repeatedly admitted that.  You know how to construct a table and how to write code.  You work around everything else.  That works well enough to get by for small stuff, I suppose. 

    The places I worked couldn't afford to "just wing it" like that.  International Paper, for example, had excellent logical modelers (and they built a superb data (meta-data, not database) dictionary/repository).  I quit working with the modeling group after IP decided that logical modelers should have no physical experience (I could have stayed, but that would have meant never doing anything physical again, and I didn't want to go that far).  With the number, and size, of companies IP bought and sold, it was critical to have a logical data view independent of physical implementations.

    Likewise, if you're designing a brand-new system, it simply must be a logical design, since the physical structures don't exist yet.  Sure, you can rush and slap something physical together, but you've just caused yourself data and code failures down the road.

    Well, if you want to continue to distort what I do and twist words at your convenience, feel free.  I never "just wing it".  You might wonder how, if I'm just winging it, I continue to get long-term contracts, where I'll be around for the consequences, and where in many cases my code is subjected to review and is often seen as better than a lot of code the reviewers have seen in their day.  If you want to believe I just BS one client after another, feel free, but I can assure you, that isn't happening.  You don't make it to the next contract all that easily if you aren't getting good results.  Word tends to get around.  But since you appear to be dead set on just fanning the flames, you can fight the rest of this war by yourself.  I've had quite enough of your condescending attitude and absolute insistence that there's one and only one way to skin the design cat that works long term.

    That's what you are doing, by insisting that a separate, logical-only model is never needed.  That's absurd on its face.  There are many reasons to do a logical model, including cases where the physical platform hasn't been decided yet, and others as I've noted above.

    As to your work: as you've noted repeatedly, the work you do is on systems that are already screwed up.  So, yeah, I'd expect you to fit in perfectly in those types of environments.  Oracle's royally screwed up many multi-billion-dollar projects, and they keep getting hired.  Sometimes reputation exceeds results.  By your own claims, in 25 years in the places you've worked, you've never met even one competent (let alone good) data modeler.  How good can these places be??

    SQL DBA,SQL Server MVP(07, 08, 09) A socialist is someone who will give you the shirt off *someone else's* back.