• Laurence Neville (9/27/2016)


    I often come up against the following challenge, and end up de-normalizing my table design in order to enforce data integrity. I would be interested to hear if there is a different way of modelling the data that is both normalized and enforces integrity.

    I think you are probably not doing sufficient normalisation: if you don't stop at 3NF but instead go on to EKNF and then to 4NF and 5NF while avoiding the BCNF nonsense (BCNF was a mistake, but descriptions of 4NF and 5NF that incorporate it are excessively common), you may find that you are enforcing data integrity quite effectively without any denormalisation. Of course, if the denormalisation you are referring to is reverting from BCNF to EKNF, that is not denormalisation when the two forms differ (because in that case BCNF is not a proper normal form for your data model, and reverting to EKNF is normalisation).
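
    To make the BCNF point concrete, here is a hedged sketch (not from the original post; the table and column names are invented) of the classic case where a relation is in 3NF/EKNF but not BCNF, and where decomposing it to BCNF would lose a dependency that a key constraint currently enforces.

        -- Assumption: each (student, course) pair has exactly one tutor,
        -- and each tutor teaches exactly one course.
        CREATE TABLE Tuition (
            student VARCHAR(50) NOT NULL,
            course  VARCHAR(50) NOT NULL,
            tutor   VARCHAR(50) NOT NULL,
            -- candidate key 1: (student, course) determines tutor
            CONSTRAINT PK_Tuition PRIMARY KEY (student, course),
            -- candidate key 2: (student, tutor) determines course
            CONSTRAINT UQ_Tuition_StudentTutor UNIQUE (student, tutor)
        );
        -- The dependency tutor -> course makes this 3NF/EKNF but not BCNF.
        -- Splitting it into (tutor, course) and (student, tutor) to reach
        -- BCNF means (student, course) -> tutor can no longer be enforced
        -- by a key constraint, which is the sense in which BCNF can be a
        -- step backwards for integrity.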

    This challenge seems to come up when my design departs from a simple branching "snowflake" and there need to be "loops" in the relationships:

    P ---------> PT
    ^            ^
    |            |
    |            |
    PPV ------> PTP

    Note: this is a simple example with four tables. In a real-world design with many levels of related tables, the issue becomes magnified (unique constraints on multiple columns, more than one "unnecessary" unique constraint per table, etc.).
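
    For concreteness, here is a hedged sketch of one common reading of the diagram (the original post does not define the tables, so the meanings and column names are guesses): the requirement is that a PPV row's parent P and parent PTP must both point at the same PT row, and the usual declarative fix is to carry the PT key down both sides with composite foreign keys, which in turn forces the "unnecessary" multi-column unique constraints described above.

        CREATE TABLE PT (PT_id INT PRIMARY KEY);

        CREATE TABLE P (
            P_id  INT PRIMARY KEY,
            PT_id INT NOT NULL REFERENCES PT (PT_id),
            -- "unnecessary" superkey, needed only so a composite FK can target it
            CONSTRAINT UQ_P_PT UNIQUE (P_id, PT_id)
        );

        CREATE TABLE PTP (
            PTP_id INT PRIMARY KEY,
            PT_id  INT NOT NULL REFERENCES PT (PT_id),
            CONSTRAINT UQ_PTP_PT UNIQUE (PTP_id, PT_id)
        );

        CREATE TABLE PPV (
            PPV_id INT PRIMARY KEY,
            P_id   INT NOT NULL,
            PTP_id INT NOT NULL,
            PT_id  INT NOT NULL,  -- repeated column that feels like denormalisation
            CONSTRAINT FK_PPV_P   FOREIGN KEY (P_id, PT_id)   REFERENCES P   (P_id, PT_id),
            CONSTRAINT FK_PPV_PTP FOREIGN KEY (PTP_id, PT_id) REFERENCES PTP (PTP_id, PT_id)
        );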

    Well, your diagram doesn't have a loop in it - it is acyclic - so the loop idea seems to be something of a red herring. Real loops in the network of foreign key constraints would be a problem, but if you had such a loop I am pretty sure the real problem would be that you had chosen a bizarre and illogical bunch of data that didn't represent reality. If the network is acyclic but not a tree, like the network in the diagram, cascades are a pain to handle (which is why SQL Server refuses to handle multiple cascade paths and you have to do it yourself), but that shouldn't lead to denormalisation.
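
    As a hedged illustration of the cascade point (reusing the guessed schema sketched above): with two foreign key paths from PT down to PPV, SQL Server rejects ON DELETE CASCADE on both of them as multiple cascade paths, so a delete of a PT row has to cascade by hand, for example in a stored procedure that removes children before parents inside a single transaction. An INSTEAD OF DELETE trigger on PT is another common way to package the same ordering.

        -- Hypothetical procedure name; deletes must run child-first.
        CREATE PROCEDURE DeletePT @PT_id INT AS
        BEGIN
            SET NOCOUNT ON;
            BEGIN TRANSACTION;
                DELETE FROM PPV WHERE PT_id = @PT_id;
                DELETE FROM PTP WHERE PT_id = @PT_id;
                DELETE FROM P   WHERE PT_id = @PT_id;
                DELETE FROM PT  WHERE PT_id = @PT_id;
            COMMIT TRANSACTION;
        END;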

    Tom