• I think part of the problem is the way people approach it.  When I work with developers, they already come into a design session with preconceived notions of what the database should be, and want to jump to the end of the process.  Some try to tie the data model directly to how the application screens are designed, and don't want to model the data at all.  Some just want to come up with the most normalized design they can think of and be done with it.  Then there are still others who think the best thing is just an attribute/value list, because they believe that will be the most flexible and never require schema changes.
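    To show what I mean by that last one, here's a minimal sketch (all table and column names are hypothetical) of the attribute/value approach.  Even reading a single logical row back means pivoting attributes into columns, and the database can't type-check or constrain any of the values:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Attribute/value style: one generic table holds everything.
conn.execute("CREATE TABLE entity_attr (entity_id INTEGER, attr TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO entity_attr VALUES (?, ?, ?)",
    [(1, "name", "widget"), (1, "price", "9.99"), (1, "color", "red")],
)

# Reconstructing one "row" requires a pivot; every value is just text.
row = conn.execute("""
    SELECT MAX(CASE WHEN attr = 'name'  THEN value END) AS name,
           MAX(CASE WHEN attr = 'price' THEN value END) AS price,
           MAX(CASE WHEN attr = 'color' THEN value END) AS color
    FROM entity_attr WHERE entity_id = 1
""").fetchone()
print(row)  # ('widget', '9.99', 'red')
```

    The "flexibility" is real, but every query pays for it, and none of the integrity rules live in the database anymore.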

    I approach it by thinking about the logical design first: coming up with the conceptual ideas of the data we want the system to represent, and the cardinality of their relationships.  Next I start the normalization and analysis, then produce the physical model and design the tables.

    Related to that, people often forget that normalization is part of analysis, not part of design.  Once you have a model in third normal form, you will sometimes find that you need to denormalize certain aspects of it to make it usable in a real-world system.  I've worked with other people's designs where they stuck to third normal form or higher, and to get any useful information out of the system you had to do 5 or 6 joins for something that I think should have taken 2 or 3.

    Then of course people try to build their reporting directly on top of this heavily normalized system, because they don't want to do the work of designing a star schema as part of a data mart or data warehouse, and they end up doing more work because of it.  Data modeling is work, but I believe that if you go through the process, it can actually reduce the total time spent in the end.
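    For anyone who hasn't seen one, here's a minimal star-schema sketch (hypothetical names): one fact table of sales keyed to small dimension tables, so a report is one join away from each attribute it groups by, no matter how normalized the operational system is:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER,
                              qty INTEGER, amount REAL);

    INSERT INTO dim_date    VALUES (1, 2024, 1), (2, 2024, 2);
    INSERT INTO dim_product VALUES (1, 'widget', 'hardware');
    INSERT INTO fact_sales  VALUES (1, 1, 3, 30.0), (2, 1, 2, 20.0);
""")

# Monthly revenue by category: one join per dimension, then aggregate.
rows = conn.execute("""
    SELECT d.year, d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date    d ON d.date_id    = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.year, d.month, p.category
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2024, 1, 'hardware', 30.0), (2024, 2, 'hardware', 20.0)]
```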