This question might be better suited to programmers.stackexchange. If so, please migrate it.
I am currently pondering the complexity of typical data models. Everybody knows that data models should be normalized; on the other hand, a normalized data model requires quite a few joins to reassemble the data later, and joins are potentially expensive operations, depending on the size of the tables involved. So the question I am trying to figure out is how one usually handles this tradeoff: in practice, how many joins would you find acceptable in typical queries when designing a data model? This is especially interesting when a single query involves multiple joins.
As an example, let's say we have users, who own houses, in which there are rooms, which have drawers, which contain items. Normalizing this in the obvious way, with one table each for users, houses, rooms, drawers, and items, would later require me to join all five tables to get all the items belonging to a certain user. That seems like an awful lot of complexity to me.
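To make the shape of that query concrete, here is roughly what I have in mind. The table and column names are purely illustrative, not an actual schema I have:

```sql
-- Hypothetical normalized schema (names are my own invention).
CREATE TABLE users   (user_id   INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE houses  (house_id  INTEGER PRIMARY KEY, user_id   INTEGER REFERENCES users(user_id));
CREATE TABLE rooms   (room_id   INTEGER PRIMARY KEY, house_id  INTEGER REFERENCES houses(house_id));
CREATE TABLE drawers (drawer_id INTEGER PRIMARY KEY, room_id   INTEGER REFERENCES rooms(room_id));
CREATE TABLE items   (item_id   INTEGER PRIMARY KEY, drawer_id INTEGER REFERENCES drawers(drawer_id), name TEXT);

-- "All items belonging to user 42" touches all five tables (four joins).
SELECT i.item_id, i.name
FROM items    AS i
JOIN drawers  AS d ON d.drawer_id = i.drawer_id
JOIN rooms    AS r ON r.room_id   = d.room_id
JOIN houses   AS h ON h.house_id  = r.house_id
JOIN users    AS u ON u.user_id   = h.user_id
WHERE u.user_id = 42;
```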
Most likely the size of the tables plays a role, too: joining five tables with little data in them is not as bad as joining three tables with millions of rows each. Or is this consideration wrong?
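I assume the way to judge this in practice would be to look at the query plan the database actually chooses, e.g. with something like the following (PostgreSQL syntax, again just a sketch against the hypothetical schema above):

```sql
-- Shows the join strategy, estimated vs. actual row counts, and timings.
EXPLAIN ANALYZE
SELECT i.item_id, i.name
FROM items    AS i
JOIN drawers  AS d ON d.drawer_id = i.drawer_id
JOIN rooms    AS r ON r.room_id   = d.room_id
JOIN houses   AS h ON h.house_id  = r.house_id
JOIN users    AS u ON u.user_id   = h.user_id
WHERE u.user_id = 42;
```

But I would still like to hear how others reason about the acceptable number of joins before they ever get to measuring.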