Recently I thought of something that had never previously crossed my mind: what about purposely testing against a database with fragmented indexes, as may exist in a production database? Is this a valid scenario? How do you deliberately create such data?
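As for deliberately creating fragmented data: one common way is to insert rows in random key order, which forces page splits. A minimal T-SQL sketch, using a hypothetical demo table `dbo.FragDemo` (the table name and row count are illustrative, not from any real schema):

```sql
-- Random GUID keys arrive out of order, so inserts land mid-page
-- and cause page splits, fragmenting the clustered index.
CREATE TABLE dbo.FragDemo (
    Id     UNIQUEIDENTIFIER NOT NULL PRIMARY KEY CLUSTERED,
    Filler CHAR(500)        NOT NULL DEFAULT 'x'  -- padding so pages fill quickly
);

INSERT INTO dbo.FragDemo (Id)
SELECT TOP (100000) NEWID()
FROM sys.all_columns AS a
CROSS JOIN sys.all_columns AS b;

-- Inspect the resulting fragmentation:
SELECT avg_fragmentation_in_percent, page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.FragDemo'), NULL, NULL, 'DETAILED');
```

Deleting a scattered subset of rows afterward will add internal (page-density) fragmentation on top of the logical fragmentation from the splits.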
When a database is restored to another instance running the same version of SQL Server, the indexes, fragmentation, and statistics will be the same as when the backup was taken. So, if your QA process involves working from a recent restore of production, you should have a good baseline for comparison in that regard.
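If you want to confirm that the restored copy carries the same fragmentation as the source, you can run the same fragmentation query against both databases and compare. A sketch:

```sql
-- Run against both the production source and the restored QA copy;
-- the figures should match for an unchanged restore.
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name                   AS index_name,
       s.avg_fragmentation_in_percent,
       s.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id
 AND i.index_id  = s.index_id
ORDER BY s.avg_fragmentation_in_percent DESC;
```

The `'LIMITED'` scan mode keeps the check cheap; it reads only the parent-level pages rather than every leaf page.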
However, when performance testing in a QA environment, I don't think it's necessary to duplicate the physical environment of production to get a good idea of how the code will perform there. Even if the QA environment has only 10% of production's data volume, as long as the object schemas, configuration, and data cardinality are equivalent, the execution plans should be equivalent too. So, for example, if adding a new join and columns to a stored procedure results in 30% more page reads in QA, you should expect the same proportional 30% increase in production. That's what you should focus on: how the latest changes modify the plan and proportionally impact I/O, CPU, and memory, not runtime duration.
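A simple way to capture those proportional I/O and CPU figures is to run the procedure before and after the change with statistics output enabled. A sketch, where `dbo.MyProc` stands in for whatever procedure is under test:

```sql
-- Emit per-table logical/physical reads and CPU time to the Messages tab.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

EXEC dbo.MyProc @Param = 1;  -- hypothetical procedure under test

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

Compare the "logical reads" counts between the old and new versions of the procedure in QA; it is that ratio, not the absolute duration on QA hardware, that should carry over to production.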
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho