Today we have a guest editorial as Steve is away on holiday.
When preparing for performance testing we tend to ensure that the testing environment is as close to production as possible. We consider hardware configuration, software versions (including Service Packs, Cumulative Updates and patches), system configuration, networking, data volumes and physical database file locations; there are undoubtedly others. What really piques my interest here is the performance testing of SQL Server databases. I suspect that many of you know a thing or two that I would not automatically consider, and might struggle to imagine even given plenty of time.
Recently I thought of something that had never previously crossed my mind: what about purposely testing against a database with fragmented indexes, as may exist in a production database? Is this a valid scenario? How do you deliberately create such data? Of course, we ensure that indexes are rebuilt when necessary, but there must be a moment before that when the indexes are fragmented. Should performance testing check that systems still fulfil their non-functional requirements (NFRs) at that point?
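One way to manufacture fragmentation deliberately is to insert rows under a clustered index whose key values arrive in random order, forcing page splits as the table fills. Here is a minimal sketch of that idea; the table dbo.FragDemo is a hypothetical name I have invented for illustration, and a random-GUID key is just one convenient way to provoke splits, not the only one:

-- Hypothetical throwaway table: a clustered index on a random GUID key.
CREATE TABLE dbo.FragDemo
(
    Id      uniqueidentifier NOT NULL,
    Payload char(1000)       NOT NULL,
    CONSTRAINT PK_FragDemo PRIMARY KEY CLUSTERED (Id)
);

-- NEWID() generates a fresh random value per row, so inserts land on
-- arbitrary pages throughout the index, splitting pages and leaving
-- logical fragmentation behind as the table grows.
INSERT INTO dbo.FragDemo (Id, Payload)
SELECT TOP (100000) NEWID(), 'x'
FROM sys.all_objects AS a
CROSS JOIN sys.all_objects AS b;

Alternatives that may better mimic a particular production workload include bulk deletes that leave sparsely filled pages, or updates that widen variable-length columns and push rows off their pages.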
This is just one scenario that I have thought of. There must be others, but what are they? How do we create databases with these setups, and how can we automate it? Should we, perhaps, reproduce production scenarios? Or am I being pedantic, and will testing against a database with an appropriate volume of data suffice?
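If automation is the goal, the test harness also needs to verify that the fragmentation it manufactured actually exists before the timed runs begin. A sketch of such a check, again against the hypothetical dbo.FragDemo table, using the sys.dm_db_index_physical_stats DMV:

-- Report fragmentation for every index on the hypothetical demo table;
-- a setup script could assert a minimum threshold before tests proceed.
SELECT i.name AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID(N'dbo.FragDemo'), NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON i.[object_id] = ips.[object_id]
   AND i.index_id   = ips.index_id;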