This is confusing: "As you can see from the disk bytes/sec counter, there is a considerable savings caused by both row and page level compression, with the page level compression yielding the highest improvement."
How so? The disk bytes/sec figure is much lower with compression enabled, so where is the "considerable savings"? If that counter measures throughput, a lower number of bytes per second is a bad thing, not a good thing. Can you explain this, please?
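If "savings" refers to stored bytes rather than throughput, then a smaller on-disk footprint would mean the same workload moves fewer bytes per second, which might be what the article intended. As a minimal sketch of how one could check the footprint side of that claim, assuming a hypothetical table dbo.Orders (the table name is made up for illustration; sp_estimate_data_compression_savings is a real SQL Server system procedure):

```sql
-- Estimate the on-disk size of dbo.Orders under ROW compression
-- (NULL for @index_id and @partition_number covers all of them).
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'Orders',
    @index_id         = NULL,
    @partition_number = NULL,
    @data_compression = 'ROW';

-- Repeat the estimate for PAGE compression to compare the two.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'Orders',
    @index_id         = NULL,
    @partition_number = NULL,
    @data_compression = 'PAGE';
```

Even so, that only shows storage savings; it doesn't explain why a lower bytes/sec reading should be presented as an improvement.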
Finally, what does the "time" axis on the charts represent? The axis is not labeled with any units (minutes, seconds, hours...). Is it elapsed time since the test started, and if so, why is that significant?