help with benchmarking

  • I'm comparing query performance between a dedicated physical server running SQL Server 2008 R2 and a virtual server running SQL Server 2012. I'm using queries with randomized parameters that are prototypical of user activity, and launching multiple user "sessions" in parallel to really hammer the servers.

    if (object_id('tempdb..#t') is not null) drop table #t;

    SELECT ... INTO #t FROM ... GROUP BY ...

    if (object_id('tempdb..#t') is not null) drop table #t;

    Since this is a reporting application and some users run big reports (raw data in excess of 2 GB is not uncommon), my tactic of stuffing the results into tempdb is skewing the performance results.

    I could wrap the query to partially alleviate the tempdb usage:

    SELECT COUNT(*) FROM (SELECT ... FROM ... GROUP BY ...) X

    Any other suggestions on how I can generate the same workload as a big report without actually materializing the resulting data? Using TOP(N) or SET ROWCOUNT N to limit the data is not viable, since query processing stops as soon as N rows have been collected.
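    One trick sometimes used for this kind of benchmarking (a sketch only; `dbo.BigReport` and its columns are placeholders standing in for the real report query) is to assign the output columns to local variables. The server still has to produce and aggregate every row of the result, but only the last row's values are kept, so nothing is returned to the client or materialized into tempdb:

    ```sql
    -- Sketch: consume the full result set server-side by assigning each
    -- output column to a local variable. Every row is still computed, but
    -- only the final row's values survive, so no result set is shipped or
    -- stored. Table and column names here are hypothetical.
    DECLARE @k int, @total money;

    SELECT @k     = CustomerID,
           @total = SUM(Amount)
    FROM dbo.BigReport
    GROUP BY CustomerID;
    ```

    One caveat: the optimizer can occasionally choose a different plan for a variable-assignment SELECT than for the plain query, so compare execution plans to confirm you're measuring the same work.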

  • If you want to test the whole system (disk, memory, etc.), you really do have to retrieve the data. If you're on a 2012 system, I'd suggest looking at Distributed Replay as a mechanism for testing two different machines without introducing too much weirdness from the testing process itself.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

