measuring sql server efficiency

  • I'm doing some analysis and trending of CPU utilization on a DB server that supports an application which changes frequently over time: new enhancements, indexing improvements, etc.

    I have good data on CPU utilization, and I can measure the workload to some extent in terms of "jobs" completed and similar application-level metrics, and compare CPU trends with "work done" from an application perspective.

    The problem is that the definition of a "job" changes over time, so this analysis is not very useful. I'm looking for some cumulative measure of the work SQL Server has completed. For example, I can pull "batches/sec", and this is close to what I am looking for, or I could total up stored procedure execution counts, but that introduces all kinds of other gotchas.

    Is there some measure like "batches" (not per second), essentially a running total of the number of commands SQL Server has processed, which I could track?

    thanks in advance for any insight you can provide.
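    For what it's worth, there is a cumulative form of this counter: "Batch Requests/sec" is a bulk-count performance counter, so the `cntr_value` exposed by `sys.dm_os_performance_counters` is actually a running total of batches since the service last started, not a per-second rate. A minimal query:

    ```sql
    -- Cumulative batch count since the SQL Server service last started.
    -- "Batch Requests/sec" is a bulk-count counter, so cntr_value holds
    -- a running total; performance tools compute the /sec rate by diffing it.
    SELECT object_name,
           counter_name,
           cntr_value AS total_batches_since_restart
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Batch Requests/sec'
      AND object_name LIKE '%SQL Statistics%';
    ```

    Note the total resets whenever the SQL Server service restarts.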


  • So, you're looking for the number of commands SQL Server has processed between StartTime and EndTime? How does this help you?

  • Erin Ramsay (8/8/2012)


    So, you're looking for the number of commands SQL Server has processed between StartTime and EndTime? How does this help you?

    Basically, yes. Essentially I'm looking to measure changes in efficiency, as measured by CPU utilization, in comparison to some measure of "work done". I know we are doing work more efficiently, as a number of problematic, costly queries have been optimized, but I still measure an upward trend in CPU utilization. I am fairly certain that the upward trend is due to the fact that we are doing more work: new functionality has been added which results in more stored procedure calls, the user base has increased, table sizes are growing, etc. But I'm having difficulty coming up with a good measure of "amount of work performed" in a given time period.
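    One way to get "commands processed between StartTime and EndTime" is to snapshot the cumulative batch counter on a schedule (e.g. from an Agent job) and diff consecutive rows. A sketch, with a hypothetical snapshot table name:

    ```sql
    -- Hypothetical snapshot table; populate it on a schedule.
    CREATE TABLE dbo.BatchCounterSnapshot (
        sample_time   datetime2 NOT NULL DEFAULT SYSDATETIME(),
        total_batches bigint    NOT NULL
    );

    -- Scheduled sample: record the running total of batches.
    INSERT INTO dbo.BatchCounterSnapshot (total_batches)
    SELECT cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Batch Requests/sec'
      AND object_name LIKE '%SQL Statistics%';

    -- Work done per interval = difference between consecutive snapshots.
    -- A negative delta marks a service restart (the counter resets to zero).
    SELECT sample_time,
           total_batches
             - LAG(total_batches) OVER (ORDER BY sample_time)
               AS batches_this_interval
    FROM dbo.BatchCounterSnapshot
    ORDER BY sample_time;
    ```

    LAG requires SQL Server 2012 or later; on older versions a self-join on the previous sample does the same job.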

  • I decided I could use execution counts of the top 200 most costly queries (ranked by the daily delta of total_worker_time) as a measure of the "work done" in a day. This seems to be a very good measure.
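    For anyone wanting to reproduce this, the underlying numbers come from `sys.dm_exec_query_stats`; one possible shape of the daily sample (grouping by query_hash is an assumption, to fold together identical statements with different plans):

    ```sql
    -- Execution counts for the 200 most CPU-expensive cached queries.
    -- Caveat: sys.dm_exec_query_stats only covers plans still in cache,
    -- and a plan's totals vanish when it is evicted or recompiled, so a
    -- daily delta calculation must handle plans that disappear between samples.
    SELECT TOP (200)
           query_hash,
           SUM(total_worker_time) AS total_worker_time,
           SUM(execution_count)   AS execution_count
    FROM sys.dm_exec_query_stats
    GROUP BY query_hash
    ORDER BY SUM(total_worker_time) DESC;
    ```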
