Measuring SQL Performance
Posted Thursday, December 29, 2005 9:36 AM
Forum Newbie


For DBAs and developers, measuring query performance is important.  Indexing and query performance is one area where a DBA can really shine.  A couple of adjustments and you can be a hero for a day.

Also, if you want to measure many other performance counters on SQL Server/Windows, and many other systems (IIS, Exchange, network hardware), around the clock and all at the same time, check out http://www.ascendview.com (I purchased this product and have been using it since March 2005).

 

Post #247258
Posted Monday, December 25, 2006 12:03 PM


SSC-Dedicated


Yep... I realize that my response is a year late, but well done, Chris... Very nice intro to some of the performance measuring tools that are available in SQL Server.

Just a couple of thoughts for everyone, though... having tools without knowing what to expect doesn't really do much...

An example from work... 3 years ago, one of our developers identified a particular stored procedure as a very good candidate for optimization in light of large volumes of data.  She very successfully got that process down from 30 minutes to 6 minutes and change.  Yippee!!  Everyone was happy!  Having recently been hired into the company for just such optimizations, I was asked to optimize the same code without any knowledge of the developer's recent success.  I got the process down to a little over 4 SECONDS and used fewer resources to boot!

Am I bragging?  Sure, a bit... who wouldn't?  But it points out a much larger problem... a lot of developers simply don't know what to expect for performance and, in light of such success as getting a 30 minute process down to 6 minutes, it's hard to convince folks that their perception of performance code needs a little work.  They just don't know and having the tools to measure performance without knowing what's actually possible is pretty meaningless.  That's where in-house training, mentorship, and performance standards come into play.

Another problem, particularly in the development phase of a project, is a lack of data to test with.  One particular example that comes to mind: one of our developers was asked to write a "grouped running total" based on CustID.  Having no data to test against, he used the Northwind DB to test the wonderful correlated subquery he had just been introduced to... Heck, it didn't have a cursor in it so it must be fast, right?  His testing showed [him] that it was fast.  The code was promoted to production and immediately had performance problems on 1,000 rows... it was taking seconds instead of sub-seconds.  When it hit 10,000 rows, it took minutes to run.  Millions of rows were expected in the next couple of months.

The problem turned out to be the triangular join (about half a Cartesian join) he'd used in the correlated subquery.  A lack of data to test against was the killer here.
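The thread doesn't quote the code, but the kind of correlated subquery described above typically looks like the following sketch (table and column names are hypothetical, reconstructed from the description):

```sql
-- Hypothetical "grouped running total" via a correlated subquery.
-- For each row, the subquery re-reads every earlier row for the same
-- customer: roughly N*N/2 row visits per customer -- the triangular join.
SELECT
    o.CustID,
    o.OrderDate,
    o.Amount,
    (SELECT SUM(o2.Amount)
       FROM dbo.Orders o2
      WHERE o2.CustID = o.CustID
        AND o2.OrderDate <= o.OrderDate) AS RunningTotal
FROM dbo.Orders o
ORDER BY o.CustID, o.OrderDate;
```

It looks set-based and cursor-free, but the work grows quadratically with the number of rows per customer, which is exactly why it flew on Northwind and collapsed at 10,000 rows.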

There's a third, more insidious problem than not knowing what's possible or having enough data to test against.  That's TIME... going from 30 minutes to 6 minutes is considered "good enough, next problem please".  Managers hover over developers asking "are ya done yet?"  Managers and Project Managers, not having a good understanding of why it takes so long, will frequently estimate time for the development of a proc simply by how many blocks the functional diagram (if there is one) has in it.  And, of course, everybody, including the developer, wants things done as quickly as possible.  What you frequently end up with is code that works great provided there's no real volume of data to work against because the developer didn't take the time to gen test data (pretty easy, actually, if you know what you're doing) or didn't take the time to see if they could do better.  The first "good enough" is hardly ever good enough and managers just don't understand how much time is expended fixing performance problems down the road.  It's much cheaper to do it right the first time but you need time for that.
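Genning test data really is straightforward; one common sketch (scratch table and value ranges are made-up assumptions) uses a cross join against a system table as a row source, with no loops or cursors:

```sql
-- Build a million rows of random-ish test data in one set-based statement.
SELECT TOP 1000000
    IDENTITY(INT, 1, 1)                         AS RowNum,
    ABS(CHECKSUM(NEWID())) % 1000 + 1           AS CustID,     -- 1..1000
    DATEADD(dd, ABS(CHECKSUM(NEWID())) % 3650,
            '2000-01-01')                       AS OrderDate,  -- ~10 years
    CAST(ABS(CHECKSUM(NEWID())) % 10000 AS MONEY) / 100 AS Amount
INTO dbo.TestOrders
FROM master.sys.all_columns ac1
CROSS JOIN master.sys.all_columns ac2;  -- cross join supplies plenty of rows
```

NEWID() per row keeps the values from repeating the way RAND() would in a single statement.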

Of course, the fourth and most important problem is the whole bloody relationship between management and developers... management wants the code to be done NOW and the developer likes the paycheck they're getting.  The developer knows that in the face of a tight schedule, they can write performance challenged code to "get the job done now" and that they'll be given more time later (really good for consultants) to fix the performance problems which, by the way, are perceived as a different set of problems by management.  Usually, neither management nor the developers are held accountable for the large expenditure of money and resources it takes to convert performance challenged, crap production code into something that isn't.  Again, that's where some performance standards and some volumetric data would come into play.

Measure all you want... have your developers become very skilled at measuring the performance of their code.  But, if you don't also train them as to what is possible/expected and actually give them the time to write performance based code, then the performance measuring tools and their results are just so many pretty lights on a Christmas tree... in July.



--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #332749
Posted Friday, September 26, 2008 1:02 PM
Mr or Mrs. 500

Can you please explain why these commands would not be good to run in a production environment?
Post #577121
Posted Friday, September 26, 2008 7:27 PM


SSC-Dedicated

Because they clear the cache... not just for the proc being run, but for everybody. Depending on what's being done, the cache is very important for the performance of repetitive and other code.
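The commands aren't quoted in the thread, but in this context they are presumably the cache-clearing DBCC statements from the article; a sketch of why they hurt a production box:

```sql
-- Presumed commands under discussion (assumption: the thread refers to these).
-- Both act server-wide, not per-session, so every connection pays the cost.

CHECKPOINT;             -- flush dirty pages so DROPCLEANBUFFERS clears everything
DBCC DROPCLEANBUFFERS;  -- empties the buffer pool: every query re-reads from disk
DBCC FREEPROCCACHE;     -- empties the plan cache: every proc recompiles on next call
```

On a busy server that means a burst of physical reads and compiles for all users, not just the query being measured, which is why they belong on a test box.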

Post #577230
Posted Saturday, September 27, 2008 7:23 AM
Mr or Mrs. 500

Thanks for your reply. My issue is that I'm having major performance problems from time to time, and the only thing that seems to fix them is running these commands. I think the problem could be parameter sniffing; how would I know for sure?
Post #577311
Posted Saturday, September 27, 2008 11:12 AM


SSC-Dedicated

For starters, you could try adding the "WITH RECOMPILE" option to the sprocs just to test the water. A better way would be to actually assume that parameter sniffing is, in fact, the problem, and fix it in the more troublesome code. Last, but certainly not least, is to check the execution plans under different conditions...
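The two fixes mentioned above look something like this (the proc and its tables are hypothetical, purely for illustration):

```sql
-- Option 1: recompile the whole proc on every call.
-- Simple to test with, but costs a compile per execution.
CREATE PROCEDURE dbo.GetOrders
    @CustID INT
WITH RECOMPILE
AS
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustID = @CustID;
GO

-- Option 2: defeat the sniffing by copying the parameter to a local
-- variable, so the optimizer plans from average density rather than
-- the first sniffed value.
CREATE PROCEDURE dbo.GetOrders2
    @CustID INT
AS
DECLARE @LocalCustID INT;
SET @LocalCustID = @CustID;

SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustID = @LocalCustID;
GO
```

Option 1 gives each call a plan tailored to its parameter; option 2 gives every call the same "generic" plan. Which is right depends on how skewed the data is.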

Post #577325
Posted Monday, September 29, 2008 7:13 AM
Mr or Mrs. 500

Hi, but we are talking about many stored procedures; how would I know which ones have the parameter sniffing problem? Do I have to check each one? I have a query I found which gives me the number of recompilations, writes, and reads; can this query help me find the stored procedures which have this issue?
This is the query:

SELECT
    SUBSTRING(st.text, (qs.statement_start_offset/2) + 1,
        ((CASE
             WHEN qs.statement_end_offset = -1
                 THEN LEN(CONVERT(NVARCHAR(MAX), st.text)) * 2
             ELSE qs.statement_end_offset
          END - qs.statement_start_offset)/2) + 1) AS statement_text
    ,qs.plan_generation_num AS recompiles
    ,qs.execution_count AS execution_count
    ,qs.total_elapsed_time - qs.total_worker_time AS total_wait_time
    ,qs.total_worker_time AS cpu_time
    ,qs.total_logical_reads AS reads
    ,qs.total_logical_writes AS writes
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
LEFT JOIN sys.dm_exec_requests r
    ON qs.sql_handle = r.sql_handle
ORDER BY qs.execution_count DESC


Can I run this query while I'm having the performance problem to see which stored procedure/query could be the one with the problems? We are talking about 30-40 transactions per minute.
Post #577695
Posted Monday, September 29, 2008 7:33 AM


SSChampion

I'd suggest running a trace while the events are occurring. Collect the data out to a file. You can then import it back into a table on a different server and run aggregate queries to identify which procedures are causing the most problems, say the top 10. Focus on them. Get them fixed and running well, then go and collect another trace. Repeat ad infinitum.
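The import-and-aggregate step described above can be sketched like this on the test server (the trace file path is an assumption, and the column names are the standard ones returned by fn_trace_gettable):

```sql
-- 1. Import the saved trace file into a table on a non-production server.
SELECT *
INTO dbo.TraceResults
FROM sys.fn_trace_gettable('C:\Traces\PerfTrace.trc', DEFAULT);

-- 2. Aggregate to find the 10 most expensive statements.
-- TextData comes back as NTEXT, so cast it before grouping.
SELECT TOP 10
    CAST(TextData AS NVARCHAR(MAX)) AS QueryText,
    COUNT(*)                        AS Executions,
    SUM(Duration)                   AS TotalDuration,
    SUM(Reads)                      AS TotalReads,
    SUM(CPU)                        AS TotalCPU
FROM dbo.TraceResults
GROUP BY CAST(TextData AS NVARCHAR(MAX))
ORDER BY TotalDuration DESC;
```

Sorting the same aggregate by TotalReads or TotalCPU instead gives a different "top 10", so it's worth looking at all three before picking what to tune.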

----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server Query Performance Tuning
SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #577709
Posted Monday, September 29, 2008 9:05 AM
Mr or Mrs. 500

thank you :)
Post #577777
Posted Wednesday, October 8, 2008 7:02 AM
Mr or Mrs. 500

Hi, just a quick question: I was able to identify some stored procedures with parameter sniffing and was able to fix them. After updating these stored procedures, should I do an UPDATE STATISTICS with FULLSCAN on my DB?
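For reference, the maintenance being asked about looks like this (table name hypothetical):

```sql
-- Per-table, reading every row rather than a sample:
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Or across the whole database with the supplied proc,
-- which uses default sampling and skips stats that are current:
EXEC sp_updatestats;
```

Worth noting: updating statistics invalidates cached plans for the affected tables, so the fixed procs will get fresh plans on their next execution either way.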
Post #582513