Measuring SQL Performance
Chris Hedgate
SSCommitted

Group: General Forum Members
Points: 1579 Visits: 7
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/chedgate/m

--
Chris Hedgate
http://www.hedgate.net/
Contributor to the Best of SQL Server Central volumes
Articles: http://www.sqlservercentral.com/columnists/chedgate/
David.Poole
SSCarpal Tunnel

Group: General Forum Members
Points: 4391 Visits: 3175

Good primer!

If I have a stored procedure that does multiple tasks, is there a way of getting the overall cost of the entire procedure or do I have to manually tot up all the costs for each of the individual queries?



LinkedIn Profile

Newbie on www.simple-talk.com
msurasky
SSC Rookie

Group: General Forum Members
Points: 47 Visits: 1

Hi David:

Use the SP:Completed event in SQL Profiler... this should be enough for your needs.
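If you would rather stay in Query Analyzer than set up a trace, a simple timing wrapper gives the total cost of the whole call in one number. This is just a sketch; dbo.uspMyProc is a placeholder for your own procedure:

```sql
-- Time the entire procedure call, rather than summing per-statement costs.
DECLARE @start datetime;
SET @start = GETDATE();

EXEC dbo.uspMyProc;  -- hypothetical procedure under test

SELECT DATEDIFF(ms, @start, GETDATE()) AS elapsed_ms;
```

The SP:Completed event's Duration column in Profiler reports the same overall figure, with the advantage that it also catches calls made by the application.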





Aaron Dutton
SSC Veteran

Group: General Forum Members
Points: 227 Visits: 88

It would be nice if you also showed how to clear the procedure cache and buffer cache so you can run the tests over and over and compare. I think the commands are

DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
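Those are the right commands. A typical test-run preamble looks like the sketch below; note that a CHECKPOINT first is usually recommended so dirty pages are written out before the buffer cache is cleared:

```sql
-- Reset caches so each timing run starts cold.
-- Test/staging servers only: on production this degrades
-- performance for every user until the caches refill.
CHECKPOINT;              -- flush dirty pages so DROPCLEANBUFFERS clears everything
DBCC DROPCLEANBUFFERS;   -- empty the buffer (data page) cache
DBCC FREEPROCCACHE;      -- empty the compiled plan cache
```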





Steve Jones
SSC-Forever

Group: Administrators
Points: 40647 Visits: 18851
Great introductory article on looking at performance. The suggestions above are good, but most code might be improved if each developer took a few minutes to run these tools and see the relative performance of each query versus the time their code takes to run.

Follow me on Twitter: @way0utwest
Forum Etiquette: How to post data/code on a forum to get the best help
My Blog: www.voiceofthedba.com
Chris Hedgate
SSCommitted

Group: General Forum Members
Points: 1579 Visits: 7
David: Using the tools I mentioned, you will have to sum up the numbers yourself. As msurasky said, there are other tools that can give a better idea of the entire proc. Personally I am normally more interested in optimizing each of the specific queries inside the proc, but of course you could sometimes try different approaches for the entire proc and want to check which of them to use.

adutton: Very good advice. I thought about including more, but felt I would lose a little focus with that. Maybe those commands should have been mentioned; they could be included in a follow-up sometime, along with how to simulate production environments on dev servers.

--
Chris Hedgate http://www.hedgate.net/
Contributor to the Best of SQL Server Central volumes
Articles: http://www.sqlservercentral.com/columnists/chedgate/
David.Poole
SSCarpal Tunnel

Group: General Forum Members
Points: 4391 Visits: 3175

Thanks Chris,

I've been tuning a stored procedure that iterates down a tree structure to accumulate a list of IDs within that structure and then returns the details associated with those IDs.

I experimented with temp tables vs table variables and found that some parts of the procedure were faster with temporary tables and others were faster with table variables.

In this scenario I was interested in the overall performance of the procedure.

I spent quite a bit of time with Excel plotting the impact of one vs the other as the tree structure grew more complex.

In the end I decided that for shallow trees table variables were better, but for deep trees it was temporary tables that gained the edge.
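That matches the usual trade-off: table variables have no statistics and cause fewer recompiles, so they tend to win for small row counts, while temp tables have statistics the optimizer can use, which pays off as the set grows. A minimal sketch of the two variants (dbo.TreeNodes is a hypothetical source table):

```sql
-- Variant 1: table variable - no statistics, minimal recompilation;
-- often faster when the tree is shallow and the row count is small.
DECLARE @ids TABLE (id int PRIMARY KEY);
INSERT INTO @ids
SELECT id FROM dbo.TreeNodes WHERE parent_id = 1;

-- Variant 2: temp table - has statistics and can be indexed further;
-- tends to gain the edge as the accumulated ID list grows.
CREATE TABLE #ids (id int PRIMARY KEY);
INSERT INTO #ids
SELECT id FROM dbo.TreeNodes WHERE parent_id = 1;
DROP TABLE #ids;
```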



LinkedIn Profile

Newbie on www.simple-talk.com
Chuck Ritenour
SSC Rookie

Group: General Forum Members
Points: 44 Visits: 3
Can we take it a step further, i.e., an article that explains what to do with the results obtained, from both a query-rewrite and a hardware POV?
I ran a sample query that joined two tables, each with 200 million rows, looking to return 1001 records using a WHERE id BETWEEN clause. I don't think 4.6 seconds is bad, but I always look for better.
I'm heavily indexed on both these tables, and have just recently completed a rewrite of the entire process, from our front-end web search tool to the redesign of the database and all the associated stored procs. The data gets an update of about 700k records a week: indexing is dropped, records are inserted, and indexes are reapplied (and statistics re-computed). It works and seems to generate fast responses, but it takes 6-8 hours to process.
Like the others above, I found that for short data sets table variables are better, and temp tables are better for larger returns. My SANs are normalized (set up to be read/write neutral as opposed to one being faster than the other).
My overall questions (based on the results below) are:
How do I reduce logical reads, and what do they mean?
How do I reduce physical reads?
What's a read-ahead, and why did it need 916 when the result was 1001 records? It was queried on the primary clustered key; should it not have 'known' via the indexing, statistics and bookmarking, and not needed to read ahead?
The primary table is a 3rd normal form ID-only table in its own database; the secondary table is a loose 2nd normal form (lots of text, some IDs) in its own database consisting of one year's data (there are 7 secondary DBs).
Hope that was enough info to go on.
my results:
Table 'primary'. Scan count 1001, logical reads 4859, physical reads 6, read-ahead reads 7.
Table 'secondary'. Scan count 1, logical reads 1007, physical reads 0, read-ahead reads 916.

SQL Server Execution Times:
CPU time = 1734 ms, elapsed time = 4675 ms.
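For anyone wanting to reproduce output in this shape, it comes from the session settings the article covers, wrapped around the query under test. The join below is a hypothetical stand-in for the real query:

```sql
SET STATISTICS IO ON;    -- per-table scan counts, logical/physical/read-ahead reads
SET STATISTICS TIME ON;  -- CPU time and elapsed time

-- placeholder for the actual 1001-row range query:
SELECT p.id, s.details
FROM dbo.primary_table AS p
JOIN dbo.secondary_table AS s ON s.id = p.id
WHERE p.id BETWEEN 1000000 AND 1001000;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

As a rough guide to the numbers: logical reads are 8 KB pages read from the buffer cache, physical reads are pages fetched from disk on demand, and read-ahead reads are pages SQL Server prefetched from disk in anticipation of the scan.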


Thanks, and don't forget to Chuckle
Alex-217289
SSC Journeyman

Group: General Forum Members
Points: 81 Visits: 3
Though I was aware of all the tools mentioned, I always find myself looking for excuses not to bother. This was a great reminder that SQL Server provides various very useful tools for performance tuning.

Regarding DBCC DROPCLEANBUFFERS and DBCC FREEPROCCACHE, in my experience, these are useful in a testing/staging environment only. I don't believe you would want to do this in a production environment.


Cheers,

Alex


Rogue DBA
Grant Fritchey
SSCoach

Group: General Forum Members
Points: 19626 Visits: 32340

When I see words like "heavily indexed", red flags go up. How heavily indexed, and are all those indexes being used? Selectivity needs to be as high as possible for the query optimizer to use an index, generally above 95%. Are you getting excessive bookmark lookups (which double your I/O in lots of cases)? Are you getting hash joins instead of loop joins (you have to evaluate the query plan)? Are you getting index scans instead of index seeks (another indication that your indexes may not be selective enough, or that your joins are incorrect)? Does the table have a clustered index, and is the cluster placed on the column(s) that give you the best chance of avoiding bookmark lookups (the leading edge, the first column, of the index needs to cluster the data in logical groups that speed up access)? Get all these questions answered, then you can start to look at hardware as a mechanism to speed things up. The worst thing you can do to your queries (write or read) is place a bunch of indexes that are either not used or improperly used by the query optimizer.
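One quick way to check for scans versus seeks and for bookmark lookups is the text showplan, which shows the estimated plan without executing the query. A sketch, again with a placeholder query:

```sql
SET SHOWPLAN_TEXT ON;
GO
-- hypothetical query under investigation:
SELECT p.id, s.details
FROM dbo.primary_table AS p
JOIN dbo.secondary_table AS s ON s.id = p.id
WHERE p.id BETWEEN 1000000 AND 1001000;
GO
SET SHOWPLAN_TEXT OFF;
GO
-- In the output, look for "Index Seek" versus "Index Scan" / "Table Scan"
-- operators, and for "Bookmark Lookup" steps that double your I/O.
```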

Oh, and nice article. It is shocking the number of times you hear that it doesn't matter how fast a query runs on the database. It's good to see that other people are pushing the need for these types of checks as part of query writing. Now if we can just get people to think about the queries being written as they design tables too.



----------------------------------------------------
The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood...
Theodore Roosevelt

The Scary DBA
Author of: SQL Server Query Performance Tuning and SQL Server Execution Plans
Product Evangelist for Red Gate Software