SQLServerCentral is supported by Red Gate Software Ltd.
 

Performance testing and tuning for a beginner
Posted Sunday, November 16, 2008 9:36 PM


SSC-Addicted


Group: General Forum Members
Last Login: Tuesday, July 03, 2012 5:49 AM
Points: 418, Visits: 365
Comments posted to this topic are about the item Performance testing and tuning for a beginner
Post #603441
Posted Monday, November 17, 2008 5:19 AM
Valued Member


Group: General Forum Members
Last Login: Wednesday, February 02, 2011 8:04 AM
Points: 74, Visits: 77
That is a resourceful post. As you stated, there is no single approach to tuning a proc; it is purely a case-by-case issue. A variety of tests have to be run after baseline numbers have been taken, examining and comparing new readings of the various performance indicators with each modification. The fact is that you have to start somewhere and eliminate the indicators that do not improve with changes until, eventually, the real cause(s) of the performance problem are identified and tweaked. As stated, it could be problems with indexes, memory and I/O, temp tables, the network, how the T-SQL itself was written, contention for server resources, etc. I think one of the principal problems is identifying the real cause of the issue, and that has to do with using the right tools and understanding the tools in use.
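The baseline-then-compare loop described above can be sketched with SET STATISTICS IO/TIME; the proc name here is hypothetical:

```sql
-- Capture baseline numbers before making any change.
SET STATISTICS IO ON;   -- logical/physical reads per table
SET STATISTICS TIME ON; -- CPU and elapsed time

EXEC dbo.usp_GetOrderSummary @CustomerID = 42;  -- hypothetical proc

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
-- Re-run after each single modification and compare reads/CPU against
-- the baseline; keep only the changes that actually improve them.
```

Changing one thing at a time between measurements is what lets you eliminate the indicators that don't move.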
Post #603576
Posted Monday, November 17, 2008 5:24 AM
SSCertifiable


Group: General Forum Members
Last Login: Thursday, March 20, 2014 7:32 AM
Points: 5,191, Visits: 1,368
Nice article ...:)


Post #603578
Posted Monday, November 17, 2008 5:39 AM


SSC-Addicted


Group: General Forum Members
Last Login: Tuesday, July 03, 2012 5:49 AM
Points: 418, Visits: 365
Hi Addai and Anirban,

Many thanks for your valued review.
Post #603589
Posted Monday, November 17, 2008 5:54 AM
Forum Newbie


Group: General Forum Members
Last Login: Monday, November 17, 2008 10:55 PM
Points: 1, Visits: 2
Nice article... :) You've done a great job and great research. Keep it up; I'm waiting for the next article you said you'd write.
Post #603598
Posted Monday, November 17, 2008 7:25 AM


SSC Journeyman


Group: General Forum Members
Last Login: Wednesday, April 16, 2014 1:16 PM
Points: 80, Visits: 351
Well, I agree with your statement about being careful with temporary tables; don't ignore them. In many cases I have taken a complex multi-table query and broken it up into simpler queries with significantly less overall query cost. This is especially true when the original complex query contains a PK table scan on a very large table; joining it to a derived result set instead lets it perform an index seek on the large table. This is especially obvious when looking at the query plan and observing large intermediate result sets (thick lines) where there should not be any.
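A sketch of the rewrite described above, with hypothetical table names: materialize the small derived set into a temp table first, so the optimizer can seek into the large table instead of scanning it.

```sql
-- Original shape (commented): one big join that scans the large table.
-- SELECT ... FROM dbo.BigSales s
-- JOIN (SELECT ... FROM dbo.Orders ... GROUP BY ...) d ON d.OrderID = s.OrderID;

-- Rewrite: materialize the small intermediate result set first.
SELECT o.OrderID, SUM(o.Amount) AS Total
INTO #RecentOrders
FROM dbo.Orders AS o
WHERE o.OrderDate >= '20080101'
GROUP BY o.OrderID;

CREATE CLUSTERED INDEX IX_RecentOrders ON #RecentOrders (OrderID);

-- The join can now drive index seeks on the large table.
SELECT s.SaleID, r.Total
FROM #RecentOrders AS r
INNER JOIN dbo.BigSales AS s ON s.OrderID = r.OrderID;
```

In the query plan, the thick intermediate-result lines should shrink noticeably after a rewrite like this.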

Usually when I am given a stored proc to "tune", after cleaning it up (converting to the ANSI-92 standard for readability, SET NOCOUNT ON, moving DDL statements to the top of the stored proc, and fully qualifying object names), I analyze the query plan to see if it makes sense. As Microsoft frequently states, the query optimizer doesn't pick the best plan, just one that it thinks is good enough. My job then is to "help" the optimizer pick a better plan.
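The cleanup conventions listed above might look like this in a skeleton proc (all names hypothetical):

```sql
CREATE PROCEDURE dbo.usp_GetCustomerOrders
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;  -- suppress per-statement rowcount messages

    -- DDL at the top so it isn't interleaved with DML mid-proc.
    CREATE TABLE #Work (OrderID int PRIMARY KEY, Total money);

    -- ANSI-92 join syntax, fully qualified object names.
    INSERT INTO #Work (OrderID, Total)
    SELECT o.OrderID, SUM(od.Qty * od.Price)
    FROM dbo.Orders AS o
    INNER JOIN dbo.OrderDetails AS od ON od.OrderID = o.OrderID
    WHERE o.CustomerID = @CustomerID
    GROUP BY o.OrderID;

    SELECT OrderID, Total FROM #Work;
END;
```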

Usually I use profiler to determine which stored procs need attention, then the query plan to identify statements within the stored proc that need attention.
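As an alternative (or complement) to a Profiler trace, on SQL Server 2005 the plan-cache DMVs give a quick first cut at which statements are the most expensive by reads. This is a sketch, not a replacement for a proper trace:

```sql
-- Top statements by cumulative logical reads since they entered the cache.
SELECT TOP (10)
    qs.total_logical_reads,
    qs.execution_count,
    SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
        (CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
         END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;
```

Note the numbers are cumulative since the plan was cached, so a restart or plan eviction resets them.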

Also, don't forget to consider revising or adding indexes to help performance. The INCLUDE clause in SQL Server 2005 is a godsend for covering indexes.
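A minimal example of the INCLUDE clause mentioned above, with hypothetical table and column names:

```sql
-- Key column supports the seek; INCLUDEd columns cover the query
-- without widening the index key (SQL Server 2005+).
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
ON dbo.Orders (CustomerID)
INCLUDE (OrderDate, TotalDue);

-- This query can now be answered entirely from the index,
-- with no lookups back to the base table:
-- SELECT OrderDate, TotalDue FROM dbo.Orders WHERE CustomerID = 42;
```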

Cheers,
Mike Byrd


Mike Byrd
Post #603661
Posted Monday, November 17, 2008 9:26 AM
Ten Centuries


Group: General Forum Members
Last Login: Sunday, April 13, 2014 5:06 PM
Points: 1,426, Visits: 318
It's a nice topic.
Tuning is more than just applying "best practices" to the T-SQL, but that is the first step in trying to improve the performance of your system.
Many of our peers avoid T-SQL tuning because they have very good hardware supporting them behind the scenes, and it falls over when the database grows to millions of records and the queries start giving bad response times. That is when they start thinking about upgrading the hardware, when the solution could be tuning the code.

Greetings, everyone!
Post #603744
Posted Monday, November 17, 2008 9:27 AM


Grasshopper


Group: General Forum Members
Last Login: Friday, December 06, 2013 8:48 AM
Points: 21, Visits: 165
This was a great article.

Here is my experience. In my previous job we had a stored procedure (a batch job) that used to take around 6 hours to run and did a lot of work on a 35 GB database. Many times it used to fill up tempdb and crash. Here are a few things I did to fix it, and there was a great improvement (down to 1 hour 40 minutes).

There were a few very complex queries that joined 9 big tables. I broke those queries up, created a temp table, and used a WHILE loop over the temp table. I removed the dynamic SQL. I moved functions like AVG and SUM off the large query and into a subquery; remember, those functions are a lot of overhead. Of course, I had to create a few indexes too.

But yes, there are quite a few ways of tuning a stored procedure. My favorite is breaking joins up into several small queries, pumping the data into temp tables, and then joining those.

I am not sure if this is the best way, but it has always worked for me ;)
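The aggregate move described above can be sketched like this (hypothetical names): compute the aggregate once over a small set and materialize it, rather than letting AVG/SUM run across the full multi-table join.

```sql
-- Before (commented): aggregating across the whole nine-table join.
-- SELECT p.ProductName, AVG(od.Price) ...
-- FROM dbo.Products p JOIN ... eight more tables ... GROUP BY p.ProductName;

-- After: pre-aggregate into a temp table, then join the small result.
SELECT od.ProductID, AVG(od.Price) AS AvgPrice
INTO #ProductAvg
FROM dbo.OrderDetails AS od
GROUP BY od.ProductID;

SELECT p.ProductName, pa.AvgPrice
FROM dbo.Products AS p
INNER JOIN #ProductAvg AS pa ON pa.ProductID = p.ProductID;
```

Whether this wins depends on the data, so compare the reads before and after, as earlier posts suggest.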

R A J






Post #603745
Posted Monday, November 17, 2008 10:12 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Tuesday, April 15, 2014 6:00 PM
Points: 141, Visits: 291
Points for writing and posting the article, and a good start.

For small-ish, in-house corporate applications, tuning seems to be ignored or overlooked. Even with externally facing internet applications, I've found performance and tuning only become important when problems arise. Keep tuning.


The more you are prepared, the less you need it.
Post #603806
Posted Monday, November 17, 2008 11:11 AM


SSC Eights!


Group: General Forum Members
Last Login: Monday, March 10, 2014 9:39 AM
Points: 942, Visits: 1,050
Nice article :).

I spend a lot of time on performance tuning; it is my favorite activity. I find many people don't like to pay attention to it. I find biggest problems in vendor applications and access developers applications. Microsoft Access developers don't think about all the affects when they are working on the dataset, at least not the concurrent data access. So I often find developers selecting the entire table. So doesn't matter how many indexes, covering indexes, or include indexes you built it will be slow. And I usually get a reply when Microsoft Access could do it ... Sure it can when you only have 10k records with only few users hitting it. I usually prove to them the point using READS, and lots of diagrams on Indexes and how SQL Server stores information.

As you said, you focused on READS; I look at those too. In addition, I look at the number of rows returned. On a recent application I tuned, the application was generating 150K reads, and I thought, great, another full-table select. But looking at the rows returned, it was fewer than 100 rows, so that didn't turn out to be the issue. In this case the developer was at least being nice, but if you average the reads over the rows, that's 1,500 logical reads per row. The best case would be 2-3 reads per row, given how the data is accessed. In my case it turned out a function call was the problem; an RBAR issue.
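The RBAR pattern described above typically looks like a scalar function invoked once per returned row; rewriting it as a join removes the hidden per-row reads. A hypothetical sketch:

```sql
-- RBAR (commented): the scalar function runs once for every row,
-- adding hidden reads on each call.
-- SELECT o.OrderID, dbo.fn_GetCustomerName(o.CustomerID)
-- FROM dbo.Orders AS o;

-- Set-based rewrite: one join replaces thousands of function calls.
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
INNER JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID;
```

The per-row function cost also tends to hide from the query plan's cost estimates, which is why the reads-per-row arithmetic above is such a useful sanity check.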

Again, good article; just my two cents on the tuning I've seen ;). As you said, there is no set way of tuning. I usually tell people at work: I can't teach you tuning; it is experience and art, and you learn it. And I usually hand them "Dissecting SQL Server Execution Plans" by Grant Fritchey. A really good book for learning the basics :).

Thanks.


---

Mohit K. Gupta, MCITP: Database Administrator (2005), My Blog, Twitter: @SQLCAN.
Microsoft FTE - SQL Server PFE

* Sometimes it's the search that counts, not the finding...
* I didn't think so, but if I was wrong, I was wrong. I'd rather do something, and make a mistake than be frightened and be doing nothing.


How to ask for help .. Read Best Practices here.
Post #603834

