• Specifically, I learned a few things while tuning:

    We should be careful about creating temporary tables, dynamic SQL, and joins. Those things are a gold mine for tuners, as eliminating them can make your procedure much faster.

    Some of the others brought it up (I especially agree with Grant's and Kevin's posts) and I'll stress, again, that correct usage of Temp Tables and certain types of Dynamic SQL are, in fact, a gold mine for tuners... but only because they work so effectively when done correctly. I agree somewhat on the joins... you have to be careful that you don't build accidental "cross joins" or "triangular joins" and you have to make sure the predicates are sargable to make effective use of indexes. In fact, you can eliminate many huge join problems by, ironically, creating the right kind of Temp Tables for the right reasons. 😉 It's known as "Divide'n'Conquer".
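    To make the "Divide'n'Conquer" idea concrete, here's a minimal T-SQL sketch (the table and column names are hypothetical): instead of joining a huge detail table directly, pre-aggregate just the rows you need into a Temp Table and join to that.

```sql
-- Hypothetical schema: dbo.SalesDetail (large) and dbo.Customer (small).
-- Step 1: "Divide" - reduce the big table to a small, pre-aggregated set.
SELECT   CustomerID,
         SUM(SaleAmount) AS TotalSales
INTO     #CustomerTotals                    -- materialize the small result
FROM     dbo.SalesDetail
WHERE    SaleDate >= '20240101'             -- filter as early as possible
GROUP BY CustomerID;

-- An index on the Temp Table keeps the final join cheap.
CREATE UNIQUE CLUSTERED INDEX IX_Totals ON #CustomerTotals (CustomerID);

-- Step 2: "Conquer" - the final join now touches only the small set.
SELECT c.CustomerName,
       t.TotalSales
FROM   dbo.Customer c
JOIN   #CustomerTotals t ON t.CustomerID = c.CustomerID;
```

    The point is that the expensive aggregation happens once, against a narrow intermediate result, instead of being tangled into one monster join.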

    Be very careful about the functionality of the procedure: it should return the same results after tuning as before. It is not enough that SSMS returns the "Command(s) completed successfully." message. We should check all the tables used in the procedure, whether they have the same number of rows and whether the rows contain the same data. These are the kinds of checks we should perform after tuning.

    I'm really happy to see someone say this, especially a "beginner". One of the best things you can do is to set up and execute a series of tests using known data BEFORE you start tuning, and save the results. After tuning, run the exact same tests on the exact same data and compare the results to ensure that NOTHING has changed in the results.
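    One simple way to run that kind of before/after comparison (procedure, parameter, and temp table names below are hypothetical, and the two temp tables are assumed to already exist with the procedure's output shape) is to capture both result sets and diff them with EXCEPT in both directions:

```sql
-- Capture the results of the ORIGINAL procedure on known test data.
INSERT INTO #ResultsBefore EXEC dbo.SomeProc @SomeParam = 1;

-- ...apply the tuning changes, then capture the results again...
INSERT INTO #ResultsAfter  EXEC dbo.SomeProc @SomeParam = 1;

-- Diff in BOTH directions; EXCEPT returns rows present in the first
-- set but missing from the second. Both queries must return 0 rows
-- for the result sets to be identical.
SELECT * FROM #ResultsBefore EXCEPT SELECT * FROM #ResultsAfter;
SELECT * FROM #ResultsAfter  EXCEPT SELECT * FROM #ResultsBefore;
```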

    We always want to see the execution plan for each DML statement. The execution plan is the road map: it shows the path the optimizer decided would take the least time and the fewest resources.
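    Alongside the graphical plan (Ctrl+M in SSMS toggles "Include Actual Execution Plan"), these SET options print per-statement I/O and timing to the Messages tab, which makes before/after comparisons easy (the table below is hypothetical):

```sql
SET STATISTICS IO, TIME ON;   -- report reads and CPU/elapsed time per statement

SELECT OrderID, OrderDate
FROM   dbo.Orders             -- hypothetical table
WHERE  CustomerID = 42;

SET STATISTICS IO, TIME OFF;

-- The Messages tab then shows lines like:
--   Table 'Orders'. Scan count 1, logical reads 3, ...
```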

    I learned to try to avoid table scans. If the query optimizer chooses a table scan, it reads the whole table, which is usually far more time-consuming than an index seek. Where possible, create indexes where a table scan is being used (though, depending on the size of the table, the optimizer may still choose a table scan even when an index exists on the column; in that case the table should contain only a small number of rows).

    Yep... that's definitely a good idea. It'll also let you know which sections of code you might want to check for predicate sargability so an index can be more effectively used.
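    As a quick illustration of predicate sargability (hypothetical table, assuming an index on OrderDate): wrapping the indexed column in a function forces a scan, while comparing the bare column against a range allows a seek.

```sql
-- NOT sargable: the function on the column hides it from the index,
-- so every row must be evaluated (scan).
SELECT OrderID
FROM   dbo.Orders
WHERE  YEAR(OrderDate) = 2024;

-- Sargable rewrite: the same logic as a range on the bare column,
-- which lets the optimizer seek straight to the matching rows.
SELECT OrderID
FROM   dbo.Orders
WHERE  OrderDate >= '20240101'
  AND  OrderDate <  '20250101';
```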

    Always try to write all SQL statements in ANSI format. Generally, ANSI syntax does not reduce the logical reads, but it makes the code easier to understand.
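    For what it's worth, the readability point here is usually about moving join predicates out of the WHERE clause (hypothetical tables); both forms typically produce the identical plan:

```sql
-- Old-style comma join: the join predicate hides in the WHERE clause,
-- mixed in with the filters.
SELECT c.CustomerName, o.OrderID
FROM   dbo.Customer c, dbo.Orders o
WHERE  c.CustomerID = o.CustomerID
  AND  o.OrderDate >= '20240101';

-- ANSI-92 join: same result, but the join condition is explicit
-- and the filter stands on its own.
SELECT c.CustomerName, o.OrderID
FROM   dbo.Customer c
JOIN   dbo.Orders   o ON o.CustomerID = c.CustomerID
WHERE  o.OrderDate >= '20240101';
```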

    You'll find that statements like that have caused SQL "holy wars". Generally speaking, relegating your code to using only ANSI functionality is like relegating yourself to using only the 4 basic math functions of a scientific calculator and, IMHO, is some of the worst advice anyone could give. No matter what you do, SQL is currently NOT 100% portable nor do I believe it ever should be. It would be like telling people they shouldn't use certain add-ons for C because others may not buy them or may not understand them. Look at all the performance problems posted on this and dozens of other SQL forums and then decide to use ALL the power of SQL that is available no matter which dialect it may be. 😉

    In conclusion, this is my advice to all of you: do everything you want; there are no specific rules for tuning SQL statements.

    Very mixed message here, IMHO... You CAN'T do EVERYTHING you want because a lot of people want to write RBAR Cursors, RBAR While Loops, RBAR Recursion, and a wealth of other non-set-based sins into production code because they think it's "good enough" for a limited set of requirements. Do I agree that you should try those things to make sure that you shouldn't use them? Absolutely... and you should never just take someone's word for something. Always set up a test and prove a claim. "A Developer must NOT guess... a Developer must KNOW." Like they said on an NFL/GMC commercial, "Amateurs practice until they get it right... professionals practice until they can't get it wrong."
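    To put the RBAR point in code (hypothetical table): the cursor version below issues one UPDATE per row, while the single set-based statement does the same work in one pass.

```sql
-- RBAR: process one row at a time with a cursor.
DECLARE @OrderID INT;

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT OrderID FROM dbo.Orders WHERE Status = 'New';
OPEN cur;
FETCH NEXT FROM cur INTO @OrderID;
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE dbo.Orders SET Status = 'Queued' WHERE OrderID = @OrderID;
    FETCH NEXT FROM cur INTO @OrderID;
END;
CLOSE cur;
DEALLOCATE cur;

-- Set-based: one statement, one pass, same result.
UPDATE dbo.Orders
SET    Status = 'Queued'
WHERE  Status = 'New';
```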

    The ultimate goal is to reduce time and system resources. It is better to do it in some logical way and document what impact your changes have. This allows you to explain to your senior or testing team what you have done and why you have done it.

    That's SOOOOOO very true. Better than that, YOU have to understand it so YOU become a better Developer and know enough to not make the mistake in the first place. You also need to be able to prove to yourself that there's no reason to ever write code that is just "good enough" because you'll know the right way to do it the first time and without it ever taking any extra time. If you learn to do your job right, there's no such thing as "premature optimization"... it'll just happen. 😉

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)