• I believe that the true beginner's tuning guide boils down to:

    A) Pop open Profiler with the CPU, Reads, Writes, and Duration columns on the SQL:BatchCompleted event, filtered on the SPID of your session (shown at the bottom of your SSMS/Query Analyzer window, or returned by SELECT @@SPID; see the snippet after this list)

    B) Write your query in several different ways

    C) Against your full production-sized dataset, execute each version about three times; note that the first run is likely to show different stats than the immediately subsequent runs, because it starts with cold caches

    D) Learn; get a feel for your particular environment.
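
    As a side note for items A and C above, a minimal sketch: SELECT @@SPID returns the session ID to filter the trace on, and clearing the caches between runs is what makes a "first run" repeatable (never do this on a production server):

        -- Item A: the SPID to filter the Profiler trace on.
        SELECT @@SPID AS current_spid;

        -- Item C: the first run pays for physical reads and plan compilation.
        -- To make every run start cold, clear the caches between runs
        -- (test servers only!):
        CHECKPOINT;            -- flush dirty pages so the buffer pool can empty fully
        DBCC DROPCLEANBUFFERS; -- empty the buffer pool (forces physical reads)
        DBCC FREEPROCCACHE;    -- discard cached plans (forces recompilation)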

    #Temp tables (with and without indexes, and with those indexes created or rebuilt at different points in the code) vs. @table variables vs. CTEs vs. derived tables vs. simple joins.
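
    To make that concrete, here's the same aggregate written four of those ways, against a hypothetical schema invented purely for illustration — Orders(OrderID, CustomerID, Amount) and Customers(CustomerID, Region); note the CTE form needs 2005 or later:

        -- 1) Derived table (works on 2000)
        SELECT c.Region, t.Total
        FROM Customers AS c
        JOIN (SELECT CustomerID, SUM(Amount) AS Total
              FROM Orders
              GROUP BY CustomerID) AS t
          ON t.CustomerID = c.CustomerID;

        -- 2) CTE (2005 and later)
        WITH t AS (
            SELECT CustomerID, SUM(Amount) AS Total
            FROM Orders
            GROUP BY CustomerID
        )
        SELECT c.Region, t.Total
        FROM Customers AS c
        JOIN t ON t.CustomerID = c.CustomerID;

        -- 3) #Temp table (materializes the intermediate result)
        SELECT CustomerID, SUM(Amount) AS Total
        INTO #Totals
        FROM Orders
        GROUP BY CustomerID;

        SELECT c.Region, t.Total
        FROM Customers AS c
        JOIN #Totals AS t ON t.CustomerID = c.CustomerID;

        DROP TABLE #Totals;

        -- 4) @table variable (no statistics, so best for small row counts)
        DECLARE @Totals TABLE (CustomerID int PRIMARY KEY, Total money);
        INSERT INTO @Totals (CustomerID, Total)
        SELECT CustomerID, SUM(Amount)
        FROM Orders
        GROUP BY CustomerID;

        SELECT c.Region, t.Total
        FROM Customers AS c
        JOIN @Totals AS t ON t.CustomerID = c.CustomerID;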

    Personally, I find that on 2000, at least, derived tables hold an advantage for simple queries that need them, and #temp tables hold an advantage for complex queries that need them (even if only used once). Indexing those temp tables is sometimes a big win (especially if the table is used more than once), sometimes a loss, and sometimes it simply breaks even; see the sketch below.
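
    A minimal sketch of that indexing experiment, reusing the hypothetical Orders table from above; the things to time are whether the index is created before or after the load, and whether it pays off at all:

        -- Load first, index after (swap the two statements to time the other order).
        CREATE TABLE #Totals (CustomerID int NOT NULL, Total money NOT NULL);

        INSERT INTO #Totals (CustomerID, Total)
        SELECT CustomerID, SUM(Amount)
        FROM Orders
        GROUP BY CustomerID;

        -- Indexing after the load is often cheaper than loading through the index,
        -- but only your own timings will tell you whether the index helps at all.
        CREATE CLUSTERED INDEX IX_Totals ON #Totals (CustomerID);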

    But the basic things remain the same:

    Use Profiler

    Use a full-size dataset (or bigger, if you're planning for significant growth)

    Try different things.

    If you're able to, pull up Performance Monitor (perfmon) or Resource Monitor (resmon) as well, and watch how CPU, disk, and network traffic flow while your queries run.
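
    On 2005 and later you can also sample many of the same counters from inside SQL Server through the sys.dm_os_performance_counters DMV; the two counters below are just examples of what's exposed:

        -- Sample a couple of perfmon counters from T-SQL (2005+).
        -- Note: 'Batch Requests/sec' is cumulative; sample twice and diff.
        SELECT object_name, counter_name, cntr_value
        FROM sys.dm_os_performance_counters
        WHERE counter_name IN ('Page life expectancy', 'Batch Requests/sec');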

    Every environment is different and must be learned in detail. Maybe CPU is more important, maybe Reads+Writes are more important; it depends (before 2008's compression feature, I almost never saw CPU be the bottleneck; on 2008, I've seen index rebuilds with page compression become CPU-constrained even with 16 CPU cores available).
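
    For reference, the kind of 2008 operation I mean (dbo.BigTable is a stand-in for one of your own large tables):

        -- SQL Server 2008+: rebuild with page compression; watch CPU while it runs.
        ALTER INDEX ALL ON dbo.BigTable
        REBUILD WITH (DATA_COMPRESSION = PAGE);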