How SQL Server Chooses the Type of Join

  • Does anyone know how to solve this problem with a function? It appears that using hints is the only option. The WITH RECOMPILE option is not supported for user-defined functions.

  • Wouldn't it be more effective to either call two different stored procedures (one for wide date ranges, one for narrow) or use the HINT option to force the JOIN type? That way you wouldn't incur the overhead of a RECOMPILE with every call of the stored procedure.

    There's a good writeup on HINTs here: http://www.microsoft.com/technet/prodtechnol/sql/2005/frcqupln.mspx
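
    For example, a query-level join hint could look like this (a minimal sketch; the Orders/OrderDetails tables and the date values are hypothetical):

    DECLARE @StartDate datetime, @EndDate datetime;
    SELECT @StartDate = '20050101', @EndDate = '20051231';

    SELECT o.OrderID, o.OrderDate
    FROM dbo.Orders AS o
    JOIN dbo.OrderDetails AS d ON d.OrderID = o.OrderID
    WHERE o.OrderDate BETWEEN @StartDate AND @EndDate
    OPTION (HASH JOIN); -- forces hash joins for every join in the query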

    Andy Hilliard
    Owl Creek Consulting

  • Just another option...

    Rather than using RECOMPILE in your procedure, using an OPTIMIZE FOR hint and specifying date parameters that give you your hash joins may give you better performance overall and avoid recompiling the procedure each time it runs.

    It would force the optimizer to avoid the loop joins all of the time, but if the performance of small date ranges is OK this way, your overall time to execute the entire set of batches may go down.
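
    For example (a minimal sketch, assuming a hypothetical dbo.GetOrders procedure with date-range parameters):

    CREATE PROCEDURE dbo.GetOrders
        @StartDate datetime,
        @EndDate datetime
    AS
    SELECT o.OrderID, o.OrderDate
    FROM dbo.Orders AS o
    WHERE o.OrderDate BETWEEN @StartDate AND @EndDate
    -- compile the plan as if a wide range were passed, which favors the hash joins
    OPTION (OPTIMIZE FOR (@StartDate = '20000101', @EndDate = '20051231'));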

  • Wow, thanks for the article. This is the kind of mystery that would have me banging my head against a wall for days. I was unaware of sp_recompile and WITH RECOMPILE. I'm going to research these options immediately.

    Thanks for saving my head 😀

  • There have been scripts posted on the site before to go through and recompile all stored procs and refresh all views. These have been particularly helpful for me in the past.

    Good article topic!

    Here is one of them:

    http://www.sqlservercentral.com/scripts/Maintenance+and+Management/31902/
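
    The core of such a script is just a loop over the procedures calling sp_recompile. A rough sketch of the idea, assuming SQL Server 2005 catalog views (the linked script is more complete):

    -- mark every stored procedure for recompilation on its next execution
    DECLARE @proc nvarchar(776);
    DECLARE proc_cursor CURSOR FOR
        SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name)
        FROM sys.procedures;
    OPEN proc_cursor;
    FETCH NEXT FROM proc_cursor INTO @proc;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC sp_recompile @proc;
        FETCH NEXT FROM proc_cursor INTO @proc;
    END
    CLOSE proc_cursor;
    DEALLOCATE proc_cursor;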

  • Another technique we use to disable the so-called "parameter sniffing", instead of WITH RECOMPILE, is to declare local variables immediately inside the stored procedure to replace the parameters. That way, the optimizer cannot "sniff" the particular set of parameter values passed in during the first execution to build the query plan, and instead uses typical values. So we do not run the risk of a plan optimized for "atypical" values. The advantage of this technique is plan reuse: the plan may not be optimal for every scenario, but it is probably good enough for all of them.

    We have tested that technique for some of our stored procedures and it seems to be working fine. We are weighing that against WITH RECOMPILE to decide when and where to use either.
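
    For example (a sketch of the pattern, again using a hypothetical date-range procedure):

    CREATE PROCEDURE dbo.GetOrders
        @StartDate datetime,
        @EndDate datetime
    AS
    -- copy the parameters into local variables; the optimizer cannot sniff these,
    -- so it estimates from average density instead of the first-call values
    DECLARE @LocalStart datetime, @LocalEnd datetime;
    SELECT @LocalStart = @StartDate, @LocalEnd = @EndDate;

    SELECT o.OrderID, o.OrderDate
    FROM dbo.Orders AS o
    WHERE o.OrderDate BETWEEN @LocalStart AND @LocalEnd;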

  • If I have a situation where Query Analyzer and .NET, both running with the same parameter values, take different amounts of time to finish (.NET is slower), can I assume that Query Analyzer and .NET are using different cached plans?

    If so, are both plans stored in a system table?

    Thanks,

    Luiz.

  • Hi friends,

    I am new to this, and my issue is that my stored procedure runs very fast in Management Studio, in less than 1-2 seconds, but when I try to get those results into a DataSet it takes 8-10 seconds. My application code is below:

    myConnection.Open();
    SqlDataAdapter MyAdapter = new SqlDataAdapter(myCommand);
    MyAdapter.Fill(ds);

    Can anyone suggest any ideas?

    Jino

  • Here's one possibility:

    If it's not parameter sniffing as mentioned above, where a bad execution plan makes the query slow, it might actually be the data and the time it takes to move it over the wire (or wireless).

    I believe that if I select 10 billion rows from a table with QA or SSMS, it doesn't really fetch all 10 billion rows. It executes the query, but only grabs the top few hundred rows, and as I scroll down it fetches more and more; it doesn't do it all at once.

    But if I try to move those 10 billion rows into a DataSet, it has to move all of them over the network instead of the top few hundred, and then the local machine has to allocate memory for the DataSet, which might get paged to the hard disk because there's not enough RAM.

    How big is the recordset you are grabbing?
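
    One quick check (a sketch, assuming you can run the procedure from SSMS; dbo.MyProc and its parameters are hypothetical): measure the server-side time by itself and compare it to what the application sees. If the server time is small, the difference is network transfer and DataSet population.

    SET STATISTICS TIME ON; -- reports compile and execution CPU/elapsed time per statement
    EXEC dbo.MyProc @StartDate = '20050101', @EndDate = '20051231';
    SET STATISTICS TIME OFF;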

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. with your description of the problem, we can provide a tested, verifiable solution to your question! asking the question the right way gets you a tested answer the fastest way possible!

  • I agree with Lowell, mostly. If jino.jose had said "Enterprise Manager" then I would not even have posted a reply. Management Studio does get the whole recordset. Open a big table, sit back, and watch your memory drain away.

    You are right about pushing the data "down the wire". That could be the culprit. The same goes for Luiz: even if both are using the same plan, they don't use the same libraries to fetch the result set.
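
    If you want to verify whether the two callers share a plan: on SQL Server 2005 the plan cache is exposed through DMVs rather than a plain system table (on 2000, master.dbo.syscacheobjects plays a similar role). A rough sketch, with a hypothetical filter:

    -- list cached plans alongside the text they were compiled from
    SELECT cp.usecounts, cp.cacheobjtype, cp.objtype, st.text
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
    WHERE st.text LIKE '%YourProcName%'; -- replace with the procedure in question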

    I remember the same controversy when ADO came out. More capable, takes more memory, works slower.

    ATB,
    Charles Kincaid
