Calling SP through SSMS vs .NET app

  • I have a stored proc. If I run it through SSMS, it returns data within 3 secs. But if I run it through the .NET app, it takes about a minute or so. I checked Activity Monitor; the process actually goes into a suspended state. Any idea?

  • My guess is lots of rows that the app is handling one by one, or what Jeff Moden calls RBAR. In SSMS it just takes the rows and displays them; no real processing.

    CEWII

  • What would be the solution? It really only returns 72 rows.

  • Well, what is the .NET app doing with those 72 rows?

    CEWII

  • It just calls the sp and shows the results in the grid.

  • SQL_Surfer (2/24/2012)


    It just calls the sp and shows the results in the grid.

    How is it loading into the grid? DataAdapter.Fill into a DataTable, then binding the grid to the DataTable?

    Is it using a SQL statement to build an "EXEC MyStoredProc" string, or is it using a command object with parameters?

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Yes, a DataAdapter, using a command object.

  • Another clue: the SP makes use of a table variable. Would changing it to a temp table help? The same sp on another server returns data to the app within a few secs. Makes me think it could be some server setting. Any idea?

  • SQL_Surfer (2/25/2012)


    Another clue: the SP makes use of a table variable. Would changing it to a temp table help? The same sp on another server returns data to the app within a few secs. Makes me think it could be some server setting. Any idea?

    Table variables are good when you have few rows in them; my rule of thumb is that 1000 rows or more calls for a temp table instead of a table variable.

    Besides that, you might not need a temp anything at all; you could post the proc for a little peer review.

    You already ruled out stale statistics, right? It could possibly be parameter sniffing too; have you looked at that? (OPTIMIZE FOR UNKNOWN)
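
    Roughly, the swap and the hint look like this; the proc, table, and column names here are made up for illustration:

    -- Table variable: no column statistics; the optimizer typically estimates ~1 row.
    DECLARE @Results TABLE (OrderID int PRIMARY KEY, Amount money);
    GO
    -- Temp table version: gets statistics, so estimates stay reasonable as row counts grow.
    CREATE TABLE #Results (OrderID int PRIMARY KEY, Amount money);
    GO
    -- If parameter sniffing is in play, the hint goes on the statement that uses the parameter:
    CREATE PROCEDURE dbo.usp_GetOrders @CustomerID int
    AS
        SELECT OrderID, Amount
        FROM   dbo.Orders
        WHERE  CustomerID = @CustomerID
        OPTION (OPTIMIZE FOR UNKNOWN);
    GO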

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • One thing I didn't understand is, if the sp returns fine on one server, why would it need a revision on another server?

  • The code might be the same, but the execution plan is not.

    That's the definitive way to review it: get the execution plan from each server and compare them. Post them here if you want; we can definitely help you with that.

    Performance issues like this could be due to statistics, which are based on the actual data on each server.

    It could be that one server has indexes the other doesn't.

    It could also be the specific values being used: if the procedure was compiled assuming one value but is called with a different value than the tailored execution plan assumed, you can see a performance hit.
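
    If it helps, here is a minimal sketch for pulling the cached plan for the proc on each server so the two can be compared side by side (MyStoredProc is a placeholder name):

    -- Cached plan and runtime stats for one proc on this server; run on both servers,
    -- save each query_plan as a .sqlplan file, and compare the two.
    SELECT  DB_NAME(ps.database_id)                     AS database_name,
            OBJECT_NAME(ps.object_id, ps.database_id)   AS proc_name,
            ps.execution_count,
            ps.total_elapsed_time / ps.execution_count  AS avg_elapsed_microseconds,
            qp.query_plan
    FROM    sys.dm_exec_procedure_stats AS ps
    CROSS APPLY sys.dm_exec_query_plan(ps.plan_handle)  AS qp
    WHERE   OBJECT_NAME(ps.object_id, ps.database_id) = 'MyStoredProc';  -- placeholder name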

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Thanks. But SSMS returns it fine within 3 secs on both servers. The problem on one of the servers is when the .NET app calls it. All the indexes suggested by the Tuning Advisor and the stats are already in place on both servers. It's very strange. Makes me think it's some kind of database or server setting. But I am not sure. I will post the execution plan when I get a chance.

  • Hi, I have attached the exec plan. I also noticed the database setting Auto Update Statistics is set to False. Not sure if this needs to be set to True.

    Thanks for your help.

  • There we go;

    I see a few things that can help improve performance.

    First, it's designed to be a catch-all query;

    read Gail's article on the performance issues here:

    http://sqlinthewild.co.za/index.php/2009/03/19/catch-all-queries/
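
    For anyone following along, the pattern she describes looks roughly like this (table, column, and parameter names are made up), and one of the fixes from the article is OPTION (RECOMPILE), so each call is compiled for the parameter values actually supplied:

    -- Catch-all / optional-parameter search (hypothetical names):
    DECLARE @CustomerID int = NULL, @Status varchar(20) = 'Open';
    SELECT  OrderID, CustomerID, Status, OrderDate
    FROM    dbo.Orders
    WHERE   (@CustomerID IS NULL OR CustomerID = @CustomerID)
        AND (@Status     IS NULL OR Status     = @Status)
    OPTION (RECOMPILE);   -- plan is built for the values passed in on this call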

    I see there are a couple of function calls; I'd like to make sure that the two functions are inline table-valued functions, as the performance difference can be an order of magnitude or more if they're rewritten that way.
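
    As a rough illustration of the difference (function, table, and column names are made up), a multi-statement version versus an inline rewrite:

    -- Multi-statement TVF: the optimizer treats the result as a fixed-cardinality black box.
    CREATE FUNCTION dbo.fn_OrdersByCustomer_MS (@CustomerID int)
    RETURNS @t TABLE (OrderID int, OrderDate datetime)
    AS
    BEGIN
        INSERT INTO @t (OrderID, OrderDate)
        SELECT OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = @CustomerID;
        RETURN;
    END;
    GO
    -- Inline TVF: expanded into the calling plan like a view, so real statistics are used.
    CREATE FUNCTION dbo.fn_OrdersByCustomer_Inline (@CustomerID int)
    RETURNS TABLE
    AS
    RETURN
        SELECT OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = @CustomerID;
    GO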

    I see there is a suggested/missing index that SQL believes will help:

    CREATE NONCLUSTERED INDEX [<Name of Missing Index, sysname,>]

    ON {snip by request}

    I'll look at the execution plan now and see if I see any other issues.

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement for that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Added that index, but no significant improvement. So the database setting Auto Update Statistics being set to False doesn't impact anything?
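
    For reference, the setting can be checked and switched back on like this, and the statistics refreshed manually in the meantime (YourDatabase is a placeholder name):

    -- Check whether auto update / auto create statistics are enabled (1 = on):
    SELECT  name, is_auto_update_stats_on, is_auto_create_stats_on
    FROM    sys.databases
    WHERE   name = 'YourDatabase';   -- placeholder database name

    -- Turn auto update statistics back on:
    ALTER DATABASE YourDatabase SET AUTO_UPDATE_STATISTICS ON;

    -- Or refresh every table's statistics in the current database right away:
    EXEC sys.sp_updatestats;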
