Stored procedure very slow execution

  • Hello,

    Please help me, I'm desperate.

I have an old SQL 2000 (SP4) - and no, I can't upgrade it, it's on my customer's server - on which I have a very, very strange issue that started on 01/01/2013. Until then, this problem never appeared.

I have a data export stored procedure which fills a temporary table from a quite complicated select (multiple subqueries, with data casts, sums and so on) and exports the final content into another table in a specific format. If I execute this s/p, either from the calling application code or from Query Analyzer (exec ...), it takes a very long time to execute (1.5 - 2 hours). If I check the status with sp_who2, I see that the process appears to be blocked by itself on the "INSERT INTO" of the temporary table. But if I directly execute the s/p code from Query Analyzer, the result comes back almost instantly (5-10 seconds).

I tried to simplify the main query and to remove some subqueries and sums, with no result so far.

Has anybody faced this problem? Is there any solution, short of re-writing the whole s/p from scratch?

Interesting fact: I have a similar (not exactly the same, but the same kind of) stored procedure, which retrieves similar data from the exact same tables in a similar way for a report, and that one works fine...

    Thanks in advance for any hint that may get me out of this problem.

    ___________________________________________________________________
    Is better shut up and look stupid than open your mouth and eliminate any doubt

  • Radu Costescu (1/25/2013)


If I execute this s/p, either from the calling application code or from Query Analyzer (exec ...), it takes a very long time to execute (1.5 - 2 hours).

But if I directly execute the s/p code from Query Analyzer, the result comes back almost instantly (5-10 seconds).

What exactly are you checking from Query Analyzer? Running the procedure with parameters, or the code inside the procedure?

If the code inside the procedure runs perfectly but the procedure does not, then try recompiling the procedure and check the result.

  • sqlnaive (1/25/2013)

What exactly are you checking from Query Analyzer? Running the procedure with parameters, or the code inside the procedure?

I tried both. Running the procedure with parameters (exec sp_name 'yyyy-mm-dd','xxxxxx') results in slow execution; executing the code itself (declare @P1 datetime, @P2 varchar(10) set @P1='yyyy-mm-dd' set @P2='xxxxxx' select ... etc) gives immediate results.

    sqlnaive (1/25/2013)

If the code inside the procedure runs perfectly but the procedure does not, then try recompiling the procedure and check the result.

I tried to run exec sp_name 'yyyy-mm-dd','xxxxxx' with recompile (after I dropped the s/p and re-created it, to be sure), but unfortunately there is still no improvement 🙁
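(For reference, on SQL 2000 there are a couple of other ways to force a fresh plan besides EXEC ... WITH RECOMPILE; a sketch, using the same placeholder names as my example above:)

```sql
-- Flag the procedure so its cached plan is discarded and rebuilt on next execution
EXEC sp_recompile N'dbo.sp_name'

-- Or bake recompilation into the procedure itself, so it compiles a new plan on every call
CREATE PROCEDURE dbo.sp_name
    @P1 datetime,
    @P2 varchar(10)
WITH RECOMPILE
AS
    ...
```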

    ___________________________________________________________________
    Is better shut up and look stupid than open your mouth and eliminate any doubt

Sounds like a bad query plan, or a case of parameter sniffing not selecting the optimal plan for the parameters.

There's a simple way to test if it's parameter sniffing: create internal parameters (local variables) in the SP and use those in the query.

Eg

CREATE PROCEDURE SP_NAME
    @P1 Datetime,
    @P2 Varchar(100)
AS
DECLARE @Date DateTime
DECLARE @Text Varchar(100)
SET @Date = @P1
SET @Text = @P2

SELECT *
FROM [Table]
WHERE Col1 = @Date
  AND Col2 = @Text

Hope that gives you the gist. As a quick fix in 2008 you could also add an OPTION(RECOMPILE) hint at the end of the query; HOWEVER, this is a short-term fix in order to give you breathing space to put something more robust in place.

Sorry, the struck-through part above is 2008 specific - I just noticed you're on SQL 2000.

I would suggest reading Gail's blogs on parameter sniffing http://sqlinthewild.co.za/index.php/2007/11/27/parameter-sniffing/

    and Catch all queries http://sqlinthewild.co.za/index.php/2009/03/19/catch-all-queries/

Hope these help.

    _________________________________________________________________________
    SSC Guide to Posting and Best Practices

  • I agree. It sounds like parameter sniffing is leading to issues. I would suggest updating the statistics on the tables being referenced. In addition to the solutions for parameter sniffing, just getting better statistics frequently solves the issue.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

Grant, in addition to this, does that mean just declaring internal parameters won't resolve parameter sniffing? Is updating stats mandatory (I know there's always an "as per situation" criterion, but in a generalized way)? I am asking this because at times it becomes hard to update stats on big tables on a daily basis in a prod environment. In that case, what would you suggest?

  • Grant Fritchey (1/25/2013)


    I agree. It sounds like parameter sniffing is leading to issues. I would suggest updating the statistics on the tables being referenced. In addition to the solutions for parameter sniffing, just getting better statistics frequently solves the issue.

I ruled out the parameter sniffing possibility, since within the code I have no such condition (@P1='yyyy-mm-dd' or @P1 is null). I will take the second suggested path, updating the statistics (the involved tables are really big and the customer does not care much about maintaining the database), and let you know.

    Thank you all for the contributions and interest.

    ___________________________________________________________________
    Is better shut up and look stupid than open your mouth and eliminate any doubt

  • That last post makes me think of indexes. Since you changed the code to make things simpler and the database isn't maintained, check to see if the procedure is using any of your indexes. If, say, a change resulted in a full table scan, that would explain the performance degradation. It could be that a single covering index isn't being used any more. If it is using the indexes properly, then definitely go with updating the statistics. Just food for thought.
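(To check that without running the two-hour export, on SQL 2000 you can look at the estimated plan and at fragmentation; a sketch, where dbo.sp_name and dbo.BigTable stand in for the actual procedure and one of the large tables:)

```sql
-- Show the estimated plan for the procedure call without executing it;
-- look for Table Scan operators where you expect Index Seeks
SET SHOWPLAN_TEXT ON
GO
EXEC dbo.sp_name 'yyyy-mm-dd', 'xxxxxx'
GO
SET SHOWPLAN_TEXT OFF
GO

-- Check fragmentation of all indexes on a table (SQL 2000 syntax)
DBCC SHOWCONTIG ('dbo.BigTable') WITH ALL_INDEXES, TABLERESULTS
```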

  • sqlnaive (1/25/2013)


Grant, in addition to this, does that mean just declaring internal parameters won't resolve parameter sniffing? Is updating stats mandatory (I know there's always an "as per situation" criterion, but in a generalized way)? I am asking this because at times it becomes hard to update stats on big tables on a daily basis in a prod environment. In that case, what would you suggest?

OK, now I will have to admit my lack of knowledge in SQL Server administration, but it is better to look stupid than to be stupid. So, given that the customer never updated the statistics for that database (which is a large one, and the tables involved are the three largest within it), can I run the update of statistics for these tables in a live (production) environment without jeopardizing user activity? Or do I need special conditions (like the DB in Single User mode or such)? Thanks again.

    ___________________________________________________________________
    Is better shut up and look stupid than open your mouth and eliminate any doubt

You can do a targeted UPDATE STATISTICS on single tables; BOL should help - look up the UPDATE STATISTICS T-SQL page.

You can, I believe, run an EXEC sp_updatestats inline as well.
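(A minimal sketch of both approaches - dbo.BigTable is a placeholder for one of the three large tables:)

```sql
-- Targeted update of one table's statistics; FULLSCAN is the most accurate
-- but also the heaviest option, so consider a sampled update on huge tables
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN

-- Or a sampled update of statistics on every table in the database
EXEC sp_updatestats
```

As far as I know both run online with shared locks while scanning, so expect extra I/O load during the update but no need for single-user mode.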

However, I would also suggest that a rebuild of indexes be run on a weekly basis during quiet times; 12-5am local is generally a good window.

There are also targeted index rebuild scripts available online that will hit the worst offenders on a nightly basis (say over 60% fragmentation).

You can rebuild nonclustered indexes on the fly, though it will impact performance a little, using something like

ALTER INDEX <index> ON <Table> REBUILD WITH (ONLINE = ON)

(check the syntax in BOL).

I wouldn't recommend doing this with a clustered index, though.
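(Since this thread is on SQL 2000, where ALTER INDEX doesn't exist, the equivalents there would be along these lines - table and index names are placeholders:)

```sql
-- SQL 2000: offline rebuild of all indexes on a table (0 = keep current fill factor)
DBCC DBREINDEX ('dbo.BigTable', '', 0)

-- Lighter-weight, mostly-online alternative: defragment a single index
-- (first argument 0 = current database)
DBCC INDEXDEFRAG (0, 'dbo.BigTable', 'IX_BigTable_Date')
```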

    _________________________________________________________________________
    SSC Guide to Posting and Best Practices

  • As a word of caution, I believe you can do online index rebuild only with the Enterprise Edition of SQL Server. If you have EE, great. If not, taking down the index on a very large table will most likely result in the table being effectively offline. You aren't going to efficiently query 3M rows without lots of disk I/O.

    If the table has to remain online, consider reorganizing the indexes instead of rebuilding. Doing a rebuild generally yields the best results, but reorganizing is sometimes necessary.

Another word of caution: if you alter your clustered index, all nonclustered indexes will be dropped, then the clustered index is changed and rebuilt, then the nonclustered indexes get recreated. This is necessary because the clustered index key is inherited by all the nonclustered indexes. This can take considerable time and should always be tested on a test system first.

    I completely understand being under the gun to get something fixed, but please don't rush too much. Think it through, make sure you understand how it works and test it out.

  • sqlnaive (1/25/2013)


    Grant, in addition to this, does it means just declaring internal parameters won't resolve parameter snififng ? Is updating stats mandatory (I know there's always "as per situation" criteria but in generalized way) ? I am asking this because at times it becomes hard to update stats on big tables on daily basis in prod environment. In that case what would you suggest ?

"Internal parameters" are called variables, and SQL Server can't sniff variables. The difference is that with a parameter on a stored procedure you get a precise seek against the statistics based on the value provided, while with a variable you get a sampled average of the statistics. The question is: are your queries better off with precise values or with averages? I can't tell you; you have to investigate it. The other issue comes into play when the statistics are bad (out of date, or sampled instead of a full scan). Then the precise seek returns a value, but it's an imprecise value, leading the optimizer to make bad choices again. So if you have lots of data being added to these tables and your statistics are going out of date (and this becomes especially true when dealing with date-based data that usually filters for the latest information), then your query plans are going to go wonky. Your solutions are limited: update the stats.

    Other than that... you can get into trying weird stuff. If the date range being searched against regularly is the latest data, then create a filtered index that includes today's date. Then the data set will be very small and you could update the statistics several times during the day (assuming the automatic updates don't do it for you).
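(A sketch of that idea, for SQL 2008+ only - filtered indexes don't exist on 2000, and the filter predicate must be a literal, so the index would need periodic recreation to track "today"; all names here are hypothetical:)

```sql
-- Small filtered index covering only the most recent data
CREATE NONCLUSTERED INDEX IX_Orders_Recent
    ON dbo.Orders (OrderDate, CustomerCode)
    WHERE OrderDate >= '20130101'

-- Its statistics describe a small set, so refreshing them repeatedly is cheap
UPDATE STATISTICS dbo.Orders IX_Orders_Recent WITH FULLSCAN
```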

But the bottom line is, if you want fast queries, you need good statistics, and there are very few options around that.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Ed Wagner (1/25/2013)


    As a word of caution, I believe you can do online index rebuild only with the Enterprise Edition of SQL Server.

and only in SQL 2005 and above, so on SQL 2000 it's offline index rebuilds.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Grant Fritchey (1/25/2013)


    sqlnaive (1/25/2013)


Grant, in addition to this, does that mean just declaring internal parameters won't resolve parameter sniffing? Is updating stats mandatory (I know there's always an "as per situation" criterion, but in a generalized way)? I am asking this because at times it becomes hard to update stats on big tables on a daily basis in a prod environment. In that case, what would you suggest?

"Internal parameters" are called variables, and SQL Server can't sniff variables. The difference is that with a parameter on a stored procedure you get a precise seek against the statistics based on the value provided, while with a variable you get a sampled average of the statistics. The question is: are your queries better off with precise values or with averages? I can't tell you; you have to investigate it. The other issue comes into play when the statistics are bad (out of date, or sampled instead of a full scan). Then the precise seek returns a value, but it's an imprecise value, leading the optimizer to make bad choices again. So if you have lots of data being added to these tables and your statistics are going out of date (and this becomes especially true when dealing with date-based data that usually filters for the latest information), then your query plans are going to go wonky. Your solutions are limited: update the stats.

    Other than that... you can get into trying weird stuff. If the date range being searched against regularly is the latest data, then create a filtered index that includes today's date. Then the data set will be very small and you could update the statistics several times during the day (assuming the automatic updates don't do it for you).

But the bottom line is, if you want fast queries, you need good statistics, and there are very few options around that.

For the big tables which are constantly affected by DML, and whose statistics are therefore constantly impacted, what should the solution be in a production environment? We have enabled automatic updates of statistics on those tables, but I believe these kick in only based on the number of records affected by the DML, and/or on a percentage basis.

Yes, the default behavior is that 20% of the data must be modified before statistics are updated automatically (for those desperate to correct me: yes, I know the formula is more complex than that, but when it comes to very large tables this is effectively the formula, right?). You can set a trace flag, 2371, to change this behavior. Here are the details. In general, you have to figure out whether either of these automated updates is adequate or whether you need to manually update statistics. If you do manually update the statistics, you further have to determine whether you can use sampled scans or need a full scan to get your statistics. There are no hard and fast rules for me to provide here. You have to figure out your own situation. Just know that there is no one-size-fits-all solution for statistics updates.
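(For illustration - the trace flag applies to SQL 2008 R2 SP1 and later, and sys.stats is 2005+, so none of this helps on the OP's 2000 box; dbo.BigTable is a placeholder:)

```sql
-- Enable trace flag 2371 globally, so the auto-update threshold
-- shrinks as the table grows instead of staying at ~20%
DBCC TRACEON (2371, -1)

-- Check when each statistics object on a table was last updated
SELECT name, STATS_DATE(object_id, stats_id) AS last_updated
FROM sys.stats
WHERE object_id = OBJECT_ID('dbo.BigTable')
```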

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning
