Stored Procedure Performance

  • I am reviewing a large stored procedure in order to improve its readability and performance. This non-critical procedure is executed at the end of each business day, and its purpose is to update several read-only tables that contain daily totals. I am testing it on the latest build (15.0.4198.2) of SQL Server 2019 running on a fast local machine.

    I do not think it is useful to show the actual 3,000 lines of the procedure, but its main structure is as follows:

    DECLARE cur CURSOR FOR
    SELECT A, B, C, ...
    UNION ALL
    SELECT D, E, F, ...
    UNION ALL
    SELECT G, H, I, ...

    OPEN cur
    FETCH NEXT FROM cur INTO ...

    WHILE @@FETCH_STATUS = 0 BEGIN
        ...
        UPDATE table ...
        IF (condition) UPDATE table ...
        IF (condition) UPDATE table ...
        UPDATE table ...
        DELETE FROM table ...
        IF (condition) UPDATE table ...
        IF (condition) DELETE FROM table ...
        UPDATE table ...
        ...
        FETCH NEXT FROM cur INTO ...
    END

    CLOSE cur
    DEALLOCATE cur

    On a given database, the procedure runs in 24 seconds, which seemed acceptable to me until I started commenting out the UPDATE statements one after another. Just by commenting out one specific UPDATE, the procedure now runs in 12 seconds. Yet the commented statement updates a 3-row table and runs in milliseconds. Logic suggests that running it or not should have no impact on the overall performance.

    Any idea what is going on here? Any suggestion on how I could make my procedure always run in 12 seconds? Does it have something to do with parallelism?

    Regards.

    • This topic was modified 1 year, 12 months ago by  cmartel 20772.
  • Is it critical that it completes in 12 seconds instead of 24? Are you experiencing blocking or other issues due to the longer execution time? If not, and especially if the data is not growing rapidly and you have plenty of other priorities, it may be good enough.

    If you need to improve performance, you need to analyze actual execution plans to determine the actual bottlenecks. Without that, the first suspect is probably the cursor. Most (not all) problems can be handled w/o cursors and generally (not always) perform better w/o looping row by agonizing row (© Jeff Moden).

    The IF updates are another suspect. Sometimes we unnecessarily duplicate logic & execution effort in the IF statements that could/should just be in the WHERE clause of the update.
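
    For example, a pattern like this (table and column names here are hypothetical, just to illustrate):

    -- gating the update with a separate IF duplicates the lookup:
    IF EXISTS (SELECT 1 FROM dbo.DailyTotals WHERE TotalDate = @Date AND IsFinal = 0)
        UPDATE dbo.DailyTotals SET Amount = @Amount WHERE TotalDate = @Date AND IsFinal = 0;

    -- the WHERE clause alone does the same work once:
    UPDATE dbo.DailyTotals
    SET Amount = @Amount
    WHERE TotalDate = @Date AND IsFinal = 0;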

  • Thanks ratbak,

    When I check "Include Actual Execution Plan", I do not get one big plan; I get tons of individual plans for every SELECT, INSERT, UPDATE or DELETE, and they all seem OK (any one of them accounts for less than 1% of the total time).

    This procedure uses a cursor in its main loop because it is complex. As rows are inserted, deleted or updated, local variables are calculated, and these variables have a direct impact on the next series of inserts, deletes and updates.

    If you tell me that any stored procedure using cursors will always generate unpredictable plans then I will stop focusing on performance and solely try to improve readability. Otherwise, I favor 12 seconds over 24 seconds.

  • With cursors, I think it's usually not about unpredictable plans so much as whether it's more efficient to do many, many small transactions vs. a few larger transactions. That said, there are transactions so huge it is better to at least batch them N-thousand at a time instead of N-million all at once.

    How many rows are being looped through in the cursor?

    Being immersed in SQL Server & set-based approaches for years, and not so much in looping/iterative approaches of other programming languages, I generally find well designed set-based statements quite readable. But I certainly understand that's an individual perception/preference. Massive, poorly designed statements can be a nightmare to interpret, but those can also be created in loops.

    I agree that if performance isn't a problem, it may not be worth refactoring.  Maintainability is valuable... especially if  rules/code have to change frequently.

  • Your problem is almost certainly the cursor. To get any reasonable help you will need to post consumable test data and the full procedure. As well as showing the logic the full procedure will also show if other procedures or UDFs are being called.

    Depending on the data and what needs to be done, ways of getting rid of a cursor include:

    1. Windowed functions.
    2. If the result is for an application, then offload the loop to the application.
    3. The Quirky Update. Non-relational and officially unsupported so up to you: https://www.sqlservercentral.com/articles/solving-the-running-total-and-ordinal-rank-problems-rewritten
    4. Recursion - generally only useful if MAXRECURSION <= 20.
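
    To illustrate point 1, a running total that would once have needed a cursor (or the quirky update) can be done with a windowed SUM since SQL Server 2012; the table and columns here are made up:

    SELECT AccountID, TranDate, Amount,
           SUM(Amount) OVER (PARTITION BY AccountID
                             ORDER BY TranDate
                             ROWS UNBOUNDED PRECEDING) AS RunningTotal
    FROM dbo.Transactions;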
  • I agree that you'd most likely be better off using a single UPDATE for specific conditions whenever possible.

    However, if that's not feasible at this time, other possibilities are:

    (1) Add BEGIN TRANSACTION and COMMIT for blocks of rows (such as every 500 rows or 1000 rows or whatever number) to improve logging efficiency, rather than having every UPDATE / DELETE be a separate trans.

    (2) Make sure you have indexes matching the conditions in the UPDATE and DELETE statements.
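
    A rough sketch of (1), committing every @BatchSize rows instead of one transaction per statement (all names here are hypothetical):

    DECLARE @BatchSize int = 1000, @Rows int = 1;
    WHILE @Rows > 0
    BEGIN
        BEGIN TRANSACTION;
        UPDATE TOP (@BatchSize) w
        SET w.Processed = 1
        FROM dbo.WorkQueue w
        WHERE w.Processed = 0;
        SET @Rows = @@ROWCOUNT;
        COMMIT TRANSACTION;
    END;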

     

    SQL DBA,SQL Server MVP(07, 08, 09) A socialist is someone who will give you the shirt off *someone else's* back.

  • Hi Ken,

    For your knowledge, this is the actual statement that triggers the problem:

    UPDATE SC
    SET
    SC.QuickEffectiveDate = SCR.EffectiveDate,
    SC.QuickNominalValueChangeTypeID = SCR.NominalValueChangeTypeID,
    SC.QuickHasRightToVote = SCR.HasRightToVote,
    SC.QuickVotesPerShare = SCR.VotesPerShare,
    SC.QuickNominalValue = SCR.NominalValue,
    SC.QuickAllowAuthorizedQuantity = SCR.AllowAuthorizedQuantity,
    SC.QuickAuthorizedQuantity = SCR.AuthorizedQuantity,
    SC.QuickAuthorizedQuantityChangeTypeID = SCR.AuthorizedQuantityChangeTypeID,
    SC.QuickAllowAuthorizedCapital = SCR.AllowAuthorizedCapital,
    SC.QuickAuthorizedCapital = SCR.AuthorizedCapital
    FROM #SecuritiesClass SC
    OUTER APPLY (
    SELECT TOP 1
    R.EffectiveDate, R.HasRightToVote,
    R.VotesPerShare, R.NominalValue, --NULL AS VotesPerShare, NULL AS NominalValue,
    R.AllowAuthorizedQuantity, R.AuthorizedQuantity, R.AllowAuthorizedCapital, R.AuthorizedCapital,
    R.NominalValueChangeTypeID, R.AuthorizedQuantityChangeTypeID
    FROM #SecuritiesClassRules R
    WHERE R.SecClassID = SC.SecClassID AND R.EffectiveDate <= @strxEffectiveDate
    ORDER BY R.EffectiveDate DESC
    ) SCR

    If this statement is part of the stored procedure, the procedure runs in 24 seconds even though conditions mean the statement is only run once. Removing the statement makes the procedure run in 12 seconds. Just modifying line 17 so NULL values are returned instead of VotesPerShare and NominalValue also makes the procedure run in 12 seconds.

    The procedure was originally written in C# and deployed as a CLR stored procedure. It was rewritten as a standard stored procedure a couple of years ago when CLR support on recent versions of SQL Server became harder and harder. It does a lot of math calculations (I mean a lot), and it would be very hard to change its logic from a main-loop algorithm to large set operations.

    Still, SQL Server is behaving strangely here. I just need to tell you that replacing the aforementioned UPDATE (that updates every row of table #SecuritiesClass) with a cursor that updates every row one after another also makes the procedure run in 12 seconds!

    Regards

  • cmartel 20772 wrote:

    Still, SQL Server is behaving strangely here. I just need to tell you that replacing the aforementioned UPDATE (that updates every row of table #SecuritiesClass) with a cursor that updates every row one after another also makes the procedure run in 12 seconds!

    You really need to look at the execution plan for that query.

    Try indexing the temporary tables before the update:

    CREATE CLUSTERED INDEX IX_#SecuritiesClass_1 ON #SecuritiesClass(SecClassID);

    CREATE CLUSTERED INDEX IX_#SecuritiesClassRules_1
    ON #SecuritiesClassRules(SecClassID, EffectiveDate DESC);
  • Jonathan,

    There is no problem with the UPDATE. All indexes are created. Moreover, tables #SecuritiesClass and #SecuritiesClassRules only contain 3 records. By putting GETDATE() before and after the UPDATE, I can see that it runs in milliseconds.

    The problem is that the mere existence of this particular UPDATE (and maybe that could be said for other UPDATEs) has SQL Server run the procedure in twice the time.

    The procedure is big (2,900 lines) and creates a lot of variables (some being table variables). It cannot be broken into smaller procedures that call each other. It seems that SQL Server does not handle it well, and I would like to discover a "magic" keyword that would make the procedure run fast every time.

    Regards

    • This reply was modified 1 year, 12 months ago by  cmartel 20772.
  • cmartel 20772 wrote:

    The problem is that the mere existence of this particular UPDATE (and maybe that could be said for other UPDATEs) has SQL Server run the procedure in twice the time.

    Have you tried a combination of adding "WITH RECOMPILE" or using OPTION(OPTIMIZE FOR UNKNOWN) ?

    You never know, it's worth trying.
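
    One more thing worth a try: OPTION (RECOMPILE) can be attached to just the suspect statement, so only that statement gets a fresh plan each time. A sketch with made-up table names:

    UPDATE t
    SET t.Total = s.Total
    FROM #Target t
    JOIN #Source s ON s.ID = t.ID
    OPTION (RECOMPILE);  -- plan compiled with the temp tables' actual cardinalities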

  • Jonathan,

    OPTIMIZE FOR UNKNOWN cannot be applied to a whole stored procedure. EXEC myStoredProc WITH RECOMPILE does not make it faster.

    In an earlier post, Scott suggested adding BEGIN TRANSACTION and COMMIT for blocks of rows. As expected, this only makes the procedure run much slower.

    Regards.

  • In the past, table variables would have been another red flag. But SQL Server 2019 did improve row estimates for table variables, so they aren't automatically suspect.

    Temp tables created in calling procedures can be referenced in called procedures, and table variables can be passed to called procedures via TVP, so it's not inherently true that you can't break procedures that use them into smaller procedures. But I still understand the reluctance to risk refactoring a 2900 line stored procedure.
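
    For what it's worth, a minimal sketch of passing rows to a called procedure via a table-valued parameter (the type and procedure names here are invented):

    CREATE TYPE dbo.SecClassList AS TABLE (SecClassID int PRIMARY KEY);
    GO
    CREATE PROCEDURE dbo.ProcessSecClasses
        @Classes dbo.SecClassList READONLY  -- TVPs must be declared READONLY
    AS
        SELECT SecClassID FROM @Classes;
    GO
    DECLARE @c dbo.SecClassList;
    INSERT INTO @c VALUES (1), (2), (3);
    EXEC dbo.ProcessSecClasses @Classes = @c;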

    There is this: Staples Easy Button

    🙂

     

  • Have you looked at the execution plan? Maybe you should paste all the code in this thread.

  • I'd suggest getting performance metrics on individual statements within the query. AND, as has been said several times, the execution plans for the queries. Those two bits of data are going to drive your decisions.
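
    One way (among others) to get those per-statement numbers is the plan cache DMVs; for example, something along these lines:

    SELECT TOP (20)
           SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                     ((CASE qs.statement_end_offset
                           WHEN -1 THEN DATALENGTH(st.text)
                           ELSE qs.statement_end_offset
                       END - qs.statement_start_offset) / 2) + 1) AS statement_text,
           qs.execution_count,
           qs.total_elapsed_time / qs.execution_count AS avg_elapsed_microsec
    FROM sys.dm_exec_query_stats qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
    ORDER BY qs.total_elapsed_time DESC;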

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • What are you trying to solve?

    A non-critical procedure that runs at the end of the day and takes 24 seconds doesn't really sound like a problem.

    I had a long running procedure once (1-2 hours), and as a first step, I wrote a record to a log file after each statement, with a step # and time stamp. Then I could narrow down which statements in the long SP were taking the most time, and needed attention.
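
    A minimal version of that logging pattern (the log table name is made up):

    CREATE TABLE dbo.ProcStepLog (
        StepNo int,
        StepName varchar(100),
        LoggedAt datetime2 DEFAULT SYSDATETIME());

    -- inside the procedure, after each major block:
    INSERT INTO dbo.ProcStepLog (StepNo, StepName) VALUES (1, 'after first UPDATE');
    -- ...

    -- afterwards, the gap between consecutive timestamps shows the slow steps:
    SELECT StepNo, StepName,
           DATEDIFF(ms, LAG(LoggedAt) OVER (ORDER BY StepNo), LoggedAt) AS ElapsedMs
    FROM dbo.ProcStepLog
    ORDER BY StepNo;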

    • This reply was modified 1 year, 12 months ago by  homebrew01.
