Are cursors bad for iterating through the result set returned by a function?

  • Another problem with doing this in a set-based fashion is that you don't have normalized tables.

    UPDATE [Current]
    SET /* other columns from other variables, */
        ptFlagIndicator1 = @ptFlagIndicator1Var,
        ptFlagIndicator2 = @ptFlagIndicator2Var /* , etc. */
    WHERE nurseStation = @nurseStationVar
      AND roomNo = @roomNoVar
      AND bedNo = @bedNoVar;

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • pdonley (4/8/2011)


    Hi all,

    I know that cursors are evil, so I try not to use them. I've read that they are evil because they lock the table that you are iterating through, so no other processes can use that table while your cursor is running.

    <snip>

    As others have indicated, using iteration instead of a set-based solution is one reason why cursors have a bad rep. The other reason, and more of an issue to me, is that they are not memory safe. Granted, if you code well and clean up your cursors (deallocate) you probably won't run into an issue. But, it's real fun to watch a badly written cursor take out a server. 🙂

  • Wayne, sets are not passed into this table. Only single INSERT statements populate it, and that is not up to me. You must realize that I inherited this portion of the system and did not design it. I can't redesign the way it works at the drop of a hat. I merely need a solution to my problem, not a lecture on why my system is terrible.

  • pdonley (4/12/2011)


    Wayne, sets are not passed into this table. Only single INSERT statements populate it, and that is not up to me. You must realize that I inherited this portion of the system and did not design it. I can't redesign the way it works at the drop of a hat. I merely need a solution to my problem, not a lecture on why my system is terrible.

    He isn't trying to lecture you on your system. If the whole trigger is based on the assumption that there is only 1 record being inserted, it is destined to have MAJOR problems at some point in the future. Yes, currently the system only inserts 1 record at a time. This will continue to work until some day something goes awry. Then, in the course of trying to get the records back into the table, somebody does an "INSERT INTO table SELECT ..." type of query, and the trigger, which is designed to handle 1 and only 1 record, breaks. This is the kind of thing that lots of us on this board have dealt with time and time again.
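The failure mode described above can be sketched outside of SQL. This is an illustrative Python sketch (the function names and row values are hypothetical): a handler written as if the inserted set always holds exactly one row silently drops the rest of a multi-row batch, while a set-based handler does not.

```python
def single_row_trigger(inserted):
    # Mimics a trigger body written as if `inserted` always has one row.
    row = inserted[0]  # only the first row is ever examined
    return ["processed " + row]

def set_based_trigger(inserted):
    # Handles however many rows the statement actually inserted.
    return ["processed " + row for row in inserted]

batch = ["r1", "r2", "r3"]  # one INSERT ... SELECT that inserts 3 rows
print(single_row_trigger(batch))  # rows r2 and r3 are silently lost
print(set_based_trigger(batch))   # all 3 rows are handled
```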

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Modens splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/

  • That can never happen. The interface that inserts records into this table receives data from a mainframe through an interface system. Even if we had to do a major resend of data, it would come in as individual INSERT statements, one at a time, until the backlog of records has been processed. That's not my design - it's what I have to work with. Now we're getting really off topic...all I need is an alternative to a cursor. I have confirmed that the cursor is the only part of this trigger that is slow, because when the cursor part of the trigger is commented out, the trigger executes flawlessly and very quickly. With the cursor in, it works for a while and eventually slows down.

    Anyone here who'd like to actually help?

  • pdonley (4/12/2011)


    That can never happen. The interface that inserts records into this table receives data from a mainframe through an interface system. Even if we had to do a major resend of data, it would come in as individual INSERT statements, one at a time, until the backlog of records has been processed. That's not my design - it's what I have to work with. Now we're getting really off topic...all I need is an alternative to a cursor. I have confirmed that the cursor is the only part of this trigger that is slow, because when the cursor part of the trigger is commented out, the trigger executes flawlessly and very quickly. With the cursor in, it works for a while and eventually slows down.

    Anyone here who'd like to actually help?

    Of course the interface won't allow it. This is why many of us have been bitten by this in the past. We all understand that it isn't your design. Nobody was being critical of you. We are trying to help you solve the issue and make it a solid fix. Trust me, if all we wanted to do was berate other people, there are far easier outlets for that. It sounds like you don't want to explore a full solution and that all you want is a band-aid.

    With that in mind, it is nearly impossible for us to tell what is going on outside of setting these variables within your cursor. We have no knowledge of your system or requirements. There have been several people already offering to help. Given what we can see, it looks like a cursor may be the only option you have because of the non-normalized data. It is going to add a level of complexity that is very challenging (as you obviously already know). If you are willing to provide some details about the rest of the cursor, there are plenty of people on these boards who are willing and able to help. We all understand that you can't always post the logic exactly as it is. Pseudo-code is fine to help us understand the whole scope of what you are trying to accomplish, but without it we are unable to help you.


  • Never mind. I'll do it myself, as usual. If you read the whole post you'll see I provided everything that is relevant. This is the second time I've posted here with no results. I may as well just cancel my account. I figured it out myself last time, too.

    You have to understand that in the business world, band-aids are what management wants. They don't want me to re-write their system. They want me to spend fourteen seconds putting a band-aid on it until it breaks again five years down the line, at which point they will ask me to fix it again. I've lost the ability to care about good programming. If I was in college, I'd whole-heartedly agree with you for a more complete solution. But now, it is a luxury I cannot afford.

  • pdonley (4/12/2011)


    You have to understand that in the business world, band-aids are what management wants. They don't want me to re-write their system.

    Management expects you to use your professional expertise to solve problems. Editing 20-50 lines of code so that the trigger follows standard industry practices and handles SET-based data, regardless of the interface used to access the table, is your responsibility as a professional DBA; don't let incomplete, misguided, and uninformed management directives keep you from doing your job the right way.

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. with your description of the problem, we can provide a tested, verifiable solution to your question! asking the question the right way gets you a tested answer the fastest way possible!

  • Yes, I understand that is what business wants. School was so long ago I can scarcely remember it. There is no need to be hostile. We are all here offering help voluntarily. It is too bad that you don't appreciate the help. Nobody was asking you to rewrite your system. We were asking for enough details to help you fix YOUR problem. If you want to pay my consulting fees, I will happily help you create the band-aid you so desperately need. My free help only goes so far.


  • If all you are trying to do is take an input field (storing multiple records in a delimited format) and extract the individual strings, try the XML route:

    DECLARE @str VARCHAR(100);
    SET @str = '0001,0002,0003,0004,0005';

    DECLARE @x XML;
    SET @x = '<i>' + REPLACE(@str, ',', '</i><i>') + '</i>';

    SELECT x.i.value('.', 'VARCHAR(4)') AS Item
    FROM @x.nodes('//i') x(i);

    This is from an article on this site... I searched and found it this morning as a colleague asked me a similar question.
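For anyone curious about the mechanics, the same trick can be sketched in Python (an illustration only; the function name and the wrapping root element are my own additions): wrap each element in tags, then let an XML parser enumerate the items, just as `nodes('//i')` does above.

```python
import xml.etree.ElementTree as ET

def split_via_xml(csv, delim=","):
    # Mirror of the T-SQL trick: 'a,b,c' -> '<r><i>a</i><i>b</i><i>c</i></r>'
    # (a root element is added here because ElementTree requires one),
    # then the parser pulls the <i> nodes back out in order.
    wrapped = "<r><i>" + csv.replace(delim, "</i><i>") + "</i></r>"
    return [node.text for node in ET.fromstring(wrapped).iter("i")]

print(split_via_xml("0001,0002,0003,0004,0005"))
# -> ['0001', '0002', '0003', '0004', '0005']
```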

  • Ok, making sense of this as best as I can, it seems that you are processing a comma-separated string into a list (from the table function) and then into separate variables (no more than 6, specifically).

    This sounds like a table row to me.

    Assumptions:

    1) ptFlags = "Flag1,Flag2,Flag3,Flag4,Flag5,Flag6"

    2) Output = single table row comprising six fields (f1,f2,f3,f4,f5,f6)

    My solution to this:

    SELECT
        MAX(f1) AS f1,
        MAX(f2) AS f2,
        MAX(f3) AS f3,
        MAX(f4) AS f4,
        MAX(f5) AS f5,
        MAX(f6) AS f6
    FROM
        (SELECT
            CASE Rownum WHEN 1 THEN FlagIndicatorVar END AS f1,
            CASE Rownum WHEN 2 THEN FlagIndicatorVar END AS f2,
            CASE Rownum WHEN 3 THEN FlagIndicatorVar END AS f3,
            CASE Rownum WHEN 4 THEN FlagIndicatorVar END AS f4,
            CASE Rownum WHEN 5 THEN FlagIndicatorVar END AS f5,
            CASE Rownum WHEN 6 THEN FlagIndicatorVar END AS f6
        FROM
            (SELECT FlagIndicatorVar,
                    ROW_NUMBER() OVER (ORDER BY FlagIndicatorVar) AS Rownum
             FROM splitCSV((SELECT ptFlags FROM Inserted), ',')
            ) flags
        ) AllFlags;

    i.e.

    Step 1: Split flags into rows using splitCSV

    Step 2: Combine rows into 6 column, 6 row result set, each with a maximum of one valid field

    Step 3: Aggregate 6 row result set into 6 column, 1 row table.

    This really isn't a nice way of processing data, but is sometimes necessary.

    Unfortunately, in this case, I have not been provided with a specific input or output format, and so can be of little further help!!
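    The three steps above can also be sketched in Python (illustrative only: `splitCSV`, the six-column limit, and the ordering implied by `ROW_NUMBER() OVER (ORDER BY FlagIndicatorVar)` are taken from the assumptions stated earlier):

```python
def flags_to_row(pt_flags, width=6, delim=","):
    # Step 1: split the delimited string into items (what splitCSV does),
    # ordered the way ROW_NUMBER() OVER (ORDER BY ...) would number them.
    items = sorted(s for s in pt_flags.split(delim) if s) if pt_flags else []
    # Steps 2-3: pivot into a single fixed-width row, padding the unused
    # positions with None (SQL NULL), as the MAX(CASE ...) aggregate does.
    items = items[:width]
    return tuple(items + [None] * (width - len(items)))

print(flags_to_row("Flag1,Flag2,Flag3"))
# -> ('Flag1', 'Flag2', 'Flag3', None, None, None)
```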

    Cheers ... Ian
