SQLServerCentral is supported by Redgate
Row-By-Row Processing Without Cursor


Jim Nesbitt
SSC Journeyman

Group: General Forum Members
Points: 84 Visits: 229
I don't think the string processing is a problem, but the query would either a) not work properly or b) fall over due to the 8K varchar limit (as the author in fact mentioned), which works out to only 1.6 characters per inserted row, so the dynamic SQL would be incorrect.

So the advantage of using cursors is that they would work.

Cursors get a bad press for some reason, but they're much nicer than setting ROWCOUNT to 1 and doing some arcane WHILE loop just to get one row back at a time, which I had to resort to in SQL Server 4.2 and earlier. I think a lot of the bad press is due to cursors being used where they shouldn't be; I saw one example where someone had used a cursor to fetch some variables instead of a singleton SELECT.

Cursors are essential in circumstances where row by row processing is necessary. I often have to use cursors on the inserted and deleted tables to update higher level totals.

This avoids the use of multiple subquery statements to sum quantities up to the higher levels, since Transact-SQL only returns a single value from a subquery (a feature like Oracle's ability to return several variables would be useful), and the update statements got too complex for the compiler, besides being very long-winded.
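For contrast, the higher-level totals update can sometimes be done set-based by joining to a grouped derived table, which sidesteps the one-value-per-subquery limitation. This is only a hedged sketch: OrderDetail, OrderTotals, and every column name here are assumptions for illustration, not the poster's actual schema.

```sql
-- Hedged sketch: roll inserted detail rows up to a totals table in one
-- statement. All table and column names are hypothetical.
CREATE TRIGGER trg_OrderDetail_RollUp ON OrderDetail AFTER INSERT AS
UPDATE t
SET  t.TotalQty    = t.TotalQty    + i.SumQty,
     t.TotalAmount = t.TotalAmount + i.SumAmount
FROM OrderTotals AS t
INNER JOIN (SELECT OrderID,
                   SUM(Quantity)  AS SumQty,   -- several aggregates at once,
                   SUM(LineTotal) AS SumAmount -- unlike a scalar subquery
            FROM Inserted
            GROUP BY OrderID) AS i
        ON i.OrderID = t.OrderID
GO
```

The grouped derived table returns several aggregates per order in one pass, which is exactly the multi-value capability the scalar-subquery approach lacks.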

Cursors are also useful where different updates are required depending on the row data or different calculations may have to be used before updating.

I'm not an expert, but cursors on the inserted and deleted tables should perform well since most of the trigger tables should be in memory.

Table variables are an option, but the while loop on a table variable seems pretty similar to using a cursor and there's the overhead to create and populate the table variable, so I'm not sure that they offer a significant performance advantage. From a maintenance/development point of view, it's easier to skip the table creation by using the cursor definition as there's less chance of error.

I'll keep on using them and liking it.
Jeff Moden
SSC-Forever

Group: General Forum Members
Points: 45076 Visits: 39909

Hear, hear, Kevin! Any process other than single-record GUI support that uses RBAR (pronounced "ree-bar", a "Modenism" for "Row By Agonizing Row") either has an incorrect database design or someone simply doesn't understand the methods used to make a process set-based. And if you write things to handle more than one row at a time, you'll be amazed at how fast just one row will process, and at the kind of "concurrency" you can get out of your GUI procs as well.



--Jeff Moden

RBAR is pronounced ree-bar and is a Modenism for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
Although they tell us that they want it real bad, our primary goal is to ensure that we don't actually give it to them that way.
Although change is inevitable, change for the better is not.
Just because you can do something in PowerShell, doesn't mean you should.

Helpful Links:
How to post code problems
How to post performance problems
Forum FAQs
Tim Chapman-218780
SSC-Enthusiastic

Group: General Forum Members
Points: 125 Visits: 115

Actually, in 2005, the solution would probably work if a VARCHAR(MAX) data type was used, which supports up to 2GB of data.
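A hedged sketch of the difference, with made-up SourceTable/TargetTable names: the same concatenation pattern that silently truncates at 8,000 bytes with plain VARCHAR keeps growing with VARCHAR(MAX).

```sql
-- Hedged sketch; SourceTable, TargetTable, and Name are hypothetical names.
DECLARE @sql VARCHAR(MAX)   -- VARCHAR(8000) here would truncate large batches
SET @sql = ''

SELECT @sql = @sql
            + 'INSERT INTO TargetTable (Name) VALUES ('''
            + REPLACE(Name, '''', '''''')   -- escape embedded single quotes
            + ''')' + CHAR(13)
FROM SourceTable

EXEC (@sql)
```

Worth noting that the `SELECT @var = @var + ...` concatenation trick is widely used but its row order is not guaranteed by the engine, so it suits cases like this where order doesn't matter.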


T Y MA
SSC Rookie

Group: General Forum Members
Points: 35 Visits: 365

I have a speed example comparing a cursor and a set-based SELECT:

In a previous data-migration project, the cursor version took more than TWO days; with a set-based SELECT, it took just 30 minutes. There were about 200K total customer records.


Jeffery Williams
SSC Veteran

Group: General Forum Members
Points: 271 Visits: 913

I would need to test this in order to verify (not sure if it would work).

Perhaps having a timestamp column on the table? When a SELECT is initiated this would change... then again, I believe a timestamp only changes when data is changed. With that in mind, you could incorporate a change to a field (a meaningless field used only for this purpose) which would fire the trigger. You could also have the trigger fire only for a change to THIS field, which means you could have a custom process fire for a query that would not be included in a record update/delete/add.
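The "fire only on this field" idea can be sketched with T-SQL's UPDATE() test inside the trigger. The table and the TouchFlag column here are assumptions, not from the post:

```sql
-- Hedged sketch: TouchFlag is a hypothetical 'meaningless' signal column.
CREATE TRIGGER trg_Customers_Touch ON Customers AFTER UPDATE AS
IF UPDATE(TouchFlag)
BEGIN
    -- custom process for the signal-only update goes here
    PRINT 'TouchFlag was touched'
END
ELSE
BEGIN
    -- normal handling for real data changes goes here
    PRINT 'regular data change'
END
GO
```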

With this in mind, after reading the article I was going to post (and still will) about the concept of a TABLE VARIABLE. I use these often when a set-based operation is simply not possible.

Upon entering this forum I see the first post is regarding the very concept I planned on writing about. I believe I would change the logic just a bit, however, as the poster only mentions the identity column. I don't know that this would work, as the complete record is not present until the trigger has completed (committed).

That said, I would implement the TABLE VARIABLE as follows (keeping the logic in the trigger rather than having the trigger call an external procedure; this eliminates the dynamic portion as well as the parsing of XML, comma strings, etc.).

Let's say we are inserting simple customer data (multiple records / bulk insert). Another assumption is that we are not inserting more than something like 5,000 records at a time.

DECLARE @CustomerID varchar(20)
    ,@CustFirstName varchar(30)
    ,@CustLastName varchar(40)
    ,@CustPhone1 varchar(15)
    ,@RecordCount int

DECLARE @ProcessTable TABLE
(
    CustomerID varchar(20)
    ,CustFirstName varchar(30)
    ,CustLastName varchar(40)
    ,CustPhone1 varchar(15)
    ,Processed bit DEFAULT 0
)

INSERT INTO @ProcessTable
    (CustomerID, CustFirstName, CustLastName, CustPhone1, Processed)
SELECT CustomerID, CustFirstName, CustLastName, CustPhone1, 0
FROM Inserted

SELECT @RecordCount = COUNT(*)
FROM @ProcessTable

WHILE @RecordCount > 0
BEGIN
    SELECT TOP 1 @CustomerID = CustomerID
        ,@CustFirstName = CustFirstName
        ,@CustLastName = CustLastName
        ,@CustPhone1 = CustPhone1
    FROM @ProcessTable
    WHERE Processed = 0

    -- Perform your logic here, such as duplicating the record somewhere,
    -- checking the data, etc.

    UPDATE @ProcessTable
    SET Processed = 1
    WHERE CustomerID = @CustomerID

    SET @RecordCount = @RecordCount - 1
END

This is a very basic example, but I think you see the point. Another consideration is that there could be more than one update to a given CustomerID within this bulk insert. With that in mind, I would add a field to my table variable; let's call it (InsertedID int PK). I would then update my Processed flag based on this identity column so that multiple records affecting the same CustomerID could be handled.
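That identity-keyed variant can be sketched as follows; this is still a hedged sketch, trimmed to the relevant columns, with per-row logic left as a placeholder comment:

```sql
-- Hedged sketch: keying the Processed update on an identity column so that
-- multiple rows for the same CustomerID are each handled exactly once.
DECLARE @InsertedID int

DECLARE @ProcessTable TABLE
(
    InsertedID int IDENTITY(1,1) PRIMARY KEY
    ,CustomerID varchar(20)
    ,Processed bit DEFAULT 0
)

INSERT INTO @ProcessTable (CustomerID, Processed)
SELECT CustomerID, 0
FROM Inserted

WHILE EXISTS (SELECT 1 FROM @ProcessTable WHERE Processed = 0)
BEGIN
    SELECT TOP 1 @InsertedID = InsertedID
    FROM @ProcessTable
    WHERE Processed = 0
    ORDER BY InsertedID          -- process in arrival order

    -- per-row logic goes here

    UPDATE @ProcessTable
    SET Processed = 1
    WHERE InsertedID = @InsertedID   -- keyed on the identity, not CustomerID
END
```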

Any thoughts?

Jeff





Regards,

Jeffery Williams
http://www.linkedin.com/in/jwilliamsoh
kevin mann
SSC Veteran

Group: General Forum Members
Points: 226 Visits: 119
Hey everyone, I haven't posted in a while, so I thought I'd chime in on a good one like this. I've used a lot of these suggestions in the past with varying results. It's really hard to design business rules with triggers when there are so many inputs and outputs; that's more and more apparent to me every day in the health industry.

I got an idea reading through this discussion... I want to see what y'all think about it:

CREATE TRIGGER tr1 ON Books AFTER INSERT AS

UPDATE Books
SET batchUID = NEWID() --batchUID being a uniqueidentifier field
FROM Books
INNER JOIN Inserted
ON Inserted.bookCode = Books.bookCode
GO

CREATE TRIGGER tr2 ON Books AFTER INSERT AS
DECLARE @batchUID uniqueidentifier
SET @batchUID = (SELECT TOP 1 Books.batchUID
                 FROM Books
                 INNER JOIN Inserted ON Inserted.bookCode = Books.bookCode)
                 -- read the column, not the variable; assumes tr1 fired first
EXEC usp_Process @batchUID=@batchUID
GO

Of course this only works if a) you can alter the table schema and b) you can alter stored procedures to handle optional parameters. The only other exceptions I can see are if the SP uses a non-batch API or you need to generate some type of customized ID field.
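For what it's worth, a hedged sketch of what a batch-oriented usp_Process might look like on the receiving end; the BookAudit table is an assumption for illustration, not something from the post:

```sql
-- Hedged sketch: process the whole batch set-based via the shared batchUID.
-- BookAudit and its columns are hypothetical.
CREATE PROCEDURE usp_Process
    @batchUID uniqueidentifier = NULL   -- optional parameter, per the post
AS
IF @batchUID IS NULL
    RETURN

INSERT INTO BookAudit (bookCode, batchUID, AuditDate)
SELECT bookCode, batchUID, GETDATE()
FROM Books
WHERE batchUID = @batchUID
GO
```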
Tim Chapman-218780
SSC-Enthusiastic

Group: General Forum Members
Points: 125 Visits: 115
From just looking at it, it seems a bit tricky. Are you planning on recursive triggers? Also, I'm not sure what happens in usp_Process, but it looks like you are only getting the first batchUID that is inserted and calling the proc with that (what if more than one is inserted?). I may be overlooking something because I only gave it an overview, but those are my thoughts. Does it work like you expect?
kevin mann
SSC Veteran

Group: General Forum Members
Points: 226 Visits: 119
Correct... no recursive triggers (which is the default for SQL Server). The first trigger assigns the same GUID to every record in the insert, so the author of usp_Process has to code based on batchUID instead of bookCode.
Tim Chapman-218780
SSC-Enthusiastic

Group: General Forum Members
Points: 125 Visits: 115
What if more than one record is inserted?
kevin mann
SSC Veteran

Group: General Forum Members
Points: 226 Visits: 119
--This code affects all the records inserted into books with the same uniqueidentifier

CREATE TRIGGER tr1 ON Books AFTER INSERT AS

UPDATE Books
SET batchUID = NEWID() --batchUID being a uniqueidentifier field
FROM Books
INNER JOIN Inserted
ON Inserted.bookCode = Books.bookCode
GO

