• Jeff Moden (1/2/2009)


    battelofhalfwits (1/1/2009)


    Wow...

    Two things

    1) Never use Cursors! Looping through a temp table takes less overhead than a Cursor; especially when you are working on a 24/7/365 server where you really need to watch your resources.

    Not true... if you use Forward Only, Read Only (or sometimes Static), the cursor is just as "efficient" as the WHILE loop... and make no mistake, neither is efficient. If you use a WHILE loop, you've done no better than using a cursor. 😉

    I use a couple of cursors in my code...

    One is for retrieving messages from a service broker queue (multiple messages per conversation handle, retrieved into a temp table and then the cursor processes each retrieved message one by one as they have to be - each message will take a different process path depending on what it contains):

    -- Declare cursor
    DECLARE MessageCursor CURSOR LOCAL FORWARD_ONLY READ_ONLY
    FOR
    SELECT ConversationHandle
        ,MessageTypeName
        ,MessageBody
    FROM @ReceiveTable
    ORDER BY QueuingOrder;

    -- and a little later in the code...
    WAITFOR(
        RECEIVE [queuing_order]
            ,[conversation_handle]
            ,[message_type_name]
            ,CAST([message_body] AS NVARCHAR(MAX))
        FROM [EventDetailsTargetQueue]
        INTO @ReceiveTable
    ), TIMEOUT 5000;

    -- and process using the cursor --
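    The processing itself follows the standard OPEN/FETCH pattern; a minimal sketch, assuming illustrative variable names and types (the actual branching logic per message type is omitted):

    -- Hypothetical fetch loop for the cursor declared above
    DECLARE @ConversationHandle UNIQUEIDENTIFIER
        ,@MessageTypeName SYSNAME
        ,@MessageBody NVARCHAR(MAX);

    OPEN MessageCursor;
    FETCH NEXT FROM MessageCursor
        INTO @ConversationHandle, @MessageTypeName, @MessageBody;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- branch on @MessageTypeName / contents of @MessageBody here,
        -- routing each message down its own processing path
        FETCH NEXT FROM MessageCursor
            INTO @ConversationHandle, @MessageTypeName, @MessageBody;
    END;

    CLOSE MessageCursor;
    DEALLOCATE MessageCursor;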

    The other is in a utility procedure that returns record counts and a health check (are there sufficient records?) for all the tables involved in a loading process. In this case a table holds the names and minimum acceptable counts for all the loading tables. I use a very simple cursor (SELECT TableName, MinimumRecordCount FROM HealthCheckTables) to retrieve these, and for each one I construct a SQL statement to retrieve the count from the table and execute it, storing the results in a temp table. That temp table is then used to report the counts, and to throw an error where insufficient records are found in any of the tables (the procedure is used in the load process to halt processing before the live data is affected when we haven't loaded a healthy amount of data).
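    A sketch of that count-and-check pattern, assuming the HealthCheckTables table and columns from the description above; the temp table layout, variable names, and error text are illustrative only:

    -- Hypothetical sketch of the count/health-check loop
    DECLARE @TableName SYSNAME, @MinCount INT, @sql NVARCHAR(MAX);

    CREATE TABLE #Counts (TableName SYSNAME, RecordCount INT, MinimumRecordCount INT);

    DECLARE HealthCursor CURSOR LOCAL FORWARD_ONLY READ_ONLY FOR
        SELECT TableName, MinimumRecordCount FROM HealthCheckTables;

    OPEN HealthCursor;
    FETCH NEXT FROM HealthCursor INTO @TableName, @MinCount;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- build and run a count statement for this table
        SET @sql = N'SELECT ' + QUOTENAME(@TableName, '''')
                 + N', COUNT(*), ' + CAST(@MinCount AS NVARCHAR(10))
                 + N' FROM ' + QUOTENAME(@TableName) + N';';
        INSERT INTO #Counts (TableName, RecordCount, MinimumRecordCount)
            EXEC (@sql);
        FETCH NEXT FROM HealthCursor INTO @TableName, @MinCount;
    END;
    CLOSE HealthCursor;
    DEALLOCATE HealthCursor;

    -- report the counts, and halt the load if any table is short
    SELECT TableName, RecordCount, MinimumRecordCount FROM #Counts;
    IF EXISTS (SELECT 1 FROM #Counts WHERE RecordCount < MinimumRecordCount)
        RAISERROR ('Insufficient records in one or more loading tables.', 16, 1);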

    I do loop through a temp table in another component of my loading process to achieve almost the same thing (disabling/enabling all foreign keys on those same loading tables), driven by the same source table and again constructing a SQL statement for each key and executing it.

    i.e. this gets all the related foreign keys:

    SELECT OBJECT_NAME(s.constid) AS ConstraintName
        ,'[' + sch.[name] + '].[' + OBJECT_NAME(s.fkeyid) + ']' AS TableName
    INTO #Const_Table
    FROM sysforeignkeys s
    INNER JOIN HealthCheckTables h
        ON h.TableName = OBJECT_NAME(s.rkeyid)
    INNER JOIN sys.tables t
        ON t.[name] = OBJECT_NAME(s.fkeyid)
    INNER JOIN sys.schemas sch
        ON sch.[schema_id] = t.[schema_id]

    ...and for each one I have to construct an "ALTER TABLE abc {NOCHECK|WITH CHECK CHECK} CONSTRAINT xyz" statement and execute it.
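    That disable/enable step can be sketched as a WHILE loop over the temp table; the @Disable flag and variable names here are illustrative, and note that TableName already carries its own [schema].[table] brackets from the query above:

    -- Hypothetical sketch: loop over #Const_Table and toggle each key
    DECLARE @ConstraintName SYSNAME, @TableName NVARCHAR(600), @sql NVARCHAR(MAX);
    DECLARE @Disable BIT = 1; -- 1 = disable keys, 0 = re-enable with check

    WHILE EXISTS (SELECT 1 FROM #Const_Table)
    BEGIN
        SELECT TOP (1) @ConstraintName = ConstraintName, @TableName = TableName
        FROM #Const_Table;

        SET @sql = N'ALTER TABLE ' + @TableName
                 + CASE WHEN @Disable = 1
                        THEN N' NOCHECK CONSTRAINT '
                        ELSE N' WITH CHECK CHECK CONSTRAINT ' END
                 + QUOTENAME(@ConstraintName) + N';';
        EXEC (@sql);

        DELETE FROM #Const_Table
        WHERE ConstraintName = @ConstraintName AND TableName = @TableName;
    END;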

    These last two could each use either a cursor or a loop, but as far as I can tell they essentially have to be RBAR (row-by-agonizing-row) processes, and similarly for the service broker processing.

    Is there a way to achieve any of these things in a set-based fashion? I've not found one... though I'm by no means an expert on these matters. In the latter cases the number of tables involved is small enough (around 20 or so to count, or to switch keys on or off for) that the RBAR makes no odds, but the principle matters, and if there's a better way I'd like to know before I'm faced with a much bigger similar situation!

    Dave.