
Cursor in Stored Procedure
Posted Monday, April 1, 2013 11:01 AM
Erin Ramsay
Mr or Mrs. 500

Group: General Forum Members
Last Login: Thursday, August 28, 2014 2:20 PM
Points: 541, Visits: 1,047
That's a nice bit of code there, Sean.

While I've used dynamic SQL to either a) build the list of commands I want to run, or b) run dynamic commands, I don't usually combine a) and b) to run a dynamic LIST of commands. Not a bad idea; it may be a little harder to debug than doing it in single-command steps, but it's definitely something to keep in mind.
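For anyone reading along, a minimal sketch of combining a) and b) might look like this (UPDATE STATISTICS is just a placeholder command, and the variable name is arbitrary):

DECLARE @cmds NVARCHAR(MAX) = N'';

-- a) build the list of commands from metadata...
SELECT @cmds += N'UPDATE STATISTICS '
              + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
              + N';' + NCHAR(13) + NCHAR(10)
FROM sys.tables AS t;

-- b) ...then run the whole generated list dynamically
--PRINT @cmds;                  -- review the batch before letting it loose
EXEC sys.sp_executesql @cmds;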



Post #1437534
Posted Monday, April 1, 2013 11:10 AM
Erin Ramsay
Mr or Mrs. 500

Group: General Forum Members
Last Login: Thursday, August 28, 2014 2:20 PM
Points: 541, Visits: 1,047
I concede that this task can be done with a set-based command that generates a list of commands... lol... now let's consider for a moment.

Now consider that your task may take a LARGE block of dynamic SQL (say about 6,000 characters) that you're going to execute against all 300 tables. Do you still want to generate a character variable to hold this massive statement, so that if (when) it breaks you'll get "Error on line 1 of..."? Or, hey, put it in a procedure and THEN put it in your statement to save space... but then you still get "Error on line 1 of..." when it breaks.

Generally, I use a cursor because it's a non-production issue; it's developmental, and I'm using it only to save myself typing time. It's likely a one-time event, but I'd rather have ONE particular command fail that I can debug than have the whole block fail and then have to figure out where it's falling down.

The point I was trying to get across was that there ARE times and places where a cursor can be useful... you just need to be careful about when and where, you know?
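For anyone following along, a rough sketch of that one-command-at-a-time pattern could look something like this (the UPDATE STATISTICS command is just a stand-in for whatever per-table work you're generating):

DECLARE @cmd NVARCHAR(MAX);

DECLARE cmd_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT N'UPDATE STATISTICS '
         + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name) + N';'
    FROM sys.tables AS t;

OPEN cmd_cur;
FETCH NEXT FROM cmd_cur INTO @cmd;

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC sys.sp_executesql @cmd;
    END TRY
    BEGIN CATCH
        -- one command fails, the rest keep going, and you know exactly which one broke
        PRINT 'Failed: ' + @cmd + ' -- ' + ERROR_MESSAGE();
    END CATCH;

    FETCH NEXT FROM cmd_cur INTO @cmd;
END;

CLOSE cmd_cur;
DEALLOCATE cmd_cur;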
Post #1437540
Posted Monday, April 1, 2013 11:15 AM


Lynn Pettis
SSC-Insane

Group: General Forum Members
Last Login: Today @ 7:58 AM
Points: 23,299, Visits: 32,047
Erin Ramsay (4/1/2013)
I concede that this task can be done with a set-based command that generates a list of commands... lol... now let's consider for a moment.

Now consider that your task may take a LARGE block of dynamic SQL (say about 6,000 characters) that you're going to execute against all 300 tables. Do you still want to generate a character variable to hold this massive statement, so that if (when) it breaks you'll get "Error on line 1 of..."? Or, hey, put it in a procedure and THEN put it in your statement to save space... but then you still get "Error on line 1 of..." when it breaks.

Generally, I use a cursor because it's a non-production issue; it's developmental, and I'm using it only to save myself typing time. It's likely a one-time event, but I'd rather have ONE particular command fail that I can debug than have the whole block fail and then have to figure out where it's falling down.

The point I was trying to get across was that there ARE times and places where a cursor can be useful... you just need to be careful about when and where, you know?


We aren't denying that there may be times when cursors are appropriate. I admit to using them for maintenance or one-time code. And yes, I have created dynamic code that actually exceeds 8000 bytes and does multiple commands.



Lynn Pettis

For better assistance in answering your questions, click here
For tips to get better help with Performance Problems, click here
For Running Totals and its variations, click here or when working with partitioned tables
For more about Tally Tables, click here
For more about Cross Tabs and Pivots, click here and here
Managing Transaction Logs

SQL Musings from the Desert Fountain Valley SQL (My Mirror Blog)
Post #1437541
Posted Monday, April 1, 2013 2:34 PM


opc.three
SSCertifiable

Group: General Forum Members
Last Login: Today @ 7:34 AM
Points: 7,098, Visits: 12,606
Lynn Pettis (4/1/2013)
And yes, I have created dynamic code that actually exceeds 8000 bytes and does multiple commands.

Mind if I ask what made you break them into multiple commands?


__________________________________________________________________________________________________
There are no special teachers of virtue, because virtue is taught by the whole community. --Plato
Post #1437614
Posted Monday, April 1, 2013 2:37 PM


opc.three
SSCertifiable

Group: General Forum Members
Last Login: Today @ 7:34 AM
Points: 7,098, Visits: 12,606
I use a slightly different method to avoid some of the entitization problems FOR XML brings with it. Technically it is an undocumented method as far as I know, but I have not had trouble with it since SQL 2000, and I know of some other techniques that leverage it without trouble too:

DECLARE @SQLCmd NVARCHAR(MAX) = N'';

-- instead of using FOR XML, just append the conjured string from each row to the same variable
SELECT @SQLCmd += 'EXEC sys.sp_rename ' + --
                  '@objname = N''' + SCHEMA_NAME(tab.schema_id) + '.' + tab.name + ''', ' + --
                  '@newname = N''' + tab.name + '_' + CONVERT(VARCHAR(10), GETDATE(), 112) + ''', ' + --
                  '@objtype = N''OBJECT'';' + -- OBJECT
                  NCHAR(13) + NCHAR(10)
FROM sys.tables tab;

-- show the entire variable contents as an XML document...no truncation like with PRINT
SELECT @SQLCmd AS [processing-instruction(query)]
FOR XML PATH(''), TYPE;

-- when ready, uncomment
--EXEC sys.sp_executesql @SQLCmd;



__________________________________________________________________________________________________
There are no special teachers of virtue, because virtue is taught by the whole community. --Plato
Post #1437616
Posted Monday, April 1, 2013 3:19 PM


Lynn Pettis
SSC-Insane

Group: General Forum Members
Last Login: Today @ 7:58 AM
Points: 23,299, Visits: 32,047
opc.three (4/1/2013)
I use a slightly different method to avoid some of the entitization problems FOR XML brings with it. Technically it is an undocumented method as far as I know, but I have not had trouble with it since SQL 2000, and I know of some other techniques that leverage it without trouble too:

DECLARE @SQLCmd NVARCHAR(MAX) = N'';

-- instead of using FOR XML, just append the conjured string from each row to the same variable
SELECT @SQLCmd += 'EXEC sys.sp_rename ' + --
                  '@objname = N''' + SCHEMA_NAME(tab.schema_id) + '.' + tab.name + ''', ' + --
                  '@newname = N''' + tab.name + '_' + CONVERT(VARCHAR(10), GETDATE(), 112) + ''', ' + --
                  '@objtype = N''OBJECT'';' + -- OBJECT
                  NCHAR(13) + NCHAR(10)
FROM sys.tables tab;

-- show the entire variable contents as an XML document...no truncation like with PRINT
SELECT @SQLCmd AS [processing-instruction(query)]
FOR XML PATH(''), TYPE;

-- when ready, uncomment
--EXEC sys.sp_executesql @SQLCmd;



Yeah, but I think I'll stick with the FOR XML version. The following was done against a database with 5,147 tables in sys.tables:



---------- opc.three version ----------
Beginning execution loop
CPU(ms): 73015 Logical Reads: 45463611 Elapsed(ms): 73495 Reads: 3790 Writes: 481191
CPU(ms): 72468 Logical Reads: 45460066 Elapsed(ms): 73294 Reads: 3983 Writes: 483049
CPU(ms): 73015 Logical Reads: 45458216 Elapsed(ms): 73864 Reads: 3886 Writes: 483049
CPU(ms): 72453 Logical Reads: 45458185 Elapsed(ms): 73498 Reads: 4068 Writes: 483050
CPU(ms): 72437 Logical Reads: 45461626 Elapsed(ms): 73262 Reads: 3731 Writes: 483049
Batch execution completed 5 times.
---------- FOR XML PATH Version ----------
Beginning execution loop
CPU(ms): 187 Logical Reads: 79425 Elapsed(ms): 531 Reads: 3477 Writes: 1201
CPU(ms): 187 Logical Reads: 79425 Elapsed(ms): 251 Reads: 3470 Writes: 1201
CPU(ms): 203 Logical Reads: 79425 Elapsed(ms): 414 Reads: 3475 Writes: 1201
CPU(ms): 219 Logical Reads: 79425 Elapsed(ms): 262 Reads: 3466 Writes: 1201
CPU(ms): 172 Logical Reads: 79425 Elapsed(ms): 317 Reads: 3457 Writes: 1201
Batch execution completed 5 times.




Test harness:


PRINT '---------- opc.three version ----------';
go

set nocount on;
--===== Create the timer and bit-bucket variables and start the timer.
DECLARE @BitBucket sysname; --< change or add variables needed here to eliminate display distortion

Declare @cpu_ int,
@lreads_ bigint,
@eMsec_ bigint,
@Reads_ bigint,
@Writes_ bigint;
declare @CpuMs int,
@LogRds bigint,
@Elapsed bigint,
@Reads bigint,
@Writes bigint;

dbcc freeproccache with no_infomsgs;
dbcc freesystemcache('ALL') with no_infomsgs;
dbcc dropcleanbuffers with no_infomsgs;


Select
@cpu_ = cpu_time
, @lreads_ = logical_reads
, @eMsec_ = total_elapsed_time
, @Reads_ = reads
, @Writes_ = writes
From
sys.dm_exec_requests
Where
session_id = @@spid;

---
DECLARE @SQLCmd NVARCHAR(MAX) = N'';

-- instead of using FOR XML just append the conjured string from each row to the same variable
SELECT @SQLCmd += 'EXEC sys.sp_rename ' + --
'@objname = N''' + SCHEMA_NAME(tab.schema_id) + '.' + tab.name + ''', ' + --
'@newname = N''' + tab.name + '_' + CONVERT(VARCHAR(10), GETDATE(), 112) + ''', ' + --
'@objtype = N''OBJECT'';' + -- OBJECT
NCHAR(13) + NCHAR(10)
FROM sys.tables tab;
---

Select
@CpuMs = cpu_time - @cpu_
, @LogRds = logical_reads - @lreads_
, @Elapsed = total_elapsed_time - @eMsec_
, @Reads = reads - @Reads_
, @Writes = writes - @Writes_
From
sys.dm_exec_requests
Where
session_id = @@spid;
--===== Display the duration
PRINT 'CPU(ms): ' + right(' ' + cast(@CpuMs as varchar(10)),10) +
' Logical Reads: ' + right(' ' + cast(@LogRds as varchar(10)),10) +
' Elapsed(ms): ' + right(' ' + cast(@Elapsed as varchar(10)),10) +
' Reads: ' + right(' ' + cast(@Reads as varchar(10)),10) +
' Writes: ' + right(' ' + cast(@Writes as varchar(10)),10);

set nocount off;
GO 5
set nocount on;

PRINT '---------- FOR XML PATH Version ----------';
go

--===== Create the timer and bit-bucket variables and start the timer.
DECLARE @BitBucket sysname; --< change or add variables needed here to eliminate display distortion

Declare @cpu_ int,
@lreads_ bigint,
@eMsec_ bigint,
@Reads_ bigint,
@Writes_ bigint;
declare @CpuMs int,
@LogRds bigint,
@Elapsed bigint,
@Reads bigint,
@Writes bigint;

dbcc freeproccache with no_infomsgs;
dbcc freesystemcache('ALL') with no_infomsgs;
dbcc dropcleanbuffers with no_infomsgs;


Select
@cpu_ = cpu_time
, @lreads_ = logical_reads
, @eMsec_ = total_elapsed_time
, @Reads_ = reads
, @Writes_ = writes
From
sys.dm_exec_requests
Where
session_id = @@spid;

---
declare @SQLCmd nvarchar(max);

select
@SQLCmd = stuff((select char(13) + char(10) +
'exec sp_rename @objname = N''' + schema_name(tab.schema_id) + '.' + tab.name + ''', @newname = N''' + schema_name(tab.schema_id) + '.' + tab.name + '_' + convert(varchar(10), getdate(), 112) + ''', @objtype = N''TABLE'''
from sys.tables tab
for xml path(''),TYPE).value('.','nvarchar(max)'),1,2,'');

---

Select
@CpuMs = cpu_time - @cpu_
, @LogRds = logical_reads - @lreads_
, @Elapsed = total_elapsed_time - @eMsec_
, @Reads = reads - @Reads_
, @Writes = writes - @Writes_
From
sys.dm_exec_requests
Where
session_id = @@spid;
--===== Display the duration
PRINT 'CPU(ms): ' + right(' ' + cast(@CpuMs as varchar(10)),10) +
' Logical Reads: ' + right(' ' + cast(@LogRds as varchar(10)),10) +
' Elapsed(ms): ' + right(' ' + cast(@Elapsed as varchar(10)),10) +
' Reads: ' + right(' ' + cast(@Reads as varchar(10)),10) +
' Writes: ' + right(' ' + cast(@Writes as varchar(10)),10);

set nocount off;
GO 5





Lynn Pettis

For better assistance in answering your questions, click here
For tips to get better help with Performance Problems, click here
For Running Totals and its variations, click here or when working with partitioned tables
For more about Tally Tables, click here
For more about Cross Tabs and Pivots, click here and here
Managing Transaction Logs

SQL Musings from the Desert Fountain Valley SQL (My Mirror Blog)
Post #1437632
Posted Monday, April 1, 2013 3:56 PM


Jeff Moden
SSC-Dedicated

Group: General Forum Members
Last Login: Yesterday @ 9:58 AM
Points: 36,995, Visits: 31,517
AndrewSQLDBA (4/1/2013)
SQL Server does not Loop thru rows. SQL Server is Set Based.

Please do not indicate that a database loops thru rows.

Andrew SQLDBA


Certainly not in the classical sense, but SQL Server is like any other file server. Behind the scenes, it loops like mad. R. Barry Young coined the phrase "Pseudo Cursor" for these types of things, and it's actually what is responsible for things like the Tally Table working.
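As a quick illustration (not production code; the row source is just whatever table happens to have enough rows), a Tally built on the fly can split a delimited string with no explicit loop at all, with the engine doing the row-by-row work behind the scenes:

DECLARE @s VARCHAR(8000) = 'red,green,blue';

WITH Tally AS
(
    SELECT TOP (LEN(@s))
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_columns            -- any handy row source with enough rows
)
SELECT SUBSTRING(@s, n, CHARINDEX(',', @s + ',', n) - n) AS item
FROM Tally
WHERE SUBSTRING(',' + @s, n, 1) = ',';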


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1437644
Posted Monday, April 1, 2013 4:29 PM


mister.magoo
SSCommitted

Group: General Forum Members
Last Login: Friday, August 29, 2014 12:42 PM
Points: 1,787, Visits: 5,696
Lynn Pettis (4/1/2013)


Yeah, but I think I'll stick with the FOR XML version. The following was done against a database with 5,147 tables in sys.tables:



---------- opc.three version ----------
Beginning execution loop
CPU(ms): 73015 Logical Reads: 45463611 Elapsed(ms): 73495 Reads: 3790 Writes: 481191
CPU(ms): 72468 Logical Reads: 45460066 Elapsed(ms): 73294 Reads: 3983 Writes: 483049
CPU(ms): 73015 Logical Reads: 45458216 Elapsed(ms): 73864 Reads: 3886 Writes: 483049
CPU(ms): 72453 Logical Reads: 45458185 Elapsed(ms): 73498 Reads: 4068 Writes: 483050
CPU(ms): 72437 Logical Reads: 45461626 Elapsed(ms): 73262 Reads: 3731 Writes: 483049
Batch execution completed 5 times.
---------- FOR XML PATH Version ----------
Beginning execution loop
CPU(ms): 187 Logical Reads: 79425 Elapsed(ms): 531 Reads: 3477 Writes: 1201
CPU(ms): 187 Logical Reads: 79425 Elapsed(ms): 251 Reads: 3470 Writes: 1201
CPU(ms): 203 Logical Reads: 79425 Elapsed(ms): 414 Reads: 3475 Writes: 1201
CPU(ms): 219 Logical Reads: 79425 Elapsed(ms): 262 Reads: 3466 Writes: 1201
CPU(ms): 172 Logical Reads: 79425 Elapsed(ms): 317 Reads: 3457 Writes: 1201
Batch execution completed 5 times.





Well, those poor stats will be improved a bit by reversing the concatenation:

DECLARE @SQLCmd NVARCHAR(MAX) = N'';  -- must start as an empty string, or the whole result is NULL

SELECT @SQLCmd = N'EXEC sys.sp_rename ' + --
                 '@objname = N''' + SCHEMA_NAME(tab.schema_id) + '.' + tab.name + ''', ' + --
                 '@newname = N''' + tab.name + '_' + CONVERT(VARCHAR(10), GETDATE(), 112) + ''', ' + --
                 '@objtype = N''OBJECT'';' + -- OBJECT
                 NCHAR(13) + NCHAR(10) + @SQLCmd
FROM sys.tables tab;

I don't have a suitably large database to test on, but I am sure you will see some improvement if you test it that way...


MM


  • MMGrid Addin
  • MMNose Addin


  • Forum Etiquette: How to post Reporting Services problems
  • Forum Etiquette: How to post data/code on a forum to get the best help - by Jeff Moden
  • How to Post Performance Problems - by Gail Shaw

Post #1437648
Posted Monday, April 1, 2013 4:43 PM
opc.three
SSCertifiable

Group: General Forum Members
Last Login: Today @ 7:34 AM
Points: 7,098, Visits: 12,606
That seems to be a function of continually updating an NVARCHAR(MAX), which does not scale linearly. If you run the same code with an NVARCHAR(4000) it keeps up, but the result gets truncated, so it's worthless to use.

There is no question the XML method offers a nice performance improvement, Lynn, but I think the measurements are off. When I run this on a DB with ~600 tables, my query finishes in 70 ms and yours in 5 ms, yet your measurement shows 47,901 ms, and there is no way that's correct given that my machine only has 8 CPUs.

I find the variable method a little easier to read (personal preference), and I am not going to worry all that much about how long one of these kinds of queries takes to run, mainly because performance when working with DDL like this has never been much of a concern for me. If a cursor is easier in this context, it's one of the few places where I would use one; not for performance or readability, though, but for the row-by-row error handling it offers that Erin mentioned. To each their own.

There are a couple of things I tweaked in the original code that I think should be fixed in the other samples, in case someone copies one and runs off to use it, so I'll call them out here since I believe they went unnoticed:

1. This snip:

@newname = N''' + schema_name(tab.schema_id) + '.' + tab.name

should have the schema name removed from the new name, or else you end up with tables named like [dbo].[dbo.table_20130401].

2. @objtype = N'TABLE' should be changed to @objtype = N'OBJECT'.
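If anyone wants to see the scaling difference for themselves, here is a rough, throwaway sketch (the loop counts and the ten-character chunk are arbitrary): appending in a loop to an NVARCHAR(MAX) variable gets slower as the string grows, while the NVARCHAR(4000) version stays quick only because assignments to it are silently truncated at 4,000 characters:

SET NOCOUNT ON;

DECLARE @vmax NVARCHAR(MAX)  = N'',
        @v4k  NVARCHAR(4000) = N'',
        @i    INT            = 0,
        @t0   DATETIME2(3)   = SYSDATETIME();

WHILE @i < 50000
BEGIN
    SET @vmax += N'0123456789';   -- keeps growing, and each append copies the whole string
    SET @i += 1;
END;
PRINT 'NVARCHAR(MAX):  ' + CAST(DATEDIFF(MILLISECOND, @t0, SYSDATETIME()) AS VARCHAR(20))
    + ' ms, final length ' + CAST(LEN(@vmax) AS VARCHAR(20));

SELECT @i = 0, @t0 = SYSDATETIME();

WHILE @i < 50000
BEGIN
    SET @v4k += N'0123456789';    -- silently truncated at 4,000 characters, so it stays cheap
    SET @i += 1;
END;
PRINT 'NVARCHAR(4000): ' + CAST(DATEDIFF(MILLISECOND, @t0, SYSDATETIME()) AS VARCHAR(20))
    + ' ms, final length ' + CAST(LEN(@v4k) AS VARCHAR(20));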



__________________________________________________________________________________________________
There are no special teachers of virtue, because virtue is taught by the whole community. --Plato
Post #1437652
Posted Monday, April 1, 2013 4:49 PM
opc.three
SSCertifiable

Group: General Forum Members
Last Login: Today @ 7:34 AM
Points: 7,098, Visits: 12,606
mister.magoo (4/1/2013)
I don't have a suitably large database to test on, but I am sure you will see some improvement if you test it that way...

You have done it again... have I told you lately that you are incredible, MM?

I am guessing that appending is more efficient than prepending, but I never really ran into a performance issue that made me think about it like that... can you confirm the reason?

I created a little test bed, because it doesn't really matter what's in the tables or how many columns they have. Just change the WHILE loop to create more or fewer tables, and make sure you really do not have a DB named [test] on your instance before running, because the script drops and recreates it. Note that I am using SET STATISTICS TIME ON instead of reading sys.dm_exec_requests until Lynn can explain the other metrics.

SET STATISTICS TIME OFF;

USE master
IF DB_ID(N'test') IS NOT NULL DROP DATABASE test   -- guard so the first run doesn't fail
CREATE DATABASE test
GO
USE test
GO

DECLARE @c INT = 10,
        @sql NVARCHAR(MAX);

WHILE @c < 5000
BEGIN
    SET @sql = 'use test; create table dbo.test' + CAST(@c AS CHAR(10)) + ' ( id int)';
    --PRINT @sql
    EXEC (@sql)
    SET @c += 1
END
GO

PRINT '---------- opc.three version ----------';
go

set nocount on;
--===== Clear the caches so each run starts cold.
dbcc freeproccache with no_infomsgs;
dbcc freesystemcache('ALL') with no_infomsgs;
dbcc dropcleanbuffers with no_infomsgs;

---
DECLARE @SQLCmd NVARCHAR(MAX) = N'',   -- must be initialized or the concatenation below yields NULL
        @dt NVARCHAR(10) = CONVERT(NVARCHAR(10), GETDATE(), 112);

SET STATISTICS TIME ON;

-- prepend (per MM's reversal) the conjured string from each row to the same variable
SELECT @SQLCmd = 'EXEC sys.sp_rename ' + --
                 '@objname = N''' + s.name + '.' + tab.name + ''', ' + --
                 '@newname = N''' + tab.name + '_' + @dt + ''', ' + --
                 '@objtype = N''OBJECT'';' + NCHAR(13) + NCHAR(10) + @SQLCmd
FROM sys.schemas s
JOIN sys.tables tab ON s.schema_id = tab.schema_id;
---
SET STATISTICS TIME OFF;

set nocount off;
GO
set nocount on;

PRINT '---------- FOR XML PATH Version ----------';
go

--===== Clear the caches so each run starts cold.
dbcc freeproccache with no_infomsgs;
dbcc freesystemcache('ALL') with no_infomsgs;
dbcc dropcleanbuffers with no_infomsgs;

---
declare @SQLCmd nvarchar(max);

SET STATISTICS TIME ON;

select
    @SQLCmd = stuff((select char(13) + char(10) +
                     'exec sp_rename @objname = N''' + schema_name(tab.schema_id) + '.' + tab.name + ''', @newname = N''' + schema_name(tab.schema_id) + '.' + tab.name + '_' + convert(varchar(10), getdate(), 112) + ''', @objtype = N''TABLE'''
                     from sys.tables tab
                     for xml path(''),TYPE).value('.','nvarchar(max)'),1,2,'');

---
SET STATISTICS TIME OFF;

set nocount off;
GO

For 5000 tables on my machine:

---------- opc.three version ----------

SQL Server Execution Times:
CPU time = 9719 ms, elapsed time = 9776 ms.
---------- FOR XML PATH Version ----------

SQL Server Execution Times:
CPU time = 32 ms, elapsed time = 51 ms.


edit: I originally posted results for 500 tables, not 5000; the results above are corrected for 5000 tables. ~10 s vs. ~50 ms.


__________________________________________________________________________________________________
There are no special teachers of virtue, because virtue is taught by the whole community. --Plato
Post #1437654