• opc.three (4/1/2013)


    That seems to be a function of continually appending to an NVARCHAR(MAX) variable, which does not scale linearly. The output gets truncated, so it's worthless to use, but if you run the same code with an NVARCHAR(4000) it keeps up.
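    For context, the variable-concatenation pattern in question looks roughly like this (a sketch only; the `_20130401` suffix and rename target are illustrative):

```sql
-- Sketch of the variable-concatenation approach to generating DDL.
-- Repeatedly appending to NVARCHAR(MAX) is the part that slows down as
-- the string grows; an NVARCHAR(4000) variable keeps up but silently
-- truncates once the generated script exceeds 4000 characters.
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql = @sql
    + N'EXEC sp_rename @objname = N''' + QUOTENAME(SCHEMA_NAME(t.schema_id))
    + N'.' + QUOTENAME(t.name) + N''', '
    + N'@newname = N''' + t.name + N'_20130401'', '
    + N'@objtype = N''OBJECT'';' + NCHAR(13) + NCHAR(10)
FROM sys.tables AS t;

SELECT @sql;  -- SELECT rather than PRINT, since PRINT truncates long strings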

    There is no question the XML method offers a nice performance improvement, Lynn, but I think the measurements are off. When I run this on a DB with ~600 tables my query finishes in 70ms and yours in 5ms, yet your measurement shows 47901ms, and there is no way that's correct given that my machine has only 8 CPUs.
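    For anyone following along, the XML method aggregates all the per-table statements in a single pass instead of growing a variable row by row. A minimal sketch (suffix and naming assumed):

```sql
-- Sketch of the FOR XML PATH string-aggregation approach.
-- The TYPE/.value() step avoids XML-entitizing characters like < and &.
DECLARE @sql NVARCHAR(MAX);

SELECT @sql =
    (SELECT N'EXEC sp_rename @objname = N''' + QUOTENAME(SCHEMA_NAME(t.schema_id))
          + N'.' + QUOTENAME(t.name) + N''', @newname = N''' + t.name
          + N'_20130401'', @objtype = N''OBJECT'';' + NCHAR(10)
     FROM sys.tables AS t
     FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)');

SELECT @sql;
```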

    I find the variable method a little easier to read (personal preference), and I'm not going to worry all that much about how long one of these kinds of queries runs, mainly because performance has never been much of a concern for me when working with DDL like this. If a cursor is easier in this context, it's one of the few places where I would use one, not for performance or readability though, but for the row-by-row error-handling option it offers that Erin mentioned. To each their own.
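    The row-by-row error handling a cursor buys you would look something like this sketch, where a failed rename is logged instead of aborting the whole batch (names and suffix are made up):

```sql
-- Sketch: cursor over sys.tables with per-row TRY/CATCH, so one failed
-- sp_rename (e.g. a name collision) doesn't stop the remaining renames.
DECLARE @schema SYSNAME, @table SYSNAME,
        @objname NVARCHAR(600), @newname SYSNAME;

DECLARE tab_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT SCHEMA_NAME(schema_id), name FROM sys.tables;

OPEN tab_cur;
FETCH NEXT FROM tab_cur INTO @schema, @table;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @objname = QUOTENAME(@schema) + N'.' + QUOTENAME(@table);
    SET @newname = @table + N'_20130401';  -- new name has no schema prefix

    BEGIN TRY
        EXEC sp_rename @objname = @objname,
                       @newname = @newname,
                       @objtype = N'OBJECT';
    END TRY
    BEGIN CATCH
        PRINT N'Rename failed for ' + @objname + N': ' + ERROR_MESSAGE();
    END CATCH;

    FETCH NEXT FROM tab_cur INTO @schema, @table;
END;

CLOSE tab_cur;
DEALLOCATE tab_cur;
```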

    A couple of things I tweaked in the original code that I thought should be fixed in the other samples too, in case someone copies one and runs off to use it. They seem to have gone unnoticed, so I'll call them out here:

    1. This snip:

    @newname = N''' + schema_name(tab.schema_id) + '.' + tab.name

    should have the schema name removed from the new name else you end up with tables named like [dbo].[dbo.table_20130401]

    2. @objtype = N'TABLE' should be changed to @objtype = N'OBJECT'
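    Putting both fixes together, the generated call should end up looking something like this (table name and suffix are illustrative):

```sql
-- Corrected sp_rename call: schema-qualify only @objname, keep the
-- schema out of @newname, and use @objtype = N'OBJECT'
-- (sp_rename has no 'TABLE' object type).
EXEC sp_rename
    @objname = N'[dbo].[my_table]',   -- old name, schema-qualified
    @newname = N'my_table_20130401',  -- new name, NO schema prefix
    @objtype = N'OBJECT';
```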

    I'll give you the @objtype change as something to be corrected, and the schema-name change as well, as I just reread sp_rename. Now, how to deal with multiple schemas??

    Without testing, does the new table name keep the old table name's schema?
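    One quick way to check would be to rename a throwaway table and look at where it lands (a sketch; the schema and table names are made up and assumed not to exist yet):

```sql
-- Sketch: rename a scratch table and inspect its schema afterwards.
CREATE SCHEMA test_schema;
GO
CREATE TABLE test_schema.t1 (id INT);

EXEC sp_rename @objname = N'test_schema.t1',
               @newname = N't1_renamed',
               @objtype = N'OBJECT';

-- If the rename keeps the original schema, this returns 'test_schema'
SELECT SCHEMA_NAME(schema_id) AS schema_name, name
FROM sys.tables
WHERE name = N't1_renamed';
```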