I liked your article, so I hope you don't think I was being critical. You gave some good ideas about things that can affect performance. My thought on renaming tables was simply adding to the discussion, not trying to say there was a better way. It seemed reasonable that instead of moving 3,000,000 records, it would be more efficient to move 10,000, and renaming would allow that. Clearly, any real-world system may have additional complexities such as the ones you pointed out.
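For what it's worth, the table-swap idea can be sketched roughly like this (a minimal sketch only; the table name, column name, and cutoff date are hypothetical, and a real system would also need to handle permissions, constraints, indexes, and dependent objects):

    -- copy only the rows to keep into a staging table
    select * into BigTable_New from BigTable where CreatedDate >= '20040101'
    -- swap the tables by renaming; the old data stays in BigTable_Old
    exec sp_rename 'BigTable', 'BigTable_Old'
    exec sp_rename 'BigTable_New', 'BigTable'

That way only the 10,000 rows being kept are copied, instead of deleting or moving the other 3,000,000.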
Again, it was a good article both in its content and in the discussion it created. I'm looking forward to part two.
Thanks for the warm words. I do like good solutions, and yours is good for the discussion. I do follow your advice while I am doing quick-and-dirty tests. In the real world I have some issues with object renaming. I have seen cases where, after renaming an object and creating another one with the same name, some applications start erroring with a message that the object can't be found (even though I can see it and work with it in Query Analyzer). Personally, I avoid renaming in production databases.
The second part will be out soon. It is under review.
Per BOL..."Important After renaming stored procedures and views, flush the procedure cache to ensure all dependent stored procedures and views are recompiled."
Notice the "DBCC FREEPROCCACHE" at the end of the rename script.
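For reference, a rename done that way might look something like this (a sketch only; the procedure names are hypothetical):

    exec sp_rename 'usp_GetOrders', 'usp_GetOrders_Old'
    exec sp_rename 'usp_GetOrders_New', 'usp_GetOrders'
    -- flush the procedure cache so dependent procedures and views recompile
    DBCC FREEPROCCACHE

Keep in mind that DBCC FREEPROCCACHE clears the plan cache for the whole server, which is usually fine in a maintenance window but not something to run casually on a busy production box.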
On another topic, I noticed you didn't comment on the different while loop. What do you think about using a break instead of min and max variables? If you check the optimizer it's definitely more efficient.
It's possible the optimizer shows that the break is more effective, but it is like 10 vs. 10.01 miles per gallon. Is it different? For testing purposes at the factory, yes. In the real world? When I am working, I try to cut the worst parts and advise developers to focus on the important changes. I don't consider this one important, because there is no major time savings and it does not influence other processes.
Actually...it is significantly different, per my optimizer it costs 2.5% of the total cost of both (although this includes creating the table variable, which I would HOPE you had in for testing purposes only).
I would agree that this is not significant if it's under 100 loops or so (which is the case in your example), but it is significant with more loops. I might add, under 100 loops there is no significant difference between a cursor and either while loop.
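To make the comparison concrete, here is a sketch of the two while-loop styles being discussed (the table and column names are hypothetical):

    -- style 1: min/max control variables
    declare @i int, @max int
    select @i = min(id), @max = max(id) from #work
    while @i <= @max
    begin
        -- ... process row @i ...
        select @i = min(id) from #work where id > @i
    end

    -- style 2: break out of the loop
    declare @i int
    set @i = 0
    while 1 = 1
    begin
        select @i = min(id) from #work where id > @i
        if @i is null break
        -- ... process row @i ...
    end

The break version skips the up-front aggregate over the whole table and the comparison against @max on every iteration, which is presumably where the optimizer-reported difference comes from.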
I certainly wasn't saying you should modify your existing code. But since your article was about optimization, I figured you'd be interested. Seems like you are a little defensive instead. Oh well...
I am not defensive, but trying to be realistic. If I need over 200-500 loops, then I need to look for an alternative to the loop solution, and in most cases I have been able to find one. For example, instead of writing a loop to grant permissions on all tables
grant SELECT on table1 to XYZ
grant SELECT on table2 to XYZ
it can be achieved by
declare @cmd varchar(8000)
set @cmd = ''
select @cmd = @cmd + '
grant SELECT on ' + name + ' to XYZ ' from sysobjects where type = 'U'
exec (@cmd)
Definitely a set-based solution is always best, and you're right that many loops can be converted into set-based processing.
That's not what we were talking about, though; we were talking about replacements for cursors when a loop is needed. I always appreciate seeing a "while" loop instead of a cursor (and I appreciate seeing a set-based solution even more), as they are cleaner and (I think) easier to read. I commend you on this, but I was also pointing out that using a "break" instead of checking variables is more efficient, especially when using "Select min, max" to set the loop control variables.
I'm not trying to piss you off or anything...