December 1, 2014 at 12:03 am
Comments posted to this topic are about the item sp_msforeachdb: Improving on an Undocumented Stored Procedure
December 1, 2014 at 5:57 am
Nice article Ed.
You can also add the "distribution" database to the list when excluding the system databases.
December 1, 2014 at 7:00 am
You might look at what Aaron did for the same problem: http://sqlblog.com/blogs/aaron_bertrand/archive/2010/12/29/a-more-reliable-and-more-flexible-sp-msforeachdb.aspx.
December 1, 2014 at 9:56 am
Another great solution is the set based solution by Gianluca Sartori.
http://spaghettidba.com/2011/09/09/a-better-sp_msforeachdb/
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
December 1, 2014 at 9:58 am
Divine Flame (12/1/2014)
Nice article Ed. You can also add the "distribution" database to the list when excluding the system databases.
It depends on the case as to whether to skip or include the distribution database. One big problem with trying to include or exclude distribution is that it is a user database and it can be named anything.
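Since the distribution database can carry any name, one option is to filter on a property instead of the name. A minimal sketch (not the article's procedure), using the standard `sys.databases` catalog columns:

```sql
-- Exclude system databases by id and distribution databases by property,
-- so a renamed distribution database is still skipped.
SELECT name
FROM sys.databases
WHERE database_id > 4        -- skips master, tempdb, model, msdb
  AND is_distributor = 0;    -- skips the distribution database regardless of its name
```

This sidesteps the naming problem entirely, at the cost of depending on the replication metadata being accurate.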
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
December 1, 2014 at 11:26 am
Great article. I had a need to execute T-SQL for every database on a server. I did not want to use the MS procedure for the reasons that you state. I chose another approach but your solution makes this available to more people.
In an "it's better than nothing" mindset, we ran a COPY_ONLY backup of every database on our servers in the dark of the early morning. This is not a great plan, but it was not for production data either. Now, if that has raised the ire of some folks, hold onto your hat.
T-SQL is wonderful for what it does, but easy string manipulation is not one of its strong points. I would like to see MS implement a revised version of sp_MSforeachdb as an internal procedure, doing all the string handling in code. This would result in a supported, documented, and reliable tool that we could use.
Until such time, this is a great replacement.
ATB, Charles Kincaid
December 2, 2014 at 5:31 am
SQLRNNR (12/1/2014)
Divine Flame (12/1/2014)
Nice article Ed. You can also add the "distribution" database to the list when excluding the system databases.
It depends on the case as to whether to skip or include the distribution database. One big problem with trying to include or exclude distribution is that it is a user database and it can be named anything.
That sure is a problem with distribution databases. It's good to be reminded sometimes. Thanks Jason.
December 2, 2014 at 9:23 am
This is a very minor quibble, but presenting procs and functions like these in the correct case, i.e. sp_MSforeachdb instead of sp_msforeachdb, will help maintain the sanity of those poor souls marooned on binary collation instances...
December 3, 2014 at 8:05 am
I may have missed it, but is the database status accounted for anywhere in the final version of the procedure? I know that's something that I've run into multiple times when trying to cycle through databases.
December 3, 2014 at 8:31 am
bubenik.1 (12/3/2014)
I may have missed it, but is the database status accounted for anywhere in the final version of the procedure? I know that's something that I've run into multiple times when trying to cycle through databases.
Database status was not taken into account here. I chose to keep this simple and allow the reader to add to it as needed. The big failure in Microsoft's implementation was a desire to do way too much at one time, and by doing so, creating something too obfuscated and complex to be usable.
Database status could be a great parameter or built-in feature if a server often has databases that go offline or into a restoring state.
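Such a status check could be a simple filter on `sys.databases` in the query that builds the database list. A sketch of one way to do it (assuming the list is built from the catalog view, as is typical):

```sql
-- Iterate only databases that are ONLINE and that the caller can actually reach.
SELECT name
FROM sys.databases
WHERE state_desc = N'ONLINE'     -- skips OFFLINE, RESTORING, RECOVERING, SUSPECT, ...
  AND HAS_DBACCESS(name) = 1;    -- skips databases the current login cannot access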
December 3, 2014 at 11:22 am
Thank you for sharing your approach! Am I correct in saying that this would not work if multiple users happen to run it simultaneously (because of the global temp tables)? If so, is there a workaround that would help?
December 3, 2014 at 11:44 am
Misha_SQL (12/3/2014)
Thank you for sharing your approach! Am I correct in saying that this would not work if multiple users happen to run it simultaneously (because of the global temp tables)? If so, is there a workaround that would help?
You are correct: this approach would definitely fail if multiple users run it at once. One solution would be to tag the name of the global temp table with a unique suffix, a timestamp maybe. So instead of the table being called ##database_files, you could dynamically name it and call it something like ##database_files_12032014082319123, or something like that. It's a bit roundabout, but would work nicely to avoid duplication.
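A rough sketch of that timestamp-suffix idea (the names here are illustrative, not from the article):

```sql
-- Build a unique global temp table name per run so concurrent callers
-- do not collide. FORMAT requires SQL Server 2012+; CONVERT date styles
-- can be used on older versions.
DECLARE @suffix     nvarchar(17)  = FORMAT(SYSDATETIME(), 'yyyyMMddHHmmssfff');
DECLARE @table_name sysname       = N'##database_files_' + @suffix;
DECLARE @sql        nvarchar(max) =
    N'CREATE TABLE ' + QUOTENAME(@table_name) + N'
      (database_name sysname, file_id int, type tinyint,
       name sysname, physical_name nvarchar(260));';
EXEC sp_executesql @sql;
-- Note: every later read or drop of the table must also be built
-- dynamically, since the name is only known at run time.
```

The run's start time embedded in the name then doubles as the debugging aid mentioned below.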
An alternative would be to have the run_query_across_databases query execute a SELECT, instead of an INSERT. Define a local temp table, #database_files. Then, use syntax such as this to insert directly into it:
INSERT INTO #database_files
EXEC run_query_across_databases '
USE [?]
SELECT
DB_NAME(),
file_id,
type,
name,
physical_name
FROM sys.database_files',
1, NULL, NULL, NULL, @my_db_list
This isn't thoroughly tested, but with a few changes would allow you to avoid global temp tables altogether and allow multiple users to run it at once.
I personally would prefer the dynamically-named table, as the date/time tagging can be a useful debugging tool, but that is a matter of opinion : )
December 3, 2014 at 4:32 pm
Rather than a global temp table what about a table type variable?
ATB, Charles Kincaid
April 15, 2016 at 7:18 am
You write:
Global temp tables are used so that they will be accessible from our dynamic SQL statement, which otherwise would be out-of-scope with respect to a local temp table, and therefore unable to read it.
This is an incorrect statement. Any local temp table visible to the current scope will also be visible to dynamic SQL executed from that scope. In other words, the code below works (assuming that you change "SomeTable" to the name of a table that exists, of course).
DECLARE @x NVARCHAR(500) = N'SELECT * FROM #z;';
SELECT * INTO #z FROM dbo.SomeTable;
EXEC (@x); -- it works with EXEC
EXEC sp_executesql @statement = @x; -- and it works with sp_executesql
DROP TABLE #z;
Table Vars, on the other hand, will not be visible to the dynamic code because their scope is limited.
So, you can get rid of the global temp tables and two people can run the utility at the same time without collision.
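For contrast, a quick sketch of the scope difference described above (illustrative names only):

```sql
-- A local temp table IS visible inside dynamic SQL:
CREATE TABLE #t (i int);
EXEC (N'SELECT * FROM #t;');     -- works

-- A table variable is NOT:
DECLARE @tv TABLE (i int);
EXEC (N'SELECT * FROM @tv;');    -- fails: Must declare the table variable "@tv"

DROP TABLE #t;
```

This is why the table-variable suggestion earlier in the thread would not work without passing the data some other way.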
April 15, 2016 at 8:48 am
I worried about using these undocumented procs as well, and thought the biggest fear was the global cursor they employ. I modified it to use a dynamic local cursor to avoid any collisions a global one might encounter when multiple users run it at the same time. Eliminating that made me feel a little safer about their use. Here is how I made the cursor dynamic as a LOCAL FAST_FORWARD (if speed is a concern, LOCAL STATIC may be a better choice, as Hugo Kornelis has noted in one of his blog posts on cursors):
DECLARE @sql nvarchar(max);
SET @sql =
N'SET @my_cur = CURSOR LOCAL FAST_FORWARD FOR '
+ N'SELECT ''['' + REPLACE(schema_name(syso.schema_id), N'']'', N'']]'') + '']'' + ''.'' + ''['' + REPLACE(object_name(o.id), N'']'', N'']]'') + '']'' '
+ N' FROM dbo.sysobjects o '
+ N' INNER JOIN sys.all_objects syso on o.id = syso.object_id '
+ N' WHERE OBJECTPROPERTY(o.id, N''IsUserTable'') = 1 '
+ N' AND o.category & ' + @mscat + N' = 0 ';
IF @whereand IS NOT NULL BEGIN
    SET @sql = @sql + @whereand;
END;
SET @sql = @sql + N'; OPEN @my_cur;';
DECLARE @local_cursor cursor;
EXEC sp_executesql
     @sql
    ,N'@my_cur cursor OUTPUT'
    ,@my_cur = @local_cursor OUTPUT;
FETCH @local_cursor INTO @name;
WHILE (@@fetch_status >= 0) BEGIN
--<whatever other code>
FETCH @local_cursor INTO @name;
END /* WHILE FETCH_SUCCESS */
SET @curStatus = Cursor_Status('variable', '@local_cursor');
IF @curStatus >= 0 BEGIN
CLOSE @local_cursor;
DEALLOCATE @local_cursor;
END;