Execute a table of statements

  • Hi

    My business case

We want to start with "security" (this word becomes ridiculous within the next few sentences!) in our development and test system. Until now, all developers have been sysadmins on our SQL Server (I know...). To start with this "security", my boss requested a script that creates a login "development" on the server and creates a user with the same name and "db_owner" rights in all databases... (and so much for "security" ... sigh).

There are a few special databases that should be excluded from this solution, but that's not my current problem.

    My problem

I need a script that creates this "development" user in all databases (ignore the excluded-databases requirement).

I have the following script, which works fine, but I would really like to get rid of the WHILE loop!

    -- Table to hold all statements
    DECLARE @stmts TABLE (id INT IDENTITY, stmt NVARCHAR(MAX), PRIMARY KEY (id))

    DECLARE @crlf CHAR(2)
    SELECT @crlf = CHAR(13) + CHAR(10)

    -- Get the statements for all databases
    INSERT INTO @stmts
    SELECT 'USE ' + QUOTENAME(name) + @crlf +
           'IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = ''development'')' + @crlf +
           'BEGIN' + @crlf +
           ' CREATE USER [development] FOR LOGIN [development]' + @crlf +
           ' EXECUTE sp_addrolemember ''db_owner'', ''development''' + @crlf +
           'END'
    FROM sys.databases
    WHERE name NOT IN ('master', 'msdb', 'tempdb', 'model')

    DECLARE @stmt NVARCHAR(MAX)
    DECLARE @id INT

    -- Loop through all statements and execute them
    WHILE EXISTS (SELECT TOP(1) 1 FROM @stmts)
    BEGIN
       SELECT TOP(1) @id = id, @stmt = stmt
       FROM @stmts
       ORDER BY id

       PRINT @stmt
       --EXECUTE sp_executesql @stmt

       DELETE FROM @stmts WHERE id = @id
    END

    Another possible way I know of would be to concatenate all the statements into a single variable and execute that:

    DECLARE @crlf CHAR(2)
    DECLARE @stmt NVARCHAR(MAX)
    SELECT @stmt = '', @crlf = CHAR(13) + CHAR(10)

    -- Get the statements for all databases
    SELECT @stmt = @stmt + 'USE ' + QUOTENAME(name) + @crlf +
           'IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = ''development'')' + @crlf +
           'BEGIN' + @crlf +
           ' CREATE USER [development] FOR LOGIN [development]' + @crlf +
           ' EXECUTE sp_addrolemember ''db_owner'', ''development''' + @crlf +
           'END' + @crlf + @crlf
    FROM sys.databases
    WHERE name NOT IN ('master', 'msdb', 'tempdb', 'model')

    PRINT @stmt
    --EXECUTE sp_executesql @stmt

    ... but we currently have over 260 databases on this server, so the concatenated statement ends up around 120 KB in size. Could this become a problem?

    Is it possible to execute all the statements in a table (as in my first script) without a loop? I vaguely remember something involving OPENQUERY, but it doesn't seem to be usable against the local server, and I don't want to create a linked server that points back to the local server.

    I am looking for something like this (pseudo code):

    SELECT 1
    FROM @stmts s CROSS APPLY EXECUTE(s.stmt)

    -- OR

    SELECT 1,
           EXECUTE (s.stmt)
    FROM @stmts s

    -- OR

    SELECT 1,
           (SELECT 1 FROM OPENQUERY(localServer, N'EXECUTE sp_executesql N''' + s.stmt + ''''))
    FROM @stmts s

    Is this possible?

    Thanks for your advice!

    Flo

  • Have you looked at sp_MSforeachdb?
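
    For example, something along these lines (a sketch only; sp_MSforeachdb replaces the ? placeholder with each database name, and it assumes the [development] login already exists at the server level):

    EXECUTE sp_MSforeachdb N'USE [?];
    IF DB_NAME() NOT IN (N''master'', N''msdb'', N''tempdb'', N''model'')
       AND NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N''development'')
    BEGIN
        CREATE USER [development] FOR LOGIN [development];
        EXECUTE sp_addrolemember ''db_owner'', ''development'';
    END'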

  • I need to cite where I got this trick from, but try this:

    DECLARE @stmts TABLE (id INT IDENTITY, stmt NVARCHAR(MAX), PRIMARY KEY (id)) -- your table

    insert into @stmts (stmt)
    select 'print ''It '';' UNION
    select 'print ''works!'';' -- some dummy data

    declare @sql nvarchar(max) -- our sql to be executed

    -- Using FOR XML PATH concatenates all the rows, and the + '' ensures there is no element name.
    select @sql = (select stmt + '' from @stmts for xml path(''))

    exec(@sql)

    Note that this only works as long as all your commands concatenated together stay within the NVARCHAR(MAX) limit.

    /edit: the output:

    (2 row(s) affected)
    It
    works!

    /edit:

    Could not cite exactly where I found this - but it's all over the internet so I doubt my source was original anyway.
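
    Applied to the original per-database statements, the same trick could look roughly like this (a sketch only; the extra TYPE / .value('.') step is there so that the line breaks and any <, > or & in the statements are not entitized by FOR XML):

    DECLARE @crlf CHAR(2), @sql NVARCHAR(MAX)
    SELECT @crlf = CHAR(13) + CHAR(10)

    SELECT @sql =
        (SELECT 'USE ' + QUOTENAME(name) + @crlf +
                'IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = ''development'')' + @crlf +
                'BEGIN' + @crlf +
                ' CREATE USER [development] FOR LOGIN [development]' + @crlf +
                ' EXECUTE sp_addrolemember ''db_owner'', ''development''' + @crlf +
                'END' + @crlf + @crlf
         FROM sys.databases
         WHERE name NOT IN ('master', 'msdb', 'tempdb', 'model')
         FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)')

    PRINT @sql
    --EXECUTE sp_executesql @sql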

    ---
    Dlongnecker

  • Hi Richard

    Thank you for your feedback. sp_MSforeachdb also uses a CURSOR internally. See:

    DECLARE @def NVARCHAR(MAX)

    SELECT @def = definition
    FROM sys.all_sql_modules
    WHERE object_id = OBJECT_ID('sys.sp_MSforeachdb')

    PRINT @def

    I would like to know whether it is possible to do this without any cursor/loop, in a set-based solution. Anyway, sp_MSforeachdb is a very nice solution for my script!

    Greets

    Flo

  • Hi Dlongnecker

    This is a similar solution to my second one that concatenates all the statements, but the XML way is definitely cooler! 😀

    I'm just not sure whether this may become a problem with huge scripts. In the current case the resulting @sql is over 120 KB and it currently works, but I don't know at what size a single script starts to become slow. Do you know?

    Thank you!

    Flo

  • Florian Reischl (3/25/2009)


    Hi Richard

    Thank you for your feedback. sp_MSforeachdb also uses a CURSOR internally. See:

    DECLARE @def NVARCHAR(MAX)

    SELECT @def = definition
    FROM sys.all_sql_modules
    WHERE object_id = OBJECT_ID('sys.sp_MSforeachdb')

    PRINT @def

    I would like to know whether it is possible to do this without any cursor/loop, in a set-based solution. Anyway, sp_MSforeachdb is a very nice solution for my script!

    Greets

    Flo

    I use sp_MSforeachdb heavily. For administrative purposes like this there is no shame in using a cursor.


    * Noel

  • Hi Noel

    Rethinking my question, I think my idea makes no sense, because it's a circular problem:

    * If I concatenate too many statements, the single script becomes too slow

    * If I don't concatenate, I have to loop

    It's always a performance trade-off when working with a lot of dynamic SQL...

    So thanks to everybody. I will use sp_MSforeachdb.

    Have a nice day!

    Flo

  • Flo, with all due respect I think you are working too hard to avoid using a cursor here. Administrative tasks such as creating users in a set of databases are prime examples where cursors are the best solution.

    This is not something that is going to be run on a regular basis. It can easily be written to loop (yes, cursor) through each database - check to see if the user exists and if not, create the user.

    Any time a new database is added to the instance, execute the script and the appropriate accounts will be created.

    I just don't see the need to work so hard on avoiding a cursor here. 🙂
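
    Roughly, such a cursor could look like the following sketch (it reuses the statement from the original script and assumes the [development] login already exists at the server level):

    DECLARE @db SYSNAME, @stmt NVARCHAR(MAX), @crlf CHAR(2)
    SELECT @crlf = CHAR(13) + CHAR(10)

    DECLARE db_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT name
        FROM sys.databases
        WHERE name NOT IN ('master', 'msdb', 'tempdb', 'model')

    OPEN db_cur
    FETCH NEXT FROM db_cur INTO @db

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SELECT @stmt = 'USE ' + QUOTENAME(@db) + @crlf +
                       'IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = ''development'')' + @crlf +
                       'BEGIN' + @crlf +
                       ' CREATE USER [development] FOR LOGIN [development]' + @crlf +
                       ' EXECUTE sp_addrolemember ''db_owner'', ''development''' + @crlf +
                       'END'

        PRINT @stmt
        --EXECUTE sp_executesql @stmt

        FETCH NEXT FROM db_cur INTO @db
    END

    CLOSE db_cur
    DEALLOCATE db_cur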

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Florian Reischl (3/25/2009)


    Hi Dlongnecker

    This is a similar solution to my second one that concatenates all the statements, but the XML way is definitely cooler! 😀

    I'm just not sure whether this may become a problem with huge scripts. In the current case the resulting @sql is over 120 KB and it currently works, but I don't know at what size a single script starts to become slow. Do you know?

    I do these kinds of scripts (a lot) and I always concatenate them all together using either the Variable method or the XML method.

    The Variable method is more efficient at the low end; however, because it is inherently O(n²), the XML solution will eventually overtake it, since it is only O(n). And of course that means that it scales perfectly. So your upper limit there would be the size limit of VARCHAR(MAX) or NVARCHAR(MAX).

    [font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
    Proactive Performance Solutions, Inc.
    [/font]
    [font="Verdana"] "Performance is our middle name."[/font]

  • [font="Verdana"]Flo, I've actually dynamically built stored procedures and then executed them in the past for this kind of thing. So you'd end up with code that did the following:

    1. build string to create stored procedure

    2. create the stored procedure (dynamic SQL)

    3. call the stored procedure (can insert the results into a table to track errors and so on).

    4. drop the stored procedure

    Could you use a similar approach here?

    On the other hand, I'm not sure how much value you get from avoiding using cursors in this scenario.
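
    A rough sketch of steps 1 to 4 (the procedure name dev_create_development_user is made up for illustration, and the body only covers the current database; rolling it out to every database would still need one create/call/drop per database, since USE is not allowed inside a procedure):

    DECLARE @sql NVARCHAR(MAX)

    -- 1. Build the string that creates the stored procedure
    SET @sql = N'CREATE PROCEDURE dbo.dev_create_development_user
    AS
    BEGIN
        IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N''development'')
        BEGIN
            CREATE USER [development] FOR LOGIN [development];
            EXECUTE sp_addrolemember ''db_owner'', ''development'';
        END
    END'

    -- 2. Create the stored procedure (dynamic SQL, because CREATE PROCEDURE must be alone in its batch)
    EXECUTE sp_executesql @sql

    -- 3. Call the stored procedure (results or errors could be inserted into a tracking table here)
    EXECUTE dbo.dev_create_development_user

    -- 4. Drop the stored procedure again
    DROP PROCEDURE dbo.dev_create_development_user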


  • Hi Jeffrey

    Thanks for your reply!

    You are all absolutely right for these administrative task!

    I think I've just been annoyed by this "security" introduction. Until now, all my team members worked with Windows authentication and their correct rights. My processes also all worked either with their own user accounts or with the process server's authorization on the database server. That was not the case on the other projects. It meant that my processes had no problems when they went to production (unlike others, which broke because of things like TRUNCATE TABLE). Now the "development" user is available to all developers and I suspect that more and more processes and developers on my team will use it.

    Because I struggled with this requirement, I just started thinking about any possibility to run many SQL statements quickly against the server.

    Sorry for this senseless idea... :hehe:

    Greets

    Flo

  • [font="Verdana"]Another idea...

    Part of SQLCMD is the ability to create scripting variables for the environment (rather than T-SQL variables). You could use this to build a command-line script that you give the database name, and it runs your statements within SQLCMD (see the sketch after the list below).

    So then you'd have something like:

    1. Copy out the table of commands to a script file

    2. Change to the correct database

    3. Run the script file
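
    A sketch of what that might look like (the file name create_dev_user.sql, the server name MyServer and the database name SomeDatabase are made up for illustration; $(DatabaseName) is a SQLCMD scripting variable passed in with -v):

    -- create_dev_user.sql
    USE [$(DatabaseName)];
    GO
    IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N'development')
    BEGIN
        CREATE USER [development] FOR LOGIN [development];
        EXECUTE sp_addrolemember 'db_owner', 'development';
    END
    GO

    Then, from the command line, one call per database:

    sqlcmd -S MyServer -E -v DatabaseName="SomeDatabase" -i create_dev_user.sql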


  • RBarryYoung (3/25/2009)


    I do these kinds of scripts (a lot) and I always concatenate them all together using either the Variable method or the XML method.

    The Variable method is more efficient at the low end; however, because it is inherently O(n²), the XML solution will eventually overtake it, since it is only O(n). And of course that means that it scales perfectly. So your upper limit there would be the size limit of VARCHAR(MAX) or NVARCHAR(MAX).

    Hi Barry

    Thanks for your feedback. You are right, I think I will lean more towards the FOR XML functionality. Our SQL Server 2000 instances are becoming fewer and fewer, so this is getting more and more feasible.

    Greets

    Flo

  • Jeffrey Williams (3/25/2009)


    Flo, with all due respect I think you are working too hard to avoid using a cursor here. Administrative tasks such as creating users in a set of databases are prime examples where cursors are the best solution.

    Sorry Jeffrey, I am going to have to disagree here. I definitely do not think that Cursors are the best solution here. Why? Because:

    1) Cursors are not necessary. The Dynamic Concat method works just fine.

    2) The Dynamic Concat method is faster.

    3) It is not that much harder, once you learn the technique.

    4) The Big Reason: Set An Unimpeachable Example.

    What does this last one mean? Just that it is important when trying to encourage good development practices to set a good example, even if it may not seem necessary or is inconvenient. This is especially true when trying to get developers to stop using explicit cursors and loops, because developers feel drawn to use cursors and loops the way that a heroin addict needs a fix. In these circumstances, it is not enough to just be consistent with a set of rules, it is necessary to be unimpeachable on the matter so that no one can say behind your back "well (s)he does it sometimes, therefore so can we". You do not want them to ever have that excuse.

    [font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
    Proactive Performance Solutions, Inc.
    [/font]
    [font="Verdana"] "Performance is our middle name."[/font]

  • Bruce W Cassidy (3/25/2009)


    [font="Verdana"]Flo, I've actually dynamically built stored procedures and then executed them in the past for this kind of thing. So you'd end up with code that did the following:

    1. build string to create stored procedure

    2. create the stored procedure (dynamic SQL)

    3. call the stored procedure (can insert the results into a table to track errors and so on).

    4. drop the stored procedure

    Could you use a similar approach here?

    On the other hand, I'm not sure how much value you get from avoiding using cursors in this scenario.


    Hi Bruce!

    Thank you as well for your feedback! Great idea! I will definitely consider this for future dynamic SQL solutions!

    Greets

    Flo
