Script to enable Auto-growth

  • I need to generate a script to enable autogrowth on multiple databases on a SQL Server 2005 instance. It needs to be a mass update instead of manually updating all 3K+ databases one at a time. Does anyone have a script for this?

  • I don't know of any way to alter multiple databases in one statement, but you might consider creating a simple cursor on sysdatabases, generating dynamic SQL to do the alter, and then executing it. It won't be fast, but it should work.

    Here it is, basically. You just have to figure out how you want to set the filegrowth. The EXEC for the ALTER is commented out, so the script just prints the statements until you're ready to run them.

    DECLARE DBCur CURSOR KEYSET FOR
    SELECT name FROM master.dbo.sysdatabases
    WHERE name NOT IN ('master','msdb','model','tempdb')

    DECLARE @dbname varchar(255)
    DECLARE @filename varchar(255)
    DECLARE @sql nvarchar(4000)

    OPEN DBCur
    FETCH NEXT FROM DBCur INTO @dbname
    WHILE (@@fetch_status <> -1)
    BEGIN
        IF (@@fetch_status <> -2)
        BEGIN
            -- The inner cursor has to be declared in dynamic SQL so it can read
            -- the current database's sysfiles. Cursors are global by default,
            -- so FileCur is still available after sp_executesql returns.
            SET @sql = 'DECLARE FileCur CURSOR ' +
                       'KEYSET ' +
                       'FOR SELECT name FROM ['+@dbname+'].dbo.sysfiles ' +
                       'WHERE filename NOT LIKE ''%.ldf''' -- data files only
            EXEC sp_executesql @sql

            OPEN FileCur
            FETCH NEXT FROM FileCur INTO @filename
            WHILE (@@fetch_status <> -1)
            BEGIN
                IF (@@fetch_status <> -2)
                BEGIN
                    SET @sql = 'ALTER DATABASE ['+@dbname+'] MODIFY FILE ( ' +
                               'NAME = N'''+@filename+''', FILEGROWTH = 103424KB )'
                    PRINT @sql
                    -- EXEC sp_executesql @sql -- uncomment once the output looks right
                END
                FETCH NEXT FROM FileCur INTO @filename
            END
            CLOSE FileCur
            DEALLOCATE FileCur
        END
        FETCH NEXT FROM DBCur INTO @dbname
    END
    CLOSE DBCur
    DEALLOCATE DBCur
    GO

    Kenneth Fisher
    I was once offered a wizard's hat but it got in the way of my dunce cap.
    For better, quicker answers on T-SQL questions: http://www.sqlservercentral.com/articles/Best+Practices/61537/
    For better answers on performance questions: http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
    Blog: www.SQLStudies.com

  • An alternative could be to use sp_msforeachdb
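
    For example, a rough untested sketch that just generates the statements (sp_msforeachdb replaces every ? with the current database name; note how quickly the nested quoting piles up):

    -- Untested sketch: prints an ALTER for every file in every user database
    EXEC sp_MSforeachdb 'IF ''?'' NOT IN (''master'',''model'',''msdb'',''tempdb'')
        SELECT ''ALTER DATABASE [?] MODIFY FILE ( NAME = N'''''' + name + '''''', FILEGROWTH = 103424KB )''
        FROM [?].dbo.sysfiles'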

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • I had thought of using sp_msforeachdb, but for each database you still have to loop through all of the data files (and/or log files, depending on the requirements) in case there are multiples. I'm fairly sure there is a rather sharp limit on the command size with sp_msforeachdb, and I don't think you could fit all of that in. Not to mention how complicated the dynamic SQL would end up being. This method, while somewhat slow and clumsy, seemed to me the better solution.

    Kenneth


  • Thanks for the replies everyone!

    SSCrazy,

    I actually need to enable autogrowth for both the data and log files of each database. The max size would be 150MB for data and 60MB for log, and the growth would be 1MB. I see that your script only updates the data file. I think you are right; the cursor would be faster.

  • Tarsha Shannon (11/2/2011)


    Thanks for the replies everyone!

    SSCrazy,

    I actually need to enable autogrowth for both the data and log files of each database. The max size would be 150MB for data and 60MB for log, and the growth would be 1MB. I see that your script only updates the data file. I think you are right; the cursor would be faster.

    Simple enough changes. Just modify the dynamic SQL that creates FileCur so it includes the log files (remove the WHERE clause) and also fetches the rightmost three characters of the physical file name. Then, when you generate the ALTER DATABASE statement, check those three characters and use a max size of either 150MB (data) or 60MB (log), with a growth of 1MB.
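
    For example, a rough sketch (untested; it assumes a new @physname variable, fetched from sysfiles alongside the logical name, holding the physical file name):

    -- @physname is assumed to hold the physical file name (sysfiles.filename)
    IF RIGHT(@physname, 3) = 'ldf'
        -- log file: 60MB cap
        SET @sql = 'ALTER DATABASE ['+@dbname+'] MODIFY FILE ( ' +
                   'NAME = N'''+@filename+''', MAXSIZE = 60MB, FILEGROWTH = 1MB )'
    ELSE
        -- data file: 150MB cap
        SET @sql = 'ALTER DATABASE ['+@dbname+'] MODIFY FILE ( ' +
                   'NAME = N'''+@filename+''', MAXSIZE = 150MB, FILEGROWTH = 1MB )'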

    An easy way to get an example of the ALTER DATABASE statement itself is to set up the changes you want for a single database in the GUI and hit the Script button at the top of the window. Then you can use that example to adjust the dynamic SQL to suit.

    Kenneth


  • Kenneth Fisher-475792 (11/2/2011)


    I had thought of using sp_msforeachdb, but for each database you still have to loop through all of the data files (and/or log files, depending on the requirements) in case there are multiples. I'm fairly sure there is a rather sharp limit on the command size with sp_msforeachdb, and I don't think you could fit all of that in. Not to mention how complicated the dynamic SQL would end up being. This method, while somewhat slow and clumsy, seemed to me the better solution.

    Kenneth

    Good points. Yes, there is a limit. And since this should be a one-off, I don't think the apparent slowness would be too terribly bad. Certainly better than doing it manually for 3,000 databases.


  • I forgot to add that some databases on the system already have autogrowth enabled. I am executing the query below to determine which ones don't. Is there a way to incorporate this condition into the script? Thanks.

    SELECT [name] FROM master.dbo.sysaltfiles WHERE growth = 0

  • Believe it or not, sysaltfiles will simplify things. Here is the code with just a single loop, using a join between sys.databases and sysaltfiles. Note I'm still setting the filegrowth to a single value and not setting the max size at all; that should be easy enough for you to update. I did remove the data-files-only restriction, however.

    DECLARE DBCur CURSOR KEYSET FOR
    -- One row per file (data and log) that currently has autogrowth disabled
    SELECT db.name, altfiles.name
    FROM master.dbo.sysaltfiles altfiles
    JOIN master.sys.databases db
        ON db.database_id = altfiles.dbid
    WHERE altfiles.growth = 0
      AND db.name NOT IN ('master','msdb','model','tempdb')

    DECLARE @dbname varchar(255)
    DECLARE @filename varchar(255)
    DECLARE @sql nvarchar(4000)

    OPEN DBCur
    FETCH NEXT FROM DBCur INTO @dbname, @filename
    WHILE (@@fetch_status <> -1)
    BEGIN
        IF (@@fetch_status <> -2)
        BEGIN
            SET @sql = 'ALTER DATABASE ['+@dbname+'] MODIFY FILE ( ' +
                       'NAME = N'''+@filename+''', FILEGROWTH = 103424KB )'
            PRINT @sql
            -- EXEC sp_executesql @sql -- uncomment once the output looks right
        END
        FETCH NEXT FROM DBCur INTO @dbname, @filename
    END
    CLOSE DBCur
    DEALLOCATE DBCur
    GO


  • Thanks SSCrazy!

  • Tarsha Shannon (11/2/2011)


    Thanks for the replies everyone!

    SSCrazy,

    I actually need to enable autogrowth for both the data and log files of each database. The max size would be 150MB for data and 60MB for log, and the growth would be 1MB. I see that your script only updates the data file. I think you are right; the cursor would be faster.

    I would question the autogrowth size of 1 MB. Autogrowth should not replace the DBA making appropriate increases on a database every 3 to 6 months. Multiple increases in database or log file size during working hours can cause performance issues as well as lead to heavily fragmented mdf/ldf files on disk.

  • Lynn Pettis (11/3/2011)


    Tarsha Shannon (11/2/2011)


    Thanks for the replies everyone!

    SSCrazy,

    I actually need to enable autogrowth for both the data and log files of each database. The max size would be 150MB for data and 60MB for log, and the growth would be 1MB. I see that your script only updates the data file. I think you are right; the cursor would be faster.

    I would question the autogrowth size of 1 MB. Autogrowth should not replace the DBA making appropriate increases on a database every 3 to 6 months. Multiple increases in database or log file size during working hours can cause performance issues as well as lead to heavily fragmented mdf/ldf files on disk.

    I agree that the autogrowth size of 1 MB is too small. I tend to be a proponent of autogrowth, though, particularly in this case where the OP is working with thousands of databases. Keeping an eye on, much less manually growing, that many databases could take a huge amount of time.

    Now, you may be able to set up an automated process to run at night and do the growth then. But I would still keep autogrowth on in case of emergency. You might consider increasing it to somewhere between 5 and 10 MB (given a max of 150/60MB).
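
    As a rough illustration (untested; sysaltfiles sizes are in 8 KB pages, and the maxsize > 0 filter skips files with unlimited or no growth), the nightly job could generate a grow statement for any capped file that is nearly full:

    -- Print a 5 MB grow for any capped file that is at least 90% full
    SELECT 'ALTER DATABASE [' + db.name + '] MODIFY FILE ( ' +
           'NAME = N''' + altfiles.name + ''', ' +
           'SIZE = ' + CAST(altfiles.size * 8 / 1024 + 5 AS varchar(10)) + 'MB )'
    FROM master.dbo.sysaltfiles altfiles
    JOIN master.sys.databases db
        ON db.database_id = altfiles.dbid
    WHERE altfiles.maxsize > 0
      AND altfiles.size >= altfiles.maxsize * 0.9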

    Kenneth


  • Kenneth Fisher-475792 (11/3/2011)


    Lynn Pettis (11/3/2011)


    Tarsha Shannon (11/2/2011)


    Thanks for the replies everyone!

    SSCrazy,

    I actually need to enable autogrowth for both the data and log files of each database. The max size would be 150MB for data and 60MB for log, and the growth would be 1MB. I see that your script only updates the data file. I think you are right; the cursor would be faster.

    I would question the autogrowth size of 1 MB. Autogrowth should not replace the DBA making appropriate increases on a database every 3 to 6 months. Multiple increases in database or log file size during working hours can cause performance issues as well as lead to heavily fragmented mdf/ldf files on disk.

    I agree that the autogrowth size of 1 MB is too small. I tend to be a proponent of autogrowth, though, particularly in this case where the OP is working with thousands of databases. Keeping an eye on, much less manually growing, that many databases could take a huge amount of time.

    Now, you may be able to set up an automated process to run at night and do the growth then. But I would still keep autogrowth on in case of emergency. You might consider increasing it to somewhere between 5 and 10 MB (given a max of 150/60MB).

    Kenneth

    I'm not against autogrowth, just that it shouldn't be used to manage database size on its own. It is useful when there is unexpected growth in a database.
