Automatically Gathering Database Size
This article continues the series on Automatically Gathering Server Information.
I was recently asked by a colleague how to gather certain information about a SQL Server on a periodic basis. He needed to report on the data as well as store it for analysis over time. This article addresses one of those data items: dynamically finding the size of each database on a server.
In order to get the size of each database dynamically, you have to query the SQL Server internal information that is stored in the master database. In the past, many SQL developers (myself included) queried the system tables directly. In this case, you would query master..sysdatabases to get the database names and sizes. There is, however, one problem with this method. When SQL Server v7.0 was released, some of the system table definitions changed from v6.5, and as a result many queries that I had written stopped working. It was my own fault, since the SQL Server documentation warns against using the system tables in production code, and I was forced to modify various stored procedures to work with the new table definitions.
If you are not supposed to query system tables directly, then what can you do? Beginning with v7.0, Microsoft provides Information Schema views of system information that you can develop against and that will be maintained in future versions. These views are called Information Schema views because the "owner" of these views is "Information_Schema". One example is the "Tables" view, which provides a list of the tables and views in any database. If you run the following query:
SELECT * FROM INFORMATION_SCHEMA.TABLES
You will get a list of tables and views and a few other pieces of information. This is useful if you need to work with a list of tables in code. So we should just run the following query to get the sizes of our databases:
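For example, the TABLE_TYPE column in this view lets you separate tables from views, so a query like the following (a small sketch of how you might use the view in code) returns only the base tables:

```sql
-- List only base tables, excluding views
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_NAME
```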
SELECT * FROM master..INFORMATION_SCHEMA.DATABASES
Well, try it, and if you get the sizes back, email me and I will rewrite this article. When I tried, I found, much to my surprise, that there was no "databases" view. So are we stuck with querying the sysdatabases table? I have already said that this is not a good idea and is not recommended by Microsoft. (And I hope my backup software doesn't use this method.)
Never fear, there is another method. After a little digging, I produced the following code:
CREATE TABLE #mydbs (
    dbname  char(50),
    size    char(20),
    dbowner char(50),
    dbid    int,
    crdate  datetime,
    status  char(100)
)

INSERT #mydbs
EXEC sp_helpdb

SELECT * FROM #mydbs

/* */

DROP TABLE #mydbs
Why do I create a table? And then why drop it? And why is there an empty comment? I need this information, right? Actually this is a great technique for handling result sets in SQL Server v7.x and below. In SQL Server 2000 we get a table datatype and can actually pass result sets in and out of stored procedures, but for now, this is the best method I have found for handling a result set inside of a stored procedure.
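As a rough illustration of the SQL Server 2000 table datatype mentioned above (the variable and column names here are my own, not from the original batch):

```sql
-- A table variable lives in memory and needs no DROP;
-- it goes out of scope at the end of the batch
DECLARE @dbsizes table (
    dbname sysname,
    size   varchar(20)
)

INSERT @dbsizes (dbname, size) VALUES ('master', '10 MB')

SELECT * FROM @dbsizes
```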
Since a variable in T-SQL (v7.x and below) can hold only a single scalar value, I am creating a temporary table to hold the result set of the stored procedure I call. This INSERT ... EXEC technique is not used much in code I have seen, but it is perfectly valid. It is another form of the INSERT command that works like any other insert, except that the table schema must match the result set of the stored procedure.
When I started writing this article, I first ran the stored procedure and then created the table definition using the results. If you run this batch on your server, you should see the same results you would get by running sp_helpdb directly.
So what do you do with these results?
Anything you want! The section of my batch that contains the comment marks is where the processing of the result set would go. You could write a query that returns a report on the database sizes. Or you could insert this data into a permanent table and track the space growth trend of each database over time. There are a variety of ways you can use this data and I will address one in a future article.
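For example, one way to track growth over time is to copy each run's results into a permanent table stamped with a capture date. This is only a sketch; the table and column names are illustrative, not from the original article:

```sql
-- Permanent history table (create once)
CREATE TABLE DbSizeHistory (
    capture_date datetime NOT NULL DEFAULT (getdate()),
    dbname       char(50),
    size         char(20)
)

-- Placed where the empty comment sits in the batch,
-- after INSERT #mydbs EXEC sp_helpdb:
INSERT DbSizeHistory (dbname, size)
SELECT dbname, size
FROM #mydbs
```

Run the batch on a schedule and you can report on each database's size trend by querying DbSizeHistory grouped by dbname.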
As always, please send me an email with comments, questions, or suggestions. I look forward to hearing how others use this technique.
© dkRanch.net October 2000