December 30, 2008 at 7:20 am
There is a SQL Server 2005 instance that has been running for a while as the backend for a company intranet. The database is very modest and barely grows. The same server is now starting to be used by Reporting Services (2005). I am running a script to clear the database and then importing some extract files. The extract in total is just over 600MB and it grows by only about 5-10MB a day, and even that may be an overstatement. However, the database MDF file (specific to the DB the files are being imported into) seems to grow by over 1GB a day.
Is there a way to find out what may be going wrong? Is something caching or not clearing? Even when I clear the DB of records it seems to stay the same size.
December 30, 2008 at 7:46 am
You can run sp_spaceused on your tables to find out each table's size and which tables use more space than you thought they would. Also don't forget that what matters is how the data is stored in the table, not how much space it takes in your text files. For example, if you import a few records that have only 3 columns (one numeric and 2 small strings), but the table has a bigint column and 2 char(3500) columns, each record will use a whole page (8K) in the database, even though in the text file it could be stored using less than 100 bytes.
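As a minimal sketch of the above (the table names here are hypothetical; substitute your own):

```sql
-- Report space for one table; passing 'true' as the second argument
-- (@updateusage) refreshes the usage counters first - slower but more accurate.
EXEC sp_spaceused 'dbo.ImportedExtract', 'true';

-- Illustrating the fixed-width point: every row in this table reserves
-- the full declared char width regardless of how short the actual data is,
-- so one row approaches the 8K page limit.
CREATE TABLE dbo.WideDemo
(
    id bigint,
    c1 char(3500),
    c2 char(3500)
);
```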
Adi
--------------------------------------------------------------
To know how to ask questions and increase the chances of getting answers:
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
December 30, 2008 at 7:46 am
I do think a read of a good admin book would be useful; however, your MDF won't shrink even when empty unless you have autoshrink on, which I hope you don't. Certain operations, such as index rebuilds, will generally require a certain amount of working headroom in your MDF file. Is the database in simple recovery, and do you back it up? I'd probably run a manual CHECKPOINT and an sp_updatestats, that sort of thing, on a database used this way. I'd also probably fix the database size a bit larger in the first place to allow for working room.
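A quick sketch of the checks and maintenance mentioned above (the database name is hypothetical):

```sql
-- Confirm the recovery model; SIMPLE truncates the log at checkpoint.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'IntranetDB';

USE IntranetDB;

-- Flush dirty pages and refresh statistics on all user tables.
CHECKPOINT;
EXEC sp_updatestats;
```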
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
December 30, 2008 at 7:51 am
Are you keeping the extracts from the previous day? Possibly manipulating the extracts and moving them into another table, thus causing the growth?
Bear in mind that the space will not be released from the physical files unless you shrink the database.
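If you do decide to release unused space back to the OS, a hedged sketch (logical file name and target size are hypothetical; check sys.database_files for your actual file name):

```sql
USE IntranetDB;

-- List the logical file names for this database first.
SELECT name, type_desc, size FROM sys.database_files;

-- Shrink the data file to a 1024MB target, leaving headroom for imports.
DBCC SHRINKFILE (IntranetDB_Data, 1024);
```

Note that repeated shrink/grow cycles fragment the file, so shrinking is usually a one-off fix rather than routine maintenance.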
To help identify the cause, you could create a Profiler trace that includes the file auto-growth events and see what is running at the time, so that you can get a handle on it.
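Rather than setting up a new trace, you can also read the default trace, which is enabled out of the box on SQL Server 2005 and already captures auto-grow events. A sketch (event classes 92 and 93 are Data File Auto Grow and Log File Auto Grow):

```sql
SELECT te.name AS event_name,
       tg.DatabaseName,
       tg.FileName,
       tg.StartTime,
       tg.TextData
FROM sys.traces t
CROSS APPLY sys.fn_trace_gettable(t.path, DEFAULT) tg
JOIN sys.trace_events te
    ON tg.EventClass = te.trace_event_id
WHERE t.is_default = 1
  AND tg.EventClass IN (92, 93)   -- data/log file auto grow
ORDER BY tg.StartTime DESC;
```

This shows which files grew and when, which you can line up with your import schedule.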
If you want to check on the size of objects and how much space each uses, along with free space, you can execute the following:
--passing 'true' updates the usage information for every table, which is a great deal slower
--change the 'true' value to 'false' to skip this and get much faster, though possibly less accurate, results
create table #spaceused
(
    name       nvarchar(128),
    rows       varchar(11),
    reserved   varchar(18),
    data       varchar(18),
    index_size varchar(18),
    unused     varchar(18)
)

exec sp_msforeachtable 'insert into #spaceused exec sp_spaceused ''?'', ''true'''

select * from #spaceused