
Reduce size of BAK file after deleting database table records?
Posted Tuesday, December 04, 2012 5:35 AM


SSC Rookie


Group: General Forum Members
Last Login: Wednesday, February 19, 2014 11:02 AM
Points: 28, Visits: 153
Hi All,

Firstly, I wanted to wish everyone a Happy Christmas, and I hope someone may be able to help me with my issue. I have a DB with several tables, one of which receives a constant stream of data from meters (1-minute data). The DB and the backups (.bak) are now getting very large. We decided to create a job that deletes several thousand records every hour to gradually reduce the file size and (we thought) the size of the backups (.bak), but unfortunately the backups are still growing.

I have looked into using SHRINKDATABASE and OPTIMIZE TABLE <table name>, but to be honest I am unsure what they do and whether they will help. If anyone has any ideas, or could post a link to another post I could read, it would be really helpful. Thanks in advance.

Kind Regards,
Craig



Specifications:

- Windows Server 2008 R2 Standard (SP1)
- SQL Server 2008 R2
Post #1392396
Posted Tuesday, December 04, 2012 5:46 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 8:32 AM
Points: 5,956, Visits: 12,838
Can you post the table definitions for the tables you are deleting from?

-----------------------------------------------------------------------------------------------------------

"Ya can't make an omelette without breaking just a few eggs"
Post #1392399
Posted Tuesday, December 04, 2012 6:02 AM


SSC Rookie


Group: General Forum Members
Last Login: Wednesday, February 19, 2014 11:02 AM
Points: 28, Visits: 153
Hi Perry,

Thanks for the quick reply. I have attached an Excel sheet with the table definitions.

Hope it's OK.


  Post Attachments 
TableDefinitions.xlsx (15 views, 10.60 KB)
Post #1392408
Posted Tuesday, December 04, 2012 7:17 PM


SSC-Addicted


Group: General Forum Members
Last Login: Tuesday, November 05, 2013 7:44 PM
Points: 446, Visits: 1,314
Database backups are roughly the size of your data, not the size of your database files (assuming no compression). If the .BAK file is still getting bigger, it's either because the amount of data is growing (despite the deletes) or possibly (but unlikely) because the backup is doing an append. If this is something like an electricity utility getting meter readings every minute, the inserts could be adding more records than your delete process is removing. Consider using partitioned tables to split the data a bit; then you can just drop or switch out one of the partitions instead of doing an expensive and slow delete.
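
To illustrate the partition idea, a minimal sketch (table and staging-table names are hypothetical, and this assumes the table is already partitioned by date with an empty staging table of identical structure on the same filegroup):

```sql
-- Hypothetical sketch: dbo.MeterReadings is partitioned by month on ReadingDate.
-- SWITCH moves the whole partition out as a metadata-only operation,
-- then truncating the staging table is far cheaper than a row-by-row DELETE.
ALTER TABLE dbo.MeterReadings
    SWITCH PARTITION 1 TO dbo.MeterReadings_Staging;

TRUNCATE TABLE dbo.MeterReadings_Staging;
```

The key design point is that SWITCH and TRUNCATE are minimally logged, whereas an hourly DELETE of thousands of rows generates log activity and still leaves the space allocated to the table.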

Leo
Post #1392765
Posted Wednesday, December 05, 2012 3:21 AM


SSC Rookie


Group: General Forum Members
Last Login: Wednesday, February 19, 2014 11:02 AM
Points: 28, Visits: 153
Good Morning Leo,

Thanks for the reply. When taking the backups, in the overwrite media section, I have selected "Back up to the existing media set" and the sub-selection "Append to the existing backup set". Could this be the problem? I was worried that if I chose to "Overwrite all existing backup sets" and the latest backup was corrupted, then all would be lost.

If you could give me some advice on what you think an ideal configuration would be, I would love to test it in my test environment. I really do appreciate the help.

Thanks again,
Craig
Post #1392879
Posted Wednesday, December 05, 2012 3:29 AM


SSCrazy


Group: General Forum Members
Last Login: Friday, March 14, 2014 2:19 AM
Points: 2,820, Visits: 3,916
The scripts below can help you see and analyze which tables and indexes are taking the most space, so you can easily check where the space is actually being consumed.

For table sizes: http://www.develop-one.net/blog/2011/06/20/SQLServerGettingTableSizeForAllTables.aspx

And for indexes:

SELECT
    i.name AS IndexName,
    SUM(s.used_page_count) * 8 AS IndexSizeKB
FROM sys.dm_db_partition_stats AS s
JOIN sys.indexes AS i
    ON s.[object_id] = i.[object_id] AND s.index_id = i.index_id
WHERE s.[object_id] = OBJECT_ID('dbo.TableName')
GROUP BY i.name
ORDER BY i.name;

SELECT
    i.name AS IndexName,
    SUM(s.page_count) * 8 AS IndexSizeKB
FROM sys.dm_db_index_physical_stats(
    DB_ID(), OBJECT_ID('dbo.TableName'), NULL, NULL, 'DETAILED') AS s
JOIN sys.indexes AS i
    ON s.[object_id] = i.[object_id] AND s.index_id = i.index_id
GROUP BY i.name
ORDER BY i.name;
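
In case the linked page becomes unavailable, a similar per-table reserved-space query can be sketched from the same DMV (a sketch, not the original script; index_id 0 or 1 restricts it to the heap or clustered index, i.e. the base table):

```sql
-- Approximate reserved space and row count per table, largest first.
SELECT
    OBJECT_SCHEMA_NAME(s.[object_id]) + '.' + OBJECT_NAME(s.[object_id]) AS TableName,
    SUM(s.reserved_page_count) * 8 AS ReservedKB,
    SUM(s.row_count) AS [RowCount]
FROM sys.dm_db_partition_stats AS s
WHERE s.index_id IN (0, 1)  -- heap or clustered index only
GROUP BY s.[object_id]
ORDER BY ReservedKB DESC;
```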



-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done
Post #1392882
Posted Wednesday, December 05, 2012 4:42 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 8:32 AM
Points: 5,956, Visits: 12,838
craig.dixon (12/5/2012)
Good Morning Leo,

Thanks for the reply When taking the backups, in the overwrite media section, I have selected to "Back up to the existing media set" and the sub-selection "Append to the existing backup set". Could this be the problem? I was worried that if I chose to "overwrite all existing backup sets" and the latest backup was corrupted then all would be lost

If you could give me some advice on what you think an ideal configuration would be I would love to test it on my test environment and I really do appreciate the help.

Thanks again,
Craig

If you're constantly appending to the backups, then that is why the file continuously grows. Try implementing an intelligent backup script that creates each backup as a new file, using details such as server name, database name, backup type, and a date-time stamp. So for instance:

Mysqlserver_Mydb_Full_20121205_113000.bak

Mysqlserver_Mydb_Tlog_20121205_020000.trn
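
A minimal sketch of such a script (path and database name are illustrative; WITH INIT writes from the start of the new file instead of appending, and CHECKSUM gives some corruption detection):

```sql
-- Sketch: build a date-stamped file name so every full backup lands in its own file.
DECLARE @file nvarchar(260) =
    N'D:\Backups\' + @@SERVERNAME + N'_MyDb_Full_'
    + CONVERT(nvarchar(8), GETDATE(), 112)                  -- yyyymmdd
    + N'_' + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), ':', '')  -- hhmmss
    + N'.bak';

BACKUP DATABASE MyDb
    TO DISK = @file
    WITH INIT, CHECKSUM;
```

Keeping each backup in its own file also addresses the corruption worry: an older file is never touched by a newer backup, so you only delete old files once you have verified a newer one (e.g. with RESTORE VERIFYONLY).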


-----------------------------------------------------------------------------------------------------------

"Ya can't make an omelette without breaking just a few eggs"
Post #1392917