SQLServerCentral is supported by Red Gate Software Ltd.
 
Deleting Large Number of Records
Posted Saturday, February 19, 2011 6:45 AM
Mr or Mrs. 500


Group: General Forum Members
Last Login: Monday, April 14, 2014 10:11 PM
Points: 594, Visits: 1,650
Great discussion and helpful solutions.
1) Many of us are a little too touchy about critiques of our code.
2) Tell your management how much disk space you need to run SQL Server and that it's a cost of doing business. Otherwise, tell them to use a pencil and notepad to keep track of data. So many posts are about shrinking files, "ballooning" log files, etc. Just buy the danged disk space and suck it up.

3) I don't know if this is a trend, but in our shop backups were turned over to the systems team using Commvault. Any backups done outside of that product could break the log chain, which naturally ties my hands. Also, any change in recovery model will cause Commvault to react according to its programming, often triggering an immediate full backup.

4) You "could" go to simple recovery on a production database, but if the company asks you to restore to a point in time and you can't do it because you stopped transaction log backups, you might be looking for another job.
I've actually considered changing recovery models during our weekend maintenance because Commvault can't keep up: its log backups now occasionally finish many hours apart rather than at the scheduled 15-minute interval. But I'm hoping this will give me the ammunition I need to establish weeknight maintenance windows so it's not all done on the weekend.
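To make the trade-off in point 4 concrete, here is a minimal T-SQL sketch of the recovery-model switch being discussed. The database name and backup path are hypothetical; the point is that SIMPLE recovery lets the log truncate on checkpoint during the purge, but it also ends the log chain, so point-in-time restore is lost until a new full (or differential) backup is taken.

```sql
-- Hypothetical database name; adjust to your environment.
ALTER DATABASE MyProdDb SET RECOVERY SIMPLE;

-- ... run the large delete here; log records are truncated
-- at checkpoint instead of accumulating ...

ALTER DATABASE MyProdDb SET RECOVERY FULL;

-- The log chain is now broken: log backups (and point-in-time
-- restore) cannot resume until a new full backup is taken.
BACKUP DATABASE MyProdDb TO DISK = N'X:\Backups\MyProdDb_full.bak';
```

In a shop where a third-party product like Commvault owns the backups, that final full backup is exactly the "immediate full backup" reaction described above, which is why the switch is rarely worth it on a production database.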






Post #1066777
Posted Sunday, February 20, 2011 4:07 AM
Ten Centuries


Group: General Forum Members
Last Login: Tuesday, April 08, 2014 6:23 AM
Points: 1,082, Visits: 4,887
A great article. My only concern is that our production database is replicated to two other servers and log shipped to a third, so you couldn't use this process as-is without breaking the log chain. But I think I can use the general gist of the process to delete only X rows at a time, by putting the delete code into a SQL Agent job and setting it to run every, say, 10 minutes. For example, to delete a million rows, you could have a job that deletes 10,000 rows and run it 100 times.
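The batched delete described above can be sketched in T-SQL as follows. The table name (dbo.BigTable) and purge criterion (CreatedDate) are hypothetical stand-ins; each DELETE TOP (@BatchSize) commits as its own small transaction, so the log, replication, and log shipping only have to absorb one batch at a time.

```sql
DECLARE @BatchSize int = 10000;
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    -- Delete one small batch per iteration; each statement is
    -- its own transaction, keeping log growth bounded.
    DELETE TOP (@BatchSize)
    FROM dbo.BigTable                 -- hypothetical table
    WHERE CreatedDate < '20100101';   -- hypothetical purge filter

    SET @Rows = @@ROWCOUNT;           -- 0 rows deleted ends the loop
END;
```

For the SQL Agent variant, drop the WHILE loop and let the schedule supply the repetition: a job step that deletes one batch every 10 minutes removes a million rows in 100 runs without ever holding a large transaction open, and scheduled log backups between runs keep the log from growing.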
Post #1066843
Posted Friday, April 04, 2014 4:30 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, April 04, 2014 4:30 AM
Points: 6, Visits: 18

http://www.sqlperformance.com/2013/03/io-subsystem/chunk-deletes
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/52e6be98-6165-4b1e-a926-5d5609ab8486/delete-large-number-of-rows-without-growing-the-transaction-log
Post #1558431

