If these rows make up your entire table, use TRUNCATE TABLE and your log will not grow.
TRUNCATE will not work if your table is referenced by foreign keys (i.e. it is a parent table for others).
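If the whole table goes, the cleanup can be as simple as this (the table name is just an illustration):

```sql
-- TRUNCATE deallocates the data pages and is minimally logged,
-- so the transaction log barely grows.
-- It fails if any foreign key references the table.
TRUNCATE TABLE dbo.MyBigTable;  -- dbo.MyBigTable is a hypothetical name
```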
Otherwise you may want to switch to the SIMPLE recovery model and delete in batches. Your log file(s) will not grow if you keep each transaction small enough to fit into the current log size.
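Switching the recovery model around the purge might look like this; a rough sketch, assuming a database called MyDb (hypothetical name), and note that switching out of FULL breaks your log backup chain:

```sql
-- Minimize log growth during the purge (this breaks the log backup chain!)
ALTER DATABASE MyDb SET RECOVERY SIMPLE;

-- ... run the batched delete here ...

-- Switch back and restart the backup chain afterwards.
ALTER DATABASE MyDb SET RECOVERY FULL;
BACKUP DATABASE MyDb TO DISK = N'C:\Backup\MyDb.bak';
```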
Declare @BatchSize int
Set @BatchSize = 5000 -- Modify as needed !!!
Set nocount on
Declare @RowsDeleted bigint
Declare @MyRowcount bigint
Set @RowsDeleted = 0
While 0 = 0
Begin
    Delete top ( @BatchSize )
    From  -- your table here
    Where -- your criteria here
    Set @MyRowcount = @@rowcount
    If @MyRowcount = 0 break
    Set @RowsDeleted = @RowsDeleted + @MyRowcount
    -- just to see it advancing: print progress every 100000 rows
    -- % = modulo
    If 0 = @RowsDeleted % 100000 print convert(varchar(26), @RowsDeleted) + ' - ' + convert(varchar(26), getdate(), 121)
End
Print '[' + cast(@RowsDeleted as varchar(15)) + '] Rows deleted.'
Since size does matter, did you consider partitioning this table?
That way you may be able to just drop a partition of data (sliding window).
Check BOL.
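The sliding-window purge is essentially a metadata operation; a minimal sketch, assuming a partitioned table and an empty staging table with the same structure on the same filegroup (all names hypothetical):

```sql
-- Move the oldest partition out of the big table; SWITCH is metadata-only,
-- so it completes almost instantly regardless of row count.
ALTER TABLE dbo.MyBigTable
    SWITCH PARTITION 1 TO dbo.MyBigTable_Staging;

-- Now get rid of the rows without touching the main table.
TRUNCATE TABLE dbo.MyBigTable_Staging;
```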
Johan
Learn to play, play to learn !
Dont drive faster than your guardian angel can fly ...
but keeping both feet on the ground wont get you anywhere :w00t: