I guess that you also don't want to change your archiving mechanism too much.
If you want to reduce the number of rows processed per execution, you need to reduce the number of days to process in each archive step.
One possibility is to create a loop for each day between the last archive run and the current value of @MaxDate, and archive/delete records between those two dates. This would effectively do your archiving on a day-by-day basis and allow you to catch up for those databases that haven't archived anything yet.
It might take some time to catch up, and you might want to put some time restrictions in so that it only runs at a quiet time of the day (or night). It will take more days to complete the archive, but it would have a smaller impact on current users.
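The day-by-day loop with a quiet-hours guard could be sketched roughly like this. Everything here is illustrative: the table `dbo.Orders`, its `OrderDate` column, the archive destination, the 90-day retention and the 23:00-07:00 quiet window are all assumptions standing in for your actual schema and policy. Only SQL 2000-compatible constructs are used.

```sql
-- Sketch only: hypothetical table/column names, assumed retention and quiet hours.
DECLARE @currentDate datetime, @MaxDate datetime

-- Oldest unarchived day (midnight-truncated) and the cutoff, e.g. keep 90 days.
SELECT @currentDate = DATEADD(dd, DATEDIFF(dd, 0, MIN(OrderDate)), 0)
FROM dbo.Orders
SET @MaxDate = DATEADD(dd, DATEDIFF(dd, 0, GETDATE()) - 90, 0)

WHILE @currentDate < @MaxDate
BEGIN
    -- Quiet-hours guard: stop as soon as the busy period starts;
    -- the next scheduled run picks up where this one left off.
    IF DATEPART(hh, GETDATE()) BETWEEN 7 AND 22
        BREAK

    BEGIN TRANSACTION

    -- Copy one day's rows to the archive, then remove them from the source.
    INSERT INTO ArchiveDB.dbo.Orders
    SELECT * FROM dbo.Orders
    WHERE OrderDate >= @currentDate
      AND OrderDate < DATEADD(dd, 1, @currentDate)

    DELETE FROM dbo.Orders
    WHERE OrderDate >= @currentDate
      AND OrderDate < DATEADD(dd, 1, @currentDate)

    COMMIT TRANSACTION

    SET @currentDate = DATEADD(dd, 1, @currentDate)
END
```

Keeping each day's copy-and-delete in its own short transaction limits log growth and lock duration, which is the point of shrinking the batch size in the first place.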
It's not a perfect solution but might give you some ideas.
Hi,

That's actually the way it's currently set up. We take the minimum date and compare it to @MaxDate, then run a WHILE loop that contains an inner WHILE over the list of tables to archive. When all tables have been processed for a given day, we increment @currentDate by 1 until @currentDate = @MaxDate.

I'm open to changing the way this works completely if there's something better out there, though. I'm not sure I know of one right now, which is why I was asking. Since it also needs to work on 2000 or 2005, that further restricts my choices to 2000-compatible methods.

Greg
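For readers following along, the nested-loop shape described above might look something like this. The driving table `dbo.ArchiveTableList`, the common `RecordDate` column, and the use of dynamic SQL via `sp_executesql` are all assumptions for illustration, not the poster's actual code; the MIN/greater-than pattern avoids cursors and works on SQL 2000.

```sql
-- Sketch only: outer loop over days, inner loop over a hypothetical table list.
DECLARE @currentDate datetime, @MaxDate datetime,
        @tableName sysname, @sql nvarchar(4000)

-- Assume @currentDate and @MaxDate are initialized as in the procedure above.
WHILE @currentDate < @MaxDate
BEGIN
    -- Walk the list of tables without a cursor.
    SELECT @tableName = MIN(TableName) FROM dbo.ArchiveTableList
    WHILE @tableName IS NOT NULL
    BEGIN
        -- Archive/delete one day's rows from this table (delete shown;
        -- the INSERT into the archive database would precede it).
        SET @sql = N'DELETE FROM dbo.' + QUOTENAME(@tableName) +
                   N' WHERE RecordDate >= @d'
                 + N' AND RecordDate < DATEADD(dd, 1, @d)'
        EXEC sp_executesql @sql, N'@d datetime', @d = @currentDate

        SELECT @tableName = MIN(TableName)
        FROM dbo.ArchiveTableList
        WHERE TableName > @tableName
    END

    SET @currentDate = DATEADD(dd, 1, @currentDate)
END
```

Parameterizing the date through `sp_executesql` rather than concatenating it into the string keeps the dynamic SQL plan-cache-friendly and avoids date-format surprises.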
What's the actual problem with implementing your archiving strategy as it is? It presumably works once it's up and running, so is there an actual problem, or are you just planning ahead?