• Re: I will have to place the query in a SQL job and run that weekly once, please help me out.

    With 2.2 billion records in your table, why would you think running a delete job [once a week] - using a cursor, of all things - was a good idea? I'm not trying to rip into you, but as the DBA for what sounds like a very active production environment, you have to make some basic decisions about managing your data based on the business need to retain two weeks of it. I'm glad you posted your question in the forum to ask for help, but you should also consider the possibility that your constraint to run this purge weekly was a bit misguided to begin with.

    Several of the replies took that "weekly" constraint as fixed and provided some solid solutions (nod to the respondents), but the simple fact that you want to run this massive purge once a week suggests to me that you need to evaluate the situation closely before settling on a schedule.

    I suggest reviewing the reason for retaining two weeks of data. Is it because Marketing wants a rolling two weeks? Accounting? If so, identify the granularity of the data they need. Do they run a report once a day or every hour? Do they need summary or detail data to report on? Maybe you can "pre-run" that summarization with results going to a reporting table.

    In my humble opinion, if this situation were my own, I would run this purge hourly - possibly right after "the report" completes. The transaction log would "self-level" to the size it needs to handle each delete and would only grow to fit the most active hour of the day.

    If that's not an option, I suggest you invest in more disk space and extend your transaction log file so the delete can complete successfully. In that case I'd choose one of the prior suggestions that limit the number of records processed per statement, or run the delete against the record date/time in intervals that cap the number of records in each "batch".
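    To make the batching idea concrete, here's a rough sketch of what I mean (table name dbo.EventLog, column EventDate, and the batch size are placeholders - adjust for your schema, and make sure the date column is indexed or each batch will scan the table):

    ```sql
    -- Delete rows older than two weeks in small batches so each
    -- transaction, and therefore its transaction log usage, stays small.
    DECLARE @BatchSize int = 50000,
            @Cutoff    datetime = DATEADD(DAY, -14, GETDATE()),
            @Rows      int = 1;

    WHILE @Rows > 0
    BEGIN
        DELETE TOP (@BatchSize)
        FROM dbo.EventLog
        WHERE EventDate < @Cutoff;

        SET @Rows = @@ROWCOUNT;   -- loop ends when a batch deletes nothing

        -- In FULL recovery, take log backups between batches (or run the
        -- purge under SIMPLE recovery) so log space is reused, not grown.
    END
    ```

    Run hourly as I suggested above, each pass would only have an hour's worth of expired rows to chew through, which keeps the loop short and the log small.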