• baabhu (4/17/2013)


    ChrisM@Work (4/15/2013)


    That looks ok to me, other than the joins to the temp table not being optimized. You don't really need the temp table anyway:

    DECLARE @RowsDeleted INT = 1000; -- @filterDate is assumed to be declared and set earlier

    WHILE @RowsDeleted > 0
    BEGIN
        -- Batch 1: rows at or before the cutoff date
        BEGIN TRAN;
        DELETE TOP (1000) s
        FROM [Experimental].[dbo].[SampleData] s
        WHERE xyzmasterCode BETWEEN '0' AND 'Z'
        AND filterDate <= @filterDate;
        SET @RowsDeleted = @@ROWCOUNT;
        COMMIT TRAN;
        CHECKPOINT;

        -- Batch 2: same key range, no date filter
        BEGIN TRAN;
        DELETE TOP (1000) s
        FROM [Experimental].[dbo].[SampleData] s
        WHERE xyzmasterCode BETWEEN '0' AND 'Z';
        SET @RowsDeleted = @RowsDeleted + @@ROWCOUNT;
        COMMIT TRAN;
        CHECKPOINT;
    END;

    You could run it as one transaction spanning both deletes, or as two as shown; I'd try both.
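    If you want to try the single-transaction flavour, a minimal sketch would look something like this (same table, filters and batch size as above; @filterDate assumed declared earlier):

    DECLARE @RowsDeleted INT = 1000;

    WHILE @RowsDeleted > 0
    BEGIN
        BEGIN TRAN;
        -- Both batched deletes commit together, so each iteration holds its locks a little longer
        DELETE TOP (1000) s
        FROM [Experimental].[dbo].[SampleData] s
        WHERE xyzmasterCode BETWEEN '0' AND 'Z'
        AND filterDate <= @filterDate;
        SET @RowsDeleted = @@ROWCOUNT;

        DELETE TOP (1000) s
        FROM [Experimental].[dbo].[SampleData] s
        WHERE xyzmasterCode BETWEEN '0' AND 'Z';
        SET @RowsDeleted = @RowsDeleted + @@ROWCOUNT;
        COMMIT TRAN;
        CHECKPOINT;
    END;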

    Add a WAITFOR DELAY statement after the commit. This will help to reduce the blocking.

    Can you explain what you mean by this, baabhu? I can understand why a delay might be useful - removing obsolete log data isn't instantaneous - but I don't understand what this has to do with blocking. Cheers.
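    For reference, I'm guessing you mean something like this after each commit (the five-second delay is only a placeholder value):

    COMMIT TRAN;
    CHECKPOINT;
    WAITFOR DELAY '00:00:05'; -- pause between batches; duration is an assumed value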


    [font="Arial"]Low-hanging fruit picker and defender of the moggies[/font]
