OK, I am not sure I am on board with the example as written. If you know you are going to be cleaning up the database regularly, I would personally cluster on the EventClosedDate field to ensure fast removal by date, and make the primary key (the identity column) nonclustered. This is for multiple reasons.
1) I don't see anything saying EventID 2 couldn't end months after EventIDs 3-2000. Since your TOP doesn't take that into account, you could in theory delete rows that are still valid under your own rules.
2) If your purpose is to remove rows from a table based on a particular column, then that column must, as a rule, be referenced in your query; otherwise you may miss rows or delete more rows than expected.
3) You are using CEILING in your loop calculation, so if the result is 880.000000001 you will run 881 loops. In that 881st loop you will wipe out data whose date value is >= your 6-day-old cutoff, because there is no sanity check. I suggest FLOOR instead: you will have a few rows left hanging, but in the business world it is better to trim too little than too much. Refer back to 1.
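As a sketch of what I mean (table and column names dbo.Events, EventID, and EventClosedDate are assumptions, as is the batch size), drive the loop off the date predicate itself rather than a precomputed CEILING loop count, so no iteration can ever overshoot the cutoff:

```sql
-- Sketch only: dbo.Events / EventClosedDate / batch size are assumed names.
DECLARE @Cutoff datetime = DATEADD(DAY, -6, GETDATE());
DECLARE @BatchSize int = 5000;

WHILE 1 = 1
BEGIN
    DELETE TOP (@BatchSize)
    FROM dbo.Events
    WHERE EventClosedDate < @Cutoff;  -- sanity check: only rows older than the cutoff qualify

    IF @@ROWCOUNT = 0 BREAK;          -- stop when nothing qualifies; no loop count needed
END
```

Because every batch is qualified by the WHERE clause, a miscounted loop total (CEILING vs FLOOR) can never delete rows newer than the cutoff; the worst case is simply one extra iteration that deletes nothing.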
4) I would only use your method after altering the indexes, so I can use EntryDate as my qualifier and trim exactly what I need.
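The index change from my opening point might look something like this sketch (constraint and index names, and the assumption that the PK is currently clustered on EventID, are mine, not from the article):

```sql
-- Sketch only: PK_Events / IX_Events_EventClosedDate are assumed names.
ALTER TABLE dbo.Events DROP CONSTRAINT PK_Events;         -- drop the existing clustered PK

CREATE CLUSTERED INDEX IX_Events_EventClosedDate          -- cluster on the purge column
    ON dbo.Events (EventClosedDate);

ALTER TABLE dbo.Events                                    -- re-add the PK as nonclustered
    ADD CONSTRAINT PK_Events PRIMARY KEY NONCLUSTERED (EventID);
```

With the date column clustered, each batched delete touches one contiguous range at the front of the clustered index instead of pages scattered across the table, which is what makes the date-qualified trim fast.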
BTW, the title is a bit misleading: breaking large operations into smaller transactions has always been known as the best-performing way to reduce contention.