Also not what I was expecting.
We download data from a legacy system and we constantly find garbage like a date of 06/31/1997, a time of 08:75 am, Social Security Numbers with the wrong number of digits, or blanks in fields that are supposed to be required. That was the sort of garbage I was thinking of, not deletion of all the rows in a bunch of tables.
A better title might have been "Automating Deletion From Multiple Tables".
I have to delete all the data in the tables before starting a load from my other system, and it's a pain with 20 or 30 tables that are slowly changing as we continue development.
I agree with Paul that DELETE really chews up your log, and my test server isn't that big, so I prefer to use TRUNCATE.
I tried keeping a "TruncateAll" script with all the constraints that needed to be dropped, the tables to truncate, and the constraints to re-add, but someone was always adding a table or constraint and forgetting to put it in the script.
Your solution is interesting. I will look at it further and see if it can be used to automatically TRUNCATE tables.
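For what it's worth, one way to avoid maintaining a hand-written "TruncateAll" script is to generate the TRUNCATE statements from the catalog views at run time, so newly added tables are picked up automatically. This is just a sketch of that idea, not your exact solution: it builds one statement per user table from sys.tables. Note that TRUNCATE TABLE fails on any table referenced by a foreign key (even a disabled one), so the FK drop/re-create steps would still have to be generated separately from sys.foreign_keys; that part is omitted here.

```sql
-- Sketch: build TRUNCATE statements for every user table in the database.
-- Assumes all foreign keys referencing these tables have already been
-- dropped (TRUNCATE is refused on FK-referenced tables, even disabled ones).
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'TRUNCATE TABLE '
             + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name)
             + N';' + CHAR(13)
FROM sys.tables AS t
JOIN sys.schemas AS s
  ON s.schema_id = t.schema_id
WHERE t.is_ms_shipped = 0;   -- skip system-shipped tables

-- PRINT @sql;               -- inspect the generated script first
EXEC sys.sp_executesql @sql;
```

Since it reads the table list fresh each run, nobody has to remember to update the script when a new table is added during development.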