February 23, 2010 at 10:23 am
Also not what I was expecting.
We download data from a legacy system, and we constantly find garbage like a date of 06/31/1997, a time like 08:75 am, Social Security Numbers with the wrong number of characters, or blanks in fields that are supposed to be required. That was the sort of garbage I was thinking of, not deletion of all the rows in a bunch of tables.
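To give an idea of the kind of checks I mean, something like this, just a rough sketch. The table dbo.ImportStaging and its columns are made-up names standing in for our staging data, and the time check assumes the hh:mm am/pm format shown above:

SELECT *
FROM dbo.ImportStaging
WHERE ISDATE(DateOfBirth) = 0                             -- e.g. 06/31/1997
   OR ApptTime NOT LIKE '[0-1][0-9]:[0-5][0-9] [ap]m'     -- e.g. 08:75 am
   OR SSN IS NULL OR LEN(REPLACE(SSN, '-', '')) <> 9      -- SSN missing or wrong length
   OR LTRIM(RTRIM(ISNULL(LastName, ''))) = '';            -- blank "required" field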
A better title might have been "Automating Deletion From Multiple Tables".
I have to delete all the data in the tables before starting a load from my other system, and it's a pain with 20 or 30 tables that are slowly changing as we continue development.
I agree with Paul that DELETE really chews up your log, and my test server isn't that big, so I prefer to use TRUNCATE.
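To make that concrete (dbo.StagingTable is just a made-up name standing in for one of our tables):

DELETE FROM dbo.StagingTable;    -- logs every deleted row, which is what chews up the transaction log
TRUNCATE TABLE dbo.StagingTable; -- deallocates the data pages instead, minimally logged and much faster
-- note: TRUNCATE fails if the table is referenced by a foreign key, hence the constraint juggling below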
I tried keeping a "TruncateAll" script with all the constraints that needed to be dropped, the tables to truncate, and the constraints to be re-added, but someone was always adding a table or constraint and forgetting to put it in the script.
Your solution is interesting. I will look at it further and see if it can be used to automatically TRUNCATE tables.
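Thinking out loud, something along these lines might be a starting point. It's only a rough sketch built on the sys.* catalog views: it generates the statements as a result set to be reviewed and run by hand, and the matching ALTER TABLE ... ADD CONSTRAINT statements would still need to be scripted out (e.g. from SSMS) before the drops are executed:

-- 1) Drop every foreign key so TRUNCATE is allowed
SELECT 'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(fk.parent_object_id))
     + '.' + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
     + ' DROP CONSTRAINT ' + QUOTENAME(fk.name) + ';'
FROM sys.foreign_keys AS fk;

-- 2) Truncate every user table (add a WHERE clause to exclude any tables to keep)
SELECT 'TRUNCATE TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id))
     + '.' + QUOTENAME(t.name) + ';'
FROM sys.tables AS t
WHERE t.is_ms_shipped = 0;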
March 4, 2010 at 10:22 am
EXEC stp_CleanGarbageData 'Table_1, Table_2, Table_3, Table_n'
Just a minor comment: since you are identifying the tables that you want to keep, I think a better name for the stored procedure would be stp_ExcludeFromGarbageDataCleanup.