I have a table with about 20 million rows of data that I want to truncate or delete. I don't need any of the data in it anymore at all. I also need to do this on disk-space-challenged servers. Each row consists of 71 characters, divided 35, 35, and 1 across the included columns. Would it be best to:
- DELETE in batches (don't think so myself)
- TRUNCATE the table (concerned about how many pages get logged in this scenario)
- Possibly SCRIPT, DROP, and RECREATE the offending table?
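For reference, here is a rough sketch of what the three options look like in T-SQL. The table name `dbo.BigTable` and the batch size are hypothetical placeholders, not anything from my actual environment:

```sql
-- Option 1: DELETE in batches (every row is fully logged, so this
-- generates the most transaction log activity of the three).
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.BigTable;
    IF @@ROWCOUNT = 0 BREAK;
END

-- Option 2: TRUNCATE (logs only the page/extent deallocations,
-- not individual rows, so log impact is minimal).
TRUNCATE TABLE dbo.BigTable;

-- Option 3: script out the table definition, then drop and recreate.
DROP TABLE dbo.BigTable;
-- ...then re-run the scripted CREATE TABLE (and any indexes/permissions).
```

My understanding is that options 2 and 3 are both minimally logged; the batched DELETE only makes sense if I needed to keep some rows or avoid a schema lock, neither of which applies here.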
The table itself is not foreign-keyed anywhere, nor does it participate in an indexed view. It is not replicated anywhere, although it might be at some point. And yes, I know I need to get after the devs to create a process that limits growth here going forward -- but that is not within my purview to dictate. I can merely "suggest" and hope, which sucks, but I digress.
I am an accidental, but long-time and fairly capable, DBA here where I work. In other words, I ask other experts before I leap on faith alone, lol...
What say you experts out there?