I’m after some advice.
I’m helping a colleague with a SQL Server 2016 database that is getting quite large, and they are considering an archiving strategy.
The database essentially stores data from a number of sensors on machines used in manufacturing. Readings are taken every 1, 5 and 10 seconds from a variety of sensors, so as you can imagine the tables storing this data are growing quite quickly.
The historical data needs to be retained in case one of the devices exhibits an anomaly in the field and an investigation is needed to determine whether there were issues in the manufacturing process that were not picked up at the time. The consequences of a failure can be serious.
So far, over several years, no major failures have occurred and the data has only been needed on a couple of occasions. Failures are considered a rare event.
If and when the data is required, it is not required immediately, so it can be stored off site and offline, then loaded into a separate database if engineering staff ever need to review it.
The current thinking is as follows:
1. Create a new, identically structured database on a separate storage system within the data centre and copy the historical data across to it (see the sketches after this list).
2. Verify that the copied data is valid.
3. Copy the newly created archive database to an off site storage area.
4. Delete the historical data from the production database.
5. Update statistics if the delete hasn’t already triggered an auto-update.
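For steps 1 and 2, this is roughly what we had in mind. All object names here (ArchiveDB, ProductionDB, SensorReadings, ReadingTime) and the cutoff date are placeholders, and the checksum comparison is only a sanity check, not proof of a byte-identical copy:

```sql
-- Step 1: copy historical rows into the identically structured archive database.
INSERT INTO ArchiveDB.dbo.SensorReadings WITH (TABLOCK)
SELECT *
FROM ProductionDB.dbo.SensorReadings
WHERE ReadingTime < '20200101';

-- Step 2: a basic validity check. The row count and aggregate checksum
-- should match between the two databases for the archived range.
-- CHECKSUM can collide, so treat this as a sanity check only.
SELECT COUNT_BIG(*) AS TotalRows,
       CHECKSUM_AGG(CHECKSUM(*)) AS TableChecksum
FROM ProductionDB.dbo.SensorReadings
WHERE ReadingTime < '20200101';

SELECT COUNT_BIG(*) AS TotalRows,
       CHECKSUM_AGG(CHECKSUM(*)) AS TableChecksum
FROM ArchiveDB.dbo.SensorReadings;
```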
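For step 3, rather than copying database files directly, we were thinking a verified backup might be safer (the UNC path is a placeholder):

```sql
-- Step 3: back up the archive database with checksums enabled so page
-- corruption is caught at backup time rather than years later when
-- the data is actually needed.
BACKUP DATABASE ArchiveDB
TO DISK = N'\\offsite-share\archive\ArchiveDB.bak'
WITH CHECKSUM, COMPRESSION, STATS = 10;

-- Confirm the backup file is readable and internally consistent
-- before anything is deleted from production.
RESTORE VERIFYONLY
FROM DISK = N'\\offsite-share\archive\ArchiveDB.bak'
WITH CHECKSUM;
```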
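And for steps 4 and 5, the idea is to delete in batches so we don’t sit on one huge transaction, then refresh statistics explicitly rather than waiting for an auto-update (batch size and names again just placeholders):

```sql
USE ProductionDB;

-- Step 4: delete the archived rows in batches so each transaction stays
-- small and the log can be reused between batches.
DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (50000)
    FROM dbo.SensorReadings
    WHERE ReadingTime < '20200101';

    SET @rows = @@ROWCOUNT;
END;

-- Step 5: the delete alone will not necessarily trigger an auto-update
-- of statistics, so refresh them on the trimmed table explicitly.
UPDATE STATISTICS dbo.SensorReadings WITH FULLSCAN;
```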
Is this a valid archive strategy?
Is there a better option?
Are there any gotchas with this option?
Thanks in anticipation.