• Sergiy - Sunday, January 21, 2018 4:49 AM

    Is there any better way to archive data which won't use much disk space and resources?

    Yes, of course. First, you have to keep JSON and XML messages out of SQL Server. Placing a string into a table incurs a huge overhead compared to simply storing it in a file: an overhead measured not in percent, but in multiples.

    Second, store the JSON messages on a separate file server. This way you'll perform 2 I/O operations per message: 1 for writing, and 1 for reading when archiving. Repeated backups, transaction log writes, page splits, re-indexing, etc. will be out of the picture.

    Zip the files using an appropriate tool in combination with a script similar to the one posted by Mike, except you'll read the uncompressed messages not from a table but from a file folder. Which core of which CPU on which server to use is up to you to decide: whichever has the most free time.
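    A minimal sketch of that approach in Python, purely as an illustration: the share path, folder layout, and function names below are assumptions, not anything Sergiy or Mike posted. Each incoming message is written straight to a date-named folder on the file share, and a separate pass zips up folders older than a cutoff.

    import json
    import shutil
    import uuid
    from datetime import date, timedelta
    from pathlib import Path

    # Hypothetical share for the raw messages; adjust to your environment.
    MESSAGE_ROOT = Path(r"\\fileserver\json_messages")

    def store_message(payload: dict) -> Path:
        """Write one JSON message into a date-named folder instead of a table."""
        folder = MESSAGE_ROOT / date.today().isoformat()
        folder.mkdir(parents=True, exist_ok=True)
        target = folder / f"{uuid.uuid4()}.json"
        target.write_text(json.dumps(payload), encoding="utf-8")
        return target

    def archive_old_folders(keep_days: int = 30) -> None:
        """Zip whole day-folders older than the cutoff, then remove the originals."""
        cutoff = date.today() - timedelta(days=keep_days)
        for folder in MESSAGE_ROOT.iterdir():
            if folder.is_dir():
                try:
                    folder_date = date.fromisoformat(folder.name)
                except ValueError:
                    continue  # not a date-named folder; leave it alone
                if folder_date < cutoff:
                    shutil.make_archive(str(folder), "zip", root_dir=folder)
                    shutil.rmtree(folder)

    The zipping could just as well be done by 7-Zip or a similar tool from a scheduled task; the point is only that the messages never touch the OLTP tables, so backups, transaction log writes, and re-indexing stop paying for them.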

    Thanks Sergiy,

    What I understood from your words is: the table holding the JSON data should be moved out of the main OLTP database into files. We have 1000 messages a day, which means 1000 files per day (correct me if I am wrong). How do we achieve this? And how can we get the old data back in tabular form if it is ever required? (One way of reading the archives back is sketched after this post.)

    You didn't comment on the memory usage of XML versus INSERT...SELECT from the main OLTP table. If I am not wrong, both will take the same amount of buffer.
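    On the tabular-form question above, one possibility (again only a sketch, with hypothetical paths and function names, and assuming the one-zip-per-day layout from the earlier sketch) is to read a day's archive and flatten each JSON message back into rows; on the SQL Server side, OPENJSON (SQL Server 2016+) over the unzipped files would be an alternative.

    import csv
    import json
    import zipfile
    from pathlib import Path

    def restore_day_as_rows(archive: Path, out_csv: Path) -> int:
        """Read every .json message out of one day's zip and emit flat CSV rows.

        Assumes each message is a flat JSON object; nested payloads would
        need their own flattening rules.
        """
        rows = []
        with zipfile.ZipFile(archive) as zf:
            for name in zf.namelist():
                if name.endswith(".json"):
                    rows.append(json.loads(zf.read(name)))
        if not rows:
            return 0
        # Union of all keys, so messages with missing fields still line up.
        fieldnames = sorted({key for row in rows for key in row})
        with out_csv.open("w", newline="", encoding="utf-8") as fh:
            writer = csv.DictWriter(fh, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(rows)
        return len(rows)

    # e.g. restore_day_as_rows(Path(r"\\fileserver\json_messages\2018-01-20.zip"),
    #                          Path("restored.csv"))

    At 1000 messages a day the file counts stay manageable, and the data only comes back into a table (via the CSV, BULK INSERT, or OPENJSON) when it is actually asked for.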