• You're welcome.

    But it's still going to waste too much.

    The size of the backups is only part of the issue.

    Ok.

    See it for yourself.

    Go to your local/DEV instance of SQL Server.

    Create 2 empty databases.

    In one of them, create a table with your current structure: 3 columns + a JSON column.

    In the other, create a table with the same 3 columns, but make the 4th one a varchar(300) to hold the path\name of the JSON files.
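    A minimal sketch of the two layouts, using in-memory SQLite databases as a quick local stand-in for the two SQL Server databases. The column names are invented for illustration, and on SQL Server the JSON column would typically be NVARCHAR(MAX):

```python
import sqlite3

db1 = sqlite3.connect(":memory:")  # database 1: JSON stored in-row
db1.execute("""
    CREATE TABLE messages (
        msg_id    INTEGER PRIMARY KEY,
        received  TEXT NOT NULL,
        msg_type  TEXT NOT NULL,
        body_json TEXT NOT NULL        -- the full JSON message
    )""")

db2 = sqlite3.connect(":memory:")  # database 2: only a link to the file
db2.execute("""
    CREATE TABLE messages (
        msg_id    INTEGER PRIMARY KEY,
        received  TEXT NOT NULL,
        msg_type  TEXT NOT NULL,
        body_path VARCHAR(300) NOT NULL  -- path and name of the JSON file
    )""")
```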

    Extract 1,000 (or maybe just 100 for the initial test) JSON files from your actual database and store them locally.

    Now, write 2 scripts to emulate the BizTalk behaviour.

    One to populate database 1 with the JSONs, as your BizTalk does now.

    Another one to copy the JSON files into a folder and save the links in the 4th column of the table.

    Start your timer and launch the scripts one after another.

    You may wish to monitor the server's performance metrics while the scripts are running: memory usage, PLE, etc.

    After it's finished, compare the size of database 1 with the total storage taken by database 2 plus the JSON files on disk.
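    The two load scripts and the storage comparison can be sketched like this, again with SQLite and generated messages standing in for SQL Server and the real extracted JSONs (the table is trimmed to two columns for brevity; timings on toy data are only indicative of the code paths, not of real performance):

```python
import json, os, sqlite3, tempfile, time

work = tempfile.mkdtemp()
json_dir = os.path.join(work, "json")
os.makedirs(json_dir)

# Stand-ins for the JSON files extracted from the real database.
messages = [json.dumps({"id": i, "payload": "x" * 500}) for i in range(100)]

# Script 1: store each JSON in-row, as BizTalk does now.
db1_path = os.path.join(work, "db1.sqlite")
db1 = sqlite3.connect(db1_path)
db1.execute("CREATE TABLE messages (msg_id INTEGER PRIMARY KEY, body_json TEXT)")
t0 = time.perf_counter()
db1.executemany("INSERT INTO messages VALUES (?, ?)", enumerate(messages))
db1.commit()
t_inline = time.perf_counter() - t0

# Script 2: copy each JSON into a folder, store only the link.
db2_path = os.path.join(work, "db2.sqlite")
db2 = sqlite3.connect(db2_path)
db2.execute("CREATE TABLE messages (msg_id INTEGER PRIMARY KEY, body_path VARCHAR(300))")
t0 = time.perf_counter()
links = []
for i, body in enumerate(messages):
    path = os.path.join(json_dir, f"{i}.json")
    with open(path, "w") as f:
        f.write(body)
    links.append((i, path))
db2.executemany("INSERT INTO messages VALUES (?, ?)", links)
db2.commit()
t_linked = time.perf_counter() - t0

# Compare database 1 against database 2 plus the JSON files on disk.
size_db1 = os.path.getsize(db1_path)
size_db2 = os.path.getsize(db2_path) + sum(
    os.path.getsize(os.path.join(json_dir, f)) for f in os.listdir(json_dir))
print(f"in-row: {t_inline:.4f}s, {size_db1} bytes")
print(f"linked: {t_linked:.4f}s, {size_db2} bytes")
```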

    After that you may wish to try zipping the JSON messages.

    See how long it takes to zip the messages stored in the table compared to reading the links from the DB and zipping the files in the folder.
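    The shape of the two zip paths, under the same SQLite stand-in assumptions as above:

```python
import json, os, sqlite3, tempfile, time, zipfile

work = tempfile.mkdtemp()
bodies = [json.dumps({"id": i, "payload": "x" * 500}) for i in range(100)]

# Store 1: JSON in-row.
db1 = sqlite3.connect(":memory:")
db1.execute("CREATE TABLE messages (msg_id INTEGER PRIMARY KEY, body_json TEXT)")
db1.executemany("INSERT INTO messages VALUES (?, ?)", enumerate(bodies))

# Store 2: JSON files in a folder, links in the table.
db2 = sqlite3.connect(":memory:")
db2.execute("CREATE TABLE messages (msg_id INTEGER PRIMARY KEY, body_path VARCHAR(300))")
for i, body in enumerate(bodies):
    path = os.path.join(work, f"{i}.json")
    with open(path, "w") as f:
        f.write(body)
    db2.execute("INSERT INTO messages VALUES (?, ?)", (i, path))

# Zip path A: read the messages out of the table, write them into the archive.
t0 = time.perf_counter()
zip_a = os.path.join(work, "from_table.zip")
with zipfile.ZipFile(zip_a, "w", zipfile.ZIP_DEFLATED) as z:
    for msg_id, body in db1.execute("SELECT msg_id, body_json FROM messages"):
        z.writestr(f"{msg_id}.json", body)
t_table = time.perf_counter() - t0

# Zip path B: read the links, add the files straight from the folder.
t0 = time.perf_counter()
zip_b = os.path.join(work, "from_files.zip")
with zipfile.ZipFile(zip_b, "w", zipfile.ZIP_DEFLATED) as z:
    for (path,) in db2.execute("SELECT body_path FROM messages"):
        z.write(path, arcname=os.path.basename(path))
t_files = time.perf_counter() - t0

print(f"zip from table: {t_table:.4f}s, zip from files: {t_files:.4f}s")
```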

    At the end, imitate a cleanup process.

    Delete the half of the JSONs which are "expired" from both storages. In the second storage you're going to need to delete both the links and the "expired" files they point to.
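    A sketch of that cleanup step under the same stand-in assumptions. Expiry is faked here as "even msg_id"; the real test would filter on a date column:

```python
import json, os, sqlite3, tempfile

work = tempfile.mkdtemp()
bodies = [json.dumps({"id": i}) for i in range(100)]

db1 = sqlite3.connect(":memory:")
db1.execute("CREATE TABLE messages (msg_id INTEGER PRIMARY KEY, body_json TEXT)")
db1.executemany("INSERT INTO messages VALUES (?, ?)", enumerate(bodies))

db2 = sqlite3.connect(":memory:")
db2.execute("CREATE TABLE messages (msg_id INTEGER PRIMARY KEY, body_path VARCHAR(300))")
for i, body in enumerate(bodies):
    path = os.path.join(work, f"{i}.json")
    with open(path, "w") as f:
        f.write(body)
    db2.execute("INSERT INTO messages VALUES (?, ?)", (i, path))

# Store 1: a single DELETE does it all.
db1.execute("DELETE FROM messages WHERE msg_id % 2 = 0")

# Store 2: fetch the expired links first, delete the files, then the rows.
expired = [p for (p,) in
           db2.execute("SELECT body_path FROM messages WHERE msg_id % 2 = 0")]
for p in expired:
    os.remove(p)
db2.execute("DELETE FROM messages WHERE msg_id % 2 = 0")
```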

    You must already have most of the infrastructure in place, so putting together such a test should not take long.

    But it should give you a lot of good points to discuss with your BizTalk team.

    _____________
    Code for TallyGenerator