• To make a decision you must answer some questions: first, are there dependencies between the rows in that big chunk of data?

    Second, what do you want to do if an insert fails? Roll back all the inserts, or keep going and raise some sort of alert/log entry?

    Third, are there performance constraints? Can the batch run overnight, or must the data be persisted ASAP?

    I don't think a big blob/string is a good option, because you will need to parse/deserialize it on the SQL side, and in general putting that kind of logic in a stored procedure (or similar) is bad for performance, reliability, and maintainability.

    Look into transaction control and, possibly, staging tables and integration strategies.

    You could end up with something like:

    procedure insert raw data

    1- begin transaction

    2- loop(insert in stage table)

    3- commit tran
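    The first procedure can be sketched like this. This is just an illustration in Python with sqlite3 standing in for your RDBMS; the table name `stage`, its columns, and the sample rows are all hypothetical.

```python
import sqlite3

# Hypothetical staging table; in practice this lives in your real database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE stage (id INTEGER, payload TEXT, processed INTEGER DEFAULT 0)"
)

raw_rows = [(1, "alpha"), (2, "beta"), (3, "gamma")]  # the "big chunk" of raw data

# One transaction around the whole bulk load: either all rows land, or none do.
try:
    with conn:  # begin transaction; commits on success, rolls back on exception
        conn.executemany(
            "INSERT INTO stage (id, payload) VALUES (?, ?)", raw_rows
        )
except sqlite3.Error:
    # The context manager already rolled back; alert/log here as needed.
    raise

print(conn.execute("SELECT COUNT(*) FROM stage").fetchone()[0])  # 3
```

    The point of the single transaction is that a half-loaded staging table never becomes visible to the processing step.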

    procedure process stage table

    1- select unprocessed rows from the stage table

    2- loop over them

    3- begin transaction

    4- process and persist data

    5- if ok commit, else rollback

    6- log results for that "row"

    7- end loop
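    And the second procedure, processing the staged rows one at a time with a transaction per row, could look like this sketch. Again sqlite3 stands in for your RDBMS, and the `process` rule, table names, and sample data are assumptions for the illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stage  (id INTEGER, payload TEXT, processed INTEGER DEFAULT 0);
CREATE TABLE target (id INTEGER, payload TEXT);
CREATE TABLE log    (stage_id INTEGER, status TEXT);
INSERT INTO stage (id, payload) VALUES (1, 'ok'), (2, 'bad'), (3, 'ok');
""")

def process(payload):
    # Hypothetical business rule: reject rows whose payload is 'bad'.
    if payload == "bad":
        raise ValueError("invalid payload")
    return payload.upper()

# 1- select unprocessed rows from the stage table, 2- loop over them
for row_id, payload in conn.execute(
        "SELECT id, payload FROM stage WHERE processed = 0").fetchall():
    try:
        # 3/4/5- one transaction per row: commit if ok, else rollback
        with conn:
            conn.execute("INSERT INTO target (id, payload) VALUES (?, ?)",
                         (row_id, process(payload)))
            conn.execute("UPDATE stage SET processed = 1 WHERE id = ?",
                         (row_id,))
            conn.execute("INSERT INTO log VALUES (?, 'ok')", (row_id,))
    except Exception:
        # 6- the row's transaction was rolled back; log the failure so the
        # row can be inspected or retried later.
        conn.execute("INSERT INTO log VALUES (?, 'failed')", (row_id,))
        conn.commit()

print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 2
```

    With a per-row transaction, one bad row does not abort the whole batch: rows 1 and 3 are persisted, row 2 stays unprocessed in the staging table with a failure logged against it.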

    But of course, that depends on your requirements.