• ManicStar - Tuesday, February 13, 2018 9:31 AM

    Eirikur Eiriksson - Tuesday, February 13, 2018 9:25 AM

    ManicStar - Tuesday, February 13, 2018 9:07 AM

    Any ideas on how to efficiently move a lot of rows from a table in the OLTP database to an archive on another server when that table contains a varchar(max) column? My SSIS package just spins and spins on these, even after setting Rows per batch and Maximum insert commit size fairly small to keep the batches small.

    The archive database is set to the simple recovery model; would a bulk load be better?

    Is BCP worth looking at?
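    (For reference, bcp can export and reload the table in native format, which handles varchar(max) without SSIS buffering. A minimal sketch, assuming hypothetical names — a table dbo.CallNotes, servers OltpServer and ArchiveServer — and trusted connections on both:)

    ```
    rem Export from the OLTP server in native format (-n preserves varchar(max))
    bcp OltpDb.dbo.CallNotes out callnotes.dat -S OltpServer -T -n

    rem Bulk load into the archive; -b commits in batches of 5000 rows
    bcp ArchiveDb.dbo.CallNotes in callnotes.dat -S ArchiveServer -T -n -b 5000
    ```

    With the archive database in simple recovery, the load can qualify for minimal logging.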

    What is the actual size of the data (datalength)?
    😎
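    (A quick way to check — a sketch assuming a hypothetical table dbo.CallNotes with the varchar(max) column Notes:)

    ```
    -- Size distribution of the LOB column, in bytes
    SELECT COUNT(*)               AS row_count,
           MAX(DATALENGTH(Notes)) AS max_bytes,
           AVG(DATALENGTH(Notes)) AS avg_bytes
    FROM dbo.CallNotes;
    ```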

    Close to a meg for the largest one.  They are comments and call notes, so some of them are very long.

    You could try inserting the rows with the varchar(max) column set to NULL first and then updating them with the actual data in a second pass. That way the initial insert completes quickly.
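    (A minimal sketch of that two-pass approach, assuming hypothetical tables dbo.CallNotes as the source and Archive.dbo.CallNotes as the destination, keyed on NoteId and reachable via a linked server or the same instance:)

    ```
    -- Pass 1: copy the narrow columns and leave the LOB column NULL, so the insert stays fast
    INSERT INTO Archive.dbo.CallNotes (NoteId, CustomerId, CreatedOn, Notes)
    SELECT NoteId, CustomerId, CreatedOn, NULL
    FROM dbo.CallNotes;

    -- Pass 2: backfill the varchar(max) column in small batches to keep log growth in check
    DECLARE @rows int = 1;
    WHILE @rows > 0
    BEGIN
        UPDATE TOP (1000) a
        SET    a.Notes = s.Notes
        FROM   Archive.dbo.CallNotes AS a
        JOIN   dbo.CallNotes         AS s ON s.NoteId = a.NoteId
        WHERE  a.Notes IS NULL AND s.Notes IS NOT NULL;

        SET @rows = @@ROWCOUNT;
    END;
    ```

    Batching the update also lets the simple-recovery log truncate between batches instead of growing to hold the whole backfill.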