• ManicStar - Tuesday, February 13, 2018 9:31 AM

    Eirikur Eiriksson - Tuesday, February 13, 2018 9:25 AM

    ManicStar - Tuesday, February 13, 2018 9:07 AM

    Any ideas how to efficiently move a lot of rows from a table in the OLTP database to an archive on another server when that table contains a varchar(max) column?  My SSIS package just spins and spins on these, even after setting Rows Per Batch and Maximum Insert Commit Size fairly low to keep the batches small.

    The archive database is set to the simple recovery model; would a bulk load be better?

    Is BCP worth looking at?
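
    Roughly what I had in mind for bcp, if it's viable (server, database, table, and path names below are all placeholders):

        rem Export from the OLTP server in SQL Server native format
        bcp SourceDb.dbo.CallNotes out C:\temp\CallNotes.dat -S OltpServer -T -n

        rem Import into the archive server, committing a batch every 5000 rows
        bcp ArchiveDb.dbo.CallNotes in C:\temp\CallNotes.dat -S ArchiveServer -T -n -b 5000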

    What is the actual size of the data (datalength)?
    😎

    Close to a meg for the largest one.  They are comments and call notes, so some of them are very long.
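
    I checked with a query along these lines (table and column names are placeholders):

        -- Rough size profile of the varchar(max) column
        SELECT  MAX(DATALENGTH(Notes))                           AS MaxBytes,
                AVG(DATALENGTH(Notes))                           AS AvgBytes,
                SUM(CAST(DATALENGTH(Notes) AS bigint)) / 1048576 AS TotalMB
        FROM    dbo.CallNotes;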

    Not certain why the SSIS package isn't working properly; those are not large chunks of data. What volume (number of entries) do you need to transfer, and over what time frame?
    One option is to offload the content into a local staging table before the transfer, if the source table is busy or cluttered with lock escalation etc.
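    A minimal sketch of that staging approach, assuming an ID-keyed table and a date-based archive cutoff (all object names are placeholders):

        -- Copy in small batches into a local staging table so each
        -- transaction stays short and lock escalation is avoided.
        DECLARE @BatchSize int = 5000,
                @Rows      int = 1;

        WHILE @Rows > 0
        BEGIN
            INSERT INTO dbo.CallNotes_Staging (CallNoteID, CreatedOn, Notes)
            SELECT TOP (@BatchSize) s.CallNoteID, s.CreatedOn, s.Notes
            FROM dbo.CallNotes AS s
            WHERE s.CreatedOn < DATEADD(YEAR, -2, GETDATE())
              AND NOT EXISTS (SELECT 1
                              FROM dbo.CallNotes_Staging AS t
                              WHERE t.CallNoteID = s.CallNoteID);

            SET @Rows = @@ROWCOUNT;
        END;

    From the staging table the rows can then be pushed to the archive server with SSIS, bcp, or an insert over a linked server, without touching the busy OLTP table again.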
    😎