Why are my SSIS data flow steps using a CURSOR to transfer data?

  • We have some SSIS packages that were upgraded from DTS. I do not know how this was done, as they were created by an analyst who no longer works here. The server is SQL Server 2008 R2.

    I started testing the first package. All it does is transfer data from source tables in one database to destination tables in another database (both dbs are on the same server).

    What I found is that it was taking a really long time. When I ran a trace, I discovered that it was inserting data ONE ROW AT A TIME using a cursor. At least, it was for some tables. For others I see "insert bulk" statements, which is what I would have expected for ALL the tables. Obviously the "insert bulk" tables load very quickly, while the ones that use a cursor take forever.

    Looking at the package, I cannot see any difference in the settings between the data flow tasks that get done in bulk and the tasks that get done by cursor. In all cases the source is just a table or query and the destination is a table. The "DefaultBufferMaxRows" is set to 10000 in all cases.

    Does anyone know what could cause this?

  • In the data flow task, bulk insert is enabled by the properties of the destination component. An OLEDB destination should use a data access mode of "Table or view - fast load". An ADO.Net destination has a checkbox for "Use Bulk Insert where possible".

    An SSIS data flow moves data through memory buffers full of row objects, but it should still be capable of using bulk insert to send them to the server. If the package has a lot of simple data flow tasks that are just Source -> Destination, you might try rewriting them as Execute SQL tasks with INSERT/SELECT statements instead (see the sketch below).
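
    As a minimal sketch of that approach (database, table, and column names here are hypothetical placeholders, not taken from the package), the Execute SQL task for two databases on the same instance could run something like:

        -- Hypothetical rewrite of a simple Source -> Destination data flow
        -- as one set-based statement run from an Execute SQL task.
        -- All names are placeholders.
        INSERT INTO DestDB.dbo.DestTable (Col1, Col2, Col3)
        SELECT Col1, Col2, Col3
        FROM   SourceDB.dbo.SourceTable;

    Because both databases are on the same instance, three-part names are enough and SQL Server performs the whole transfer internally.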

  • I just reread your original post. For data transfer between databases on the same server, the Execute SQL task with INSERT/SELECT should be faster than an SSIS data flow because SQL Server controls the entire process. If the SSIS package is running on a different server, INSERT/SELECT will be much faster because the data doesn't have to make a round trip over the network.

  • Scott Coleman (8/19/2014)


    In the data flow task, bulk insert is enabled by the properties of the destination component. An OLEDB destination should use a data access mode of "Table or view - fast load". An ADO.Net destination has a checkbox for "Use Bulk Insert where possible".

    I think this is it - I found that some are set to "open rowset" and some are set to "open rowset using fastload".

    Thanks, I'll try changing them and see if it works.

    Edit: Yup, that was it. Went from two hours to two minutes. Thanks again!

  • Scott Coleman (8/19/2014)


    I just reread your original post. For data transfer between databases on the same server, the Execute SQL task with INSERT/SELECT should be faster than an SSIS data flow because SQL Server controls the entire process. If the SSIS package is running on a different server, INSERT/SELECT will be much faster because the data doesn't have to make a round trip over the network.

    But you can still get some advantages from the batch inserts that SSIS uses. It will depend on the volume of data you're moving. Remember that one huge insert needs a bigger log, while multiple smaller inserts (under the simple recovery model) can reuse the log (see the sketch below).

    I'd say that it depends. 🙂
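
    As a rough sketch of that batching idea (all names and the batch size are hypothetical), a large INSERT/SELECT can be broken into smaller chunks so the log space can be reused between batches under the simple recovery model:

        -- Hypothetical sketch: load in batches so the transaction log can be
        -- reused between batches under the SIMPLE recovery model.
        -- Names and batch size are placeholders; assumes Id is a unique key.
        DECLARE @BatchSize int = 100000;

        WHILE 1 = 1
        BEGIN
            INSERT INTO DestDB.dbo.DestTable (Id, Col1, Col2)
            SELECT TOP (@BatchSize) s.Id, s.Col1, s.Col2
            FROM   SourceDB.dbo.SourceTable AS s
            WHERE  NOT EXISTS (SELECT 1
                               FROM   DestDB.dbo.DestTable AS d
                               WHERE  d.Id = s.Id)
            ORDER BY s.Id;

            IF @@ROWCOUNT < @BatchSize
                BREAK;   -- last (partial) batch loaded
        END;

    In an SSIS fast-load destination, the "Rows per batch" and "Maximum insert commit size" settings serve a similar purpose.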

