Connect to Oracle and perform bulk insert

  • Hello,

    I have a regular SSIS package that uses Oracle as a source and inserts the data into a table. There are no transformations.

    The source now has 600 million rows, and I need to load them into my stage table.

    A regular insert is now taking hours, and I would like to do a bulk insert instead.

    The problem is that I have read that the Bulk Insert task can only connect to text files.

    Do you have any ideas?

    Thank you

  • Do you have SQL Enterprise or another edition?

    Is this an incremental or a full load? If incremental, how do you determine inserts/updates/deletes?

    And how exactly are you extracting the data: OLE DB, ADO.NET, or ODBC?

    And is your SQL Server destination set to use fast load or not? (Fast load equates to a bulk insert.)

    And are all indexes removed from the destination table before loading the data?

    And is the destination partitioned or not?

    All of the above can and will affect what you can do to improve performance.

  • Dear Frederico,

    Please see my replies below:

    Do you have SQL Enterprise or another edition?

    Not sure.

    Is this an incremental or a full load? If incremental, how do you determine inserts/updates/deletes?

    It's a full load. We truncate the stage and then insert the data from Oracle (a set of views) into our SQL Server, using an Oracle source and a SQL Server destination in the data flow.

    And how exactly are you extracting the data: OLE DB, ADO.NET, or ODBC?

    Not sure.

    And is your SQL Server destination set to use fast load or not? (Fast load equates to a bulk insert.)

    Not sure. How can I see that in the destination?

    And are all indexes removed from the destination table before loading the data?

    We have no indexes.

    And is the destination partitioned or not?

    Not partitioned.

    All of the above can and will affect what you can do to improve performance.

  • Is this option possible only if the source is a data file? Can the source be Oracle instead of a data file?

    Fast Load Options

    If the OLE DB destination uses a fast-load data access mode, you can specify the following fast load options in the user interface, OLE DB Destination Editor, for the destination:

    • Keep identity values from the imported data file or use unique values assigned by SQL Server.

    • Retain a null value during the bulk load operation.

    • Check constraints on the target table or view during the bulk import operation.

    • Acquire a table-level lock for the duration of the bulk load operation.

    • Specify the number of rows in the batch and the commit size.

    Some fast load options are stored in specific properties of the OLE DB destination. For example, FastLoadKeepIdentity specifies whether to keep identity values, FastLoadKeepNulls specifies whether to keep null values, and FastLoadMaxInsertCommitSize specifies the number of rows to commit as a batch. Other fast load options are stored in a comma-separated list in the FastLoadOptions property. If the OLE DB destination uses all the fast load options that are stored in FastLoadOptions and listed in the OLE DB Destination Editor dialog box, the value of the property is set to TABLOCK, CHECK_CONSTRAINTS, ROWS_PER_BATCH=1000. The value 1000 indicates that the destination is configured to use batches of 1000 rows.

  • That option has nothing to do with the source, only with the destination.
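
    As a rough T-SQL analogue of what fast load gives you (a sketch only: the table names below are hypothetical, and SSIS calls the bulk-load API rather than sending a statement like this), fast load with TABLOCK into an empty heap takes the minimally logged path:

      -- Minimally logged, table-locked bulk insert into a heap
      -- (dbo.OracleStage and dbo.SourceRows are hypothetical names)
      INSERT INTO dbo.OracleStage WITH (TABLOCK)
      SELECT col1, col2
      FROM dbo.SourceRows;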

    You need to check whether you have Enterprise edition; if so, you can and should use the Microsoft Attunity connectors for Oracle, as they are faster than the other drivers.
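
    You can check what you have by running this on the SQL Server instance:

      -- Returns e.g. Enterprise Edition (64-bit) and the build number
      SELECT SERVERPROPERTY('Edition')        AS Edition,
             SERVERPROPERTY('ProductVersion') AS ProductVersion;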

    And, probably a better option if your Oracle and SQL Server can handle the load: split the extract/load into multiple chunks and run the chunks in parallel, as sketched below.
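
    One way to split, assuming the view exposes a numeric key (the view and column names here are made up; Oracle's ORA_HASH works too if the key isn't numeric), is to give each parallel data flow its own Oracle-side source query:

      -- One source query per parallel data flow (4 chunks in this example)
      SELECT * FROM MY_SCHEMA.MY_VIEW WHERE MOD(ID, 4) = 0;
      SELECT * FROM MY_SCHEMA.MY_VIEW WHERE MOD(ID, 4) = 1;
      SELECT * FROM MY_SCHEMA.MY_VIEW WHERE MOD(ID, 4) = 2;
      SELECT * FROM MY_SCHEMA.MY_VIEW WHERE MOD(ID, 4) = 3;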

    Also have a read of https://www.anexinet.com/blog/how-to-optimize-bulk-insert-operations-in-ssis/

    On your SSIS data flow, make sure that the buffer size and rows-per-buffer properties are set higher than the default values, and that the TABLOCK option is used on the destination (see the property names below).
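
    For reference, the data flow task properties involved are these (the defaults are the standard SSIS defaults; the raised values are only illustrative starting points):

      DefaultBufferMaxRows -- default 10,000 rows; try something like 100,000
      DefaultBufferSize    -- default 10,485,760 bytes (10 MB); try raising it towards 100 MB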

    The database should be in either the simple or bulk-logged recovery model to get minimal logging.
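
    You can check and change it like this (StageDB is a placeholder for your database name):

      -- Check the current recovery model
      SELECT name, recovery_model_desc FROM sys.databases WHERE name = 'StageDB';

      -- Switch to bulk-logged for the load window, then back afterwards
      ALTER DATABASE StageDB SET RECOVERY BULK_LOGGED;
      -- ... run the load ...
      ALTER DATABASE StageDB SET RECOVERY FULL;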

    And to make sure the source/network isn't the issue, create an SSIS package just to test the extract: send the output to a Row Count component.

    This will give you the time it takes to extract the data and transfer it into the SSIS data flow. Make sure you run this test in the same location where the other SSIS package is executed (I expect that to be on the same server!).
