September 29, 2008 at 11:51 am
I have an OLE DB Source which is pointing to an Oracle server. I pull rows from a table where a column indicates they have been updated within the last two days.
The source flows to an OLE DB Command that performs a delete on the destination table for each row, based on the primary key.
The OLE DB Command flows to the SQL Destination which performs a bulk insert.
If I select table lock on the OLE DB Command, for some reason Activity Monitor on the destination SQL Server shows the DELETE command being blocked by the BULK INSERT, which is supposed to occur only after all the deletes are done.
I have an identical structure for another set of tables which doesn't do this. It works after I remove the table lock, but I want to lock the table on the insert.
September 29, 2008 at 12:09 pm
An OLEDB Command component sends rows to the output buffer as they are processed. So, your deletes happen one at a time, and each row is then sent to the output buffer to be processed downstream. You have nothing preventing the data from reaching the insert before all rows have made it through the delete.
I would recommend not using any kind of table lock on your delete or your insert. Unless you are trying to prevent other processes from updating the table at the same time, table locks are not going to help your performance much, because of the one-delete-at-a-time structure you have.
If you really want the table lock, you could put another component between the OLEDB Command and the destination component that forces all of the records to pass through together. A SORT component, for instance, needs to have every record before it can send anything to the output buffer, because the last record it receives may need to be the first one in the output.
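To make the buffer behavior concrete, here is a minimal sketch (in Python, just as an illustration, not SSIS code) of why this happens. The stage names are hypothetical; the point is that a row-by-row stage like the OLEDB Command hands each row downstream as soon as it finishes with it, while a blocking stage like a SORT must consume every input row before it emits anything, so it acts as a barrier between the deletes and the inserts:

```python
# Hypothetical pipeline sketch: "delete" and "insert" stages stand in for
# the OLEDB Command and the SQL Destination; events records the order in
# which rows are handled.

events = []

def delete_stage(rows):
    """Row-by-row stage: emits each row downstream immediately."""
    for row in rows:
        events.append(("delete", row))
        yield row

def insert_stage(rows):
    """Destination stage: consumes rows as they arrive."""
    for row in rows:
        events.append(("insert", row))

def sort_stage(rows):
    """Blocking stage: must buffer every input row before emitting any."""
    yield from sorted(rows)

# Without a blocking stage, deletes and inserts interleave:
insert_stage(delete_stage([3, 1, 2]))
interleaved = events[:]

# With a SORT in between, every delete finishes before any insert starts:
events.clear()
insert_stage(sort_stage(delete_stage([3, 1, 2])))
separated = events[:]
```

In the first run the insert for a row can begin while later deletes are still pending, which is exactly the window in which a table lock on the insert can block the remaining deletes. In the second run the sort drains the delete stage completely first, so the insert never overlaps the deletes.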