Issue with temp table as destination

  • Hi All,

    I have a requirement to load a source txt file and perform an incremental update on my destination table. As I was not allowed to create any table to hold the staging records (from the file), I'm storing the file data in a global temp table and then performing a MERGE operation (temp & destination table).

    In the data flow task I have a derived column that handles the date column (if blank or null, pass DB_NULL to the destination; otherwise pass the column through as it is).

    Now everything is working fine. The only problem is performance: transferring 1 million records through the derived column task takes my package almost an hour.

    Any help on where I'm losing performance?

    Thanks,

    Anjan

    __________________________________________
    ---------------------------------------------------
    Save our mother Earth. Go Green !!!
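    The blank-or-null date handling described above fits in a single SSIS Derived Column expression. A sketch, assuming the flat-file column is named SrcDate (a hypothetical name) and the destination column is a datetime (DT_DBTIMESTAMP):

    ```
    ISNULL(SrcDate) || TRIM(SrcDate) == "" ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)SrcDate
    ```

    An expression like this is evaluated row by row in memory and is rarely the bottleneck by itself; the destination and the MERGE are the more likely suspects.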

  • Can you use a Lookup transformation instead of MERGE? It generally performs better than MERGE for the type of activity you are doing.

    Good Luck 🙂 .. Visit www.sqlsaga.com for more T-SQL code snippets and BI-related how-to articles.
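    In set-based terms, the Lookup's "no match" output is an anti-join against the target. A sketch of the equivalent T-SQL, with hypothetical object names (##Staging, dbo.TargetTable, BusinessKey, SomeDate):

    ```sql
    -- Rows from the staging table whose key does not yet exist in the target
    -- (what the Lookup transformation would route to its no-match output)
    INSERT INTO dbo.TargetTable (BusinessKey, SomeDate)
    SELECT s.BusinessKey, s.SomeDate
    FROM   ##Staging AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.TargetTable AS t
                       WHERE  t.BusinessKey = s.BusinessKey);
    ```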

  • Anjan Wahwar (4/15/2014)


    Hi All,

    I have a requirement to load a source txt file and perform an incremental update on my destination table. As I was not allowed to create any table to hold the staging records (from the file), I'm storing the file data in a global temp table and then performing a MERGE operation (temp & destination table).

    In the data flow task I have a derived column that handles the date column (if blank or null, pass DB_NULL to the destination; otherwise pass the column through as it is).

    Now everything is working fine. The only problem is performance: transferring 1 million records through the derived column task takes my package almost an hour.

    Any help on where I'm losing performance?

    Thanks,

    Anjan

    Are you able to work out the split between how long it takes to populate the temp table and how long the merge takes?

    The derived column task is unlikely to be the problem. In my experience, they generally work fast and do not cause blocking.

    Does the temp table have any indexes or primary keys? Dropping these before the import and then recreating them before the merge might speed things up.

    If the source data does not contain duplicates, you could consider using a lookup with full cache on the target table and sending the 'not matched' rows directly into the target table. Then the MERGE at the end has much less to do - just the updates.

    Do all of the rows selected contain updates or inserts, or can you potentially filter the source?

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.
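    The index and "updates only" suggestions above can be sketched in T-SQL. All object names here (##Staging, dbo.TargetTable, IX_Staging_Key, BusinessKey, SomeDate) are hypothetical placeholders:

    ```sql
    -- 1. Load ##Staging as a heap (no indexes) from the data flow, then
    --    index it once, just before the set-based work:
    CREATE CLUSTERED INDEX IX_Staging_Key ON ##Staging (BusinessKey);

    -- 2. With the new rows inserted directly from the data flow, the final
    --    statement only has to update matches -- no full MERGE needed.
    --    The ISNULL sentinel ('19000101') is one common way to treat NULLs
    --    as comparable and skip rows that haven't changed:
    UPDATE t
    SET    t.SomeDate = s.SomeDate
    FROM   dbo.TargetTable AS t
    JOIN   ##Staging       AS s
           ON s.BusinessKey = t.BusinessKey
    WHERE  ISNULL(t.SomeDate, '19000101') <> ISNULL(s.SomeDate, '19000101');
    ```

    Filtering out unchanged rows in the WHERE clause keeps the write volume (and logging) down to only the rows that genuinely need an update.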
