Issue with temp table as destination
Posted Tuesday, April 15, 2014 2:28 PM


Valued Member
Group: General Forum Members
Last Login: Wednesday, April 23, 2014 2:48 AM
Points: 63, Visits: 237
Hi All,

I have a requirement to load a source txt file and perform an incremental update on my destination table. As I am not allowed to create any permanent table to hold the staging records (from the file), I'm loading the file data into a global temp table and then performing a MERGE operation (temp table into destination table).
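
A simplified sketch of the statement I run at the end, with placeholder names (##FileStaging, dbo.DestTable, BusinessKey and SomeDate are made up; the real tables have more columns):

    -- Upsert the staged file rows into the destination
    MERGE dbo.DestTable AS tgt
    USING ##FileStaging AS src
        ON tgt.BusinessKey = src.BusinessKey
    WHEN MATCHED THEN
        UPDATE SET tgt.SomeDate = src.SomeDate
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (BusinessKey, SomeDate)
        VALUES (src.BusinessKey, src.SomeDate);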

In the data flow task I have a derived column that handles the date column (if it is blank or null, pass NULL to the destination; otherwise pass the column through as-is).
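
The expression is along these lines (MyDate stands in for the real column, which comes out of the flat file source as a string):

    (ISNULL(MyDate) || LTRIM(RTRIM(MyDate)) == "") ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)MyDate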

Everything is working fine functionally; the only problem is performance. Transferring 1 million records through the derived column task takes my package almost an hour.

Any help on where I'm losing performance?

Thanks,
Anjan


__________________________________________
---------------------------------------------------
Save our mother Earth. Go Green !!!
Post #1562048
Posted Tuesday, April 15, 2014 2:49 PM


SSC-Enthusiastic
Group: General Forum Members
Last Login: Wednesday, May 7, 2014 10:09 AM
Points: 141, Visits: 313
Can you not use a Lookup transformation instead of MERGE? It generally performs better than MERGE for the type of activity you are doing.
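
In set terms, the Lookup's "no match" output gives you the equivalent of this (borrowing the placeholder names from the post above, and assuming a single business-key column):

    -- Staged rows with no match in the destination:
    -- these would flow straight into the destination table
    SELECT s.BusinessKey, s.SomeDate
    FROM ##FileStaging AS s
    LEFT JOIN dbo.DestTable AS d
        ON d.BusinessKey = s.BusinessKey
    WHERE d.BusinessKey IS NULL;

Those rows can be inserted directly, leaving only the updates for a separate path.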

Good Luck :) .. Visit www.sqlsaga.com for more t-sql code snippets and BI related how to articles.
Post #1562052
Posted Tuesday, April 15, 2014 3:14 PM


SSCarpal Tunnel
Group: General Forum Members
Last Login: Today @ 11:29 AM
Points: 4,986, Visits: 11,685
Anjan Wahwar (4/15/2014)
Hi All,

I have a requirement to load a source txt file and perform an incremental update on my destination table. As I am not allowed to create any permanent table to hold the staging records (from the file), I'm loading the file data into a global temp table and then performing a MERGE operation (temp table into destination table).

In the data flow task I have a derived column that handles the date column (if it is blank or null, pass NULL to the destination; otherwise pass the column through as-is).

Everything is working fine functionally; the only problem is performance. Transferring 1 million records through the derived column task takes my package almost an hour.

Any help on where I'm losing performance?

Thanks,
Anjan


Are you able to work out the split between how long it takes to populate the temp table and how long the merge takes?

The derived column task is unlikely to be the problem. It is a synchronous, non-blocking transformation and, in my experience, generally very fast.

Does the temp table have any indexes or primary keys? Dropping these before the import and then recreating them before the merge might speed things up.
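
For example (index and column names are placeholders):

    -- Before loading the file into the temp table:
    DROP INDEX IX_FileStaging_BusinessKey ON ##FileStaging;

    -- After the load, before the MERGE:
    CREATE INDEX IX_FileStaging_BusinessKey ON ##FileStaging (BusinessKey);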

If the source data does not contain duplicates, you could consider using a lookup with full cache on the target table and sending the 'not matched' rows directly into the target table. Then the MERGE at the end has much less to do - just the updates.
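
If you go that route, the end-of-package statement can shrink from a full MERGE to a plain update, something like this (again with placeholder names):

    -- Only matched rows remain to be handled
    UPDATE d
    SET d.SomeDate = s.SomeDate
    FROM dbo.DestTable AS d
    INNER JOIN ##FileStaging AS s
        ON s.BusinessKey = d.BusinessKey;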

Do all of the rows selected contain updates or inserts, or can you potentially filter the source?



Help us to help you. For better, quicker and more-focused answers to your questions, consider following the advice in this link.

When you ask a question (and please do ask a question: "My T-SQL does not work" just doesn't cut it), please provide enough information for us to understand its context.
Post #1562062