ETL Load Approach
Posted Thursday, May 16, 2013 1:26 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, May 9, 2014 2:46 AM
Points: 4, Visits: 35
We have a table 'Sales' which is fully loaded daily. Our target load is insert-else-update, so on every run most of the records are updated, which takes a huge amount of time (> 1 hour).

We don't have a key in the source to make it an incremental load.

Multiple users are using that table from different locations, so we can't truncate and reload it, and we can't use the SWITCH option in SQL Server either.

We thought of the approach below.

1) Truncate and load all the data into the 'Sales_Swap' table. This should be much faster, since it is just a bulk insert.

2)
BEGIN TRANSACTION;
-- SQL Server has no LOCK TABLE statement; an exclusive table lock hint is one way to block readers during the swap
SELECT TOP (1) 1 FROM Sales WITH (TABLOCKX, HOLDLOCK);
EXEC sp_rename 'Sales', 'Sales_temp';
EXEC sp_rename 'Sales_Swap', 'Sales';
EXEC sp_rename 'Sales_temp', 'Sales_Swap';
COMMIT TRANSACTION;

Experts, please share your views on whether the above option is good in an ETL production environment.


Thanks in Advance

Post #1453349
Posted Thursday, May 16, 2013 3:33 AM
SSC-Addicted


Group: General Forum Members
Last Login: Friday, July 25, 2014 2:40 AM
Points: 451, Visits: 847
What method are you using to update the rows (in your original solution)? Is this a Data Warehouse? What is the size of the table?




I'm on LinkedIn
Post #1453387
Posted Thursday, May 16, 2013 3:39 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, May 9, 2014 2:46 AM
Points: 4, Visits: 35
Thanks for the reply. We are using an SSIS Lookup transformation and an OLE DB Command to update the records in the target table.

The size of the target table is currently 40 lakh (4 million) rows, and it will grow much larger in future.
Post #1453390
Posted Thursday, May 16, 2013 4:14 AM
SSC-Addicted


Group: General Forum Members
Last Login: Friday, July 25, 2014 2:40 AM
Points: 451, Visits: 847
The OLE DB Command component in SSIS is a row-by-row transformation, so I'm not surprised it's slow. If you must update changed rows, consider using a MERGE in T-SQL - a rough sketch is below.
As the table grows, you may wish to start partitioning it - this depends on how much history changes.
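To illustrate, this assumes the daily extract is staged in a table called Sales_Stage and that a business key (here called SalesID) plus a couple of value columns exist to match and update on - all of these names are placeholders rather than your actual schema:

MERGE dbo.Sales AS tgt
USING dbo.Sales_Stage AS src
    ON tgt.SalesID = src.SalesID               -- placeholder business key
WHEN MATCHED THEN
    UPDATE SET tgt.Quantity = src.Quantity,    -- placeholder value columns
               tgt.Amount   = src.Amount
WHEN NOT MATCHED BY TARGET THEN
    INSERT (SalesID, Quantity, Amount)
    VALUES (src.SalesID, src.Quantity, src.Amount);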

What sort of information is in the table? Is it a fact or a dimension?





I'm on LinkedIn
Post #1453405
Posted Thursday, May 16, 2013 5:34 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, May 9, 2014 2:46 AM
Points: 4, Visits: 35
We are using SQL Server 2005, so the MERGE statement is not supported.

Has anybody used the rename-swap approach from my first post in their ETL?

Thanks,
Post #1453441
Posted Thursday, May 16, 2013 6:10 AM
SSC-Addicted


Group: General Forum Members
Last Login: Friday, July 25, 2014 2:40 AM
Points: 451, Visits: 847
Although MERGE isn't available in 2005, you can still replicate it to a decent degree with a set-based UPDATE of the matching rows followed by an INSERT of the new ones. See here for some links.
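Roughly like this - a sketch only, where Sales_Stage, SalesID and the value columns are placeholders for your staging table, matching key and updated columns:

BEGIN TRANSACTION;

-- Update rows that already exist in the target
UPDATE tgt
SET    tgt.Quantity = src.Quantity,
       tgt.Amount   = src.Amount
FROM   dbo.Sales AS tgt
JOIN   dbo.Sales_Stage AS src ON src.SalesID = tgt.SalesID;

-- Insert rows that are new to the target
INSERT INTO dbo.Sales (SalesID, Quantity, Amount)
SELECT src.SalesID, src.Quantity, src.Amount
FROM   dbo.Sales_Stage AS src
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Sales AS tgt WHERE tgt.SalesID = src.SalesID);

COMMIT TRANSACTION;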

Since you're not letting on what type of load this is (a data warehouse or something else) and what data sits in the table (if it's a fact table with FKs to surrogates in dimensions, that's a different situation from one of those catch-all "reporting" tables I've seen on my travels, where normalising what you have may help), I can only go by what you have said: your idea, though workable, will just get slower and slower as time goes on.





I'm on LinkedIn
Post #1453462
Posted Thursday, May 16, 2013 7:02 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, May 9, 2014 2:46 AM
Points: 4, Visits: 35
Thanks for your suggestion
Post #1453498
Posted Friday, May 17, 2013 10:43 AM
SSC Rookie


Group: General Forum Members
Last Login: Thursday, July 24, 2014 5:03 PM
Points: 32, Visits: 307
I might suggest modifying your initial approach...

Create a staging table per the design you need. Add an Execute SQL Task early in your process to truncate the staging table.

Using your Lookup, direct the non-matching output to insert into your destination table, but direct the matching output to insert into the staging table (as opposed to the row-by-row OLE DB Command update).

Then use an Execute SQL Task to run a set-based update joining your staging table to your destination table using the predicate required - there's a sketch of that update below.

An alternative is to add an extra column - isUpdate - to the staging table and direct ALL output from your Lookup to the staging table, flagging the matching output in the isUpdate column. Then create a data flow that inserts the rows not flagged as isUpdate, and an Execute SQL Task to run the set-based update on the isUpdate rows.
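For what it's worth, that set-based update might look something like this - Sales_Staging, SalesID and the value columns are placeholders for whatever your Lookup actually matches on and updates:

-- Run from an Execute SQL Task after the data flow has populated the staging table
UPDATE tgt
SET    tgt.Quantity = stg.Quantity,
       tgt.Amount   = stg.Amount
FROM   dbo.Sales AS tgt
JOIN   dbo.Sales_Staging AS stg ON stg.SalesID = tgt.SalesID;
-- For the isUpdate variant, add: WHERE stg.isUpdate = 1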
Post #1454092