ETL Load Approach

  • We have a table 'Sales' that is fully loaded daily. Our target delivery is insert-else-update, so on every run most of the records get updated, which is taking a huge amount of time (> 1 hour).

    We don't have a key in the source to make it an incremental load.

    Multiple users query that table from different locations, so we can't truncate and load, and we also can't use the partition SWITCH option in SQL Server.

    We thought of the below approach.

    1) Truncate and load all the data into a 'Sales_Swap' table. This should be much faster since it is just a bulk insert (a sketch of this step follows the swap script below).

    2) Swap the tables by renaming them inside a transaction:

    BEGIN TRANSACTION;

    -- SQL Server has no LOCK TABLE statement; an exclusive table lock can be taken
    -- with a locking hint so readers are blocked only for the duration of the swap
    SELECT TOP (1) 1 FROM dbo.Sales WITH (TABLOCKX, HOLDLOCK);

    EXEC sp_rename 'Sales', 'Sales_temp';
    EXEC sp_rename 'Sales_Swap', 'Sales';
    EXEC sp_rename 'Sales_temp', 'Sales_Swap';

    COMMIT TRANSACTION;
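
    For reference, step 1 would be something along these lines (just a sketch; the column names and the 'SourceSales' query are placeholders for whatever the package actually pulls from the source):

    TRUNCATE TABLE dbo.Sales_Swap;

    -- TABLOCK allows the load to be minimally logged under the simple or bulk-logged recovery model
    INSERT INTO dbo.Sales_Swap WITH (TABLOCK) (SaleID, SaleDate, Amount)
    SELECT SaleID, SaleDate, Amount
    FROM dbo.SourceSales;   -- placeholder for the real source query / SSIS data flow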

    Experts, please share your ideas on whether the above approach is a good option in an ETL production environment.

    Thanks in Advance

  • What method are you using to update the rows (in your original solution)? Is this a Data Warehouse? What is the size of the table?



  • Thanks for the reply. We are using an SSIS Lookup transformation and an OLE DB Command to update the records in the target table.

    The target table currently has about 40 lakh (4 million) rows, and it will grow much larger in the future.

  • The OLE DB Command component in SSIS is a row-by-row transformation, so I'm not surprised it's slow. If you must update changing rows, then consider using a MERGE in T-SQL.
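
    For example, an upsert like this in a single MERGE replaces the row-by-row update with one set-based statement (the table and column names here are only placeholders):

    MERGE dbo.Sales AS tgt
    USING dbo.Sales_Staging AS src
        ON tgt.SaleID = src.SaleID               -- the business key you match on
    WHEN MATCHED THEN
        UPDATE SET tgt.SaleDate = src.SaleDate,
                   tgt.Amount   = src.Amount
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (SaleID, SaleDate, Amount)
        VALUES (src.SaleID, src.SaleDate, src.Amount);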

    As the table grows, you may wish to start partitioning it; how much that helps depends on how much history will change.
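
    Partitioning by a date column might start from something like the following (purely illustrative; the boundary values and the filegroup mapping have to fit your data):

    -- one partition per year on the sale date (illustrative boundaries)
    CREATE PARTITION FUNCTION pfSalesDate (datetime)
        AS RANGE RIGHT FOR VALUES ('2010-01-01', '2011-01-01', '2012-01-01');

    -- map every partition to PRIMARY just to keep the example simple
    CREATE PARTITION SCHEME psSalesDate
        AS PARTITION pfSalesDate ALL TO ([PRIMARY]);

    -- the table (or its clustered index) is then created ON psSalesDate(SaleDate)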

    What sort of information is in the table? Is it a fact or a dimension?



  • We are using SQL Server 2005, so the MERGE statement is not supported.

    Please share your experience if anybody has used the approach below in their ETL:

    BEGIN TRANSACTION;

    -- take an exclusive table lock so readers are blocked only for the duration of the swap
    SELECT TOP (1) 1 FROM dbo.Sales WITH (TABLOCKX, HOLDLOCK);

    EXEC sp_rename 'Sales', 'Sales_temp';
    EXEC sp_rename 'Sales_Swap', 'Sales';
    EXEC sp_rename 'Sales_temp', 'Sales_Swap';

    COMMIT TRANSACTION;

    Thanks,

  • Although MERGE isn't available in 2005, you can still replicate it to a decent degree. See here for some links.
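
    In short, the emulation is an UPDATE joined to the staged data, followed by an INSERT of the rows that don't yet exist in the target (a sketch; the table and column names are placeholders):

    BEGIN TRANSACTION;

    -- update the rows that already exist in the target
    UPDATE tgt
    SET    tgt.SaleDate = src.SaleDate,
           tgt.Amount   = src.Amount
    FROM   dbo.Sales AS tgt
    JOIN   dbo.Sales_Staging AS src ON src.SaleID = tgt.SaleID;

    -- insert the rows that only exist in the staged data
    INSERT INTO dbo.Sales (SaleID, SaleDate, Amount)
    SELECT src.SaleID, src.SaleDate, src.Amount
    FROM   dbo.Sales_Staging AS src
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Sales AS tgt WHERE tgt.SaleID = src.SaleID);

    COMMIT TRANSACTION;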

    You're not letting on what type of load this is (data warehouse or something else) or what data sits in the table. If it's a fact table with FKs to surrogates in dimensions, that's a different situation from one of those catch-all "reporting" tables I've seen on my travels, where normalising what you have may help. Going purely by what you have said, your idea, though workable, will just get slower and slower as time goes on.



  • Thanks for your suggestion 🙂

  • I might suggest modifying your initial approach...

    Create a staging table per the design you need. Add a SQL task to truncate the staging table early in your process.

    Using your Lookup, direct the non-matching output to insert into your destination table, but direct the matching output to insert into the staging table (as opposed to the row-by-row update via the OLE DB Command).

    Then use a SQL task to run a set-based update joining your staging table to your destination table using the predicates required.

    An alternative is to add an additional column - isUpdate - to the staging table and direct ALL output from your Lookup to the staging table, flagging the matching output in the isUpdate column. Then create a data flow that inserts the non-isUpdate rows and a SQL task to run a set-based update on the isUpdate rows.
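
    The set-based update in that SQL task could look roughly like this (a sketch only; 'Stage_Sales', 'isUpdate' and the columns are assumed names, and the new rows are inserted by the data flow as described above):

    -- set-based update of the rows the Lookup flagged as already existing
    UPDATE tgt
    SET    tgt.SaleDate = stg.SaleDate,
           tgt.Amount   = stg.Amount
    FROM   dbo.Sales AS tgt
    JOIN   dbo.Stage_Sales AS stg ON stg.SaleID = tgt.SaleID
    WHERE  stg.isUpdate = 1;   -- drop this filter for the first (matching-rows-only) variant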

