SQLServerCentral is supported by Red Gate Software Ltd.
Parallel Processing of Large Volume ETL Jobs
Author
Message
Posted Wednesday, November 7, 2007 11:28 PM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 12, 2010 7:09 AM
Points: 4, Visits: 35
Comments posted to this topic are about the item Parallel Processing of Large Volume ETL Jobs
Post #419923
Posted Wednesday, November 7, 2007 11:29 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 3:27 PM
Points: 35,216, Visits: 31,673
The title of this article is "Parallel Processing of Large Volume ETL Jobs"... what do you consider to be a "Large Volume ETL Job"?

--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #419924
Posted Thursday, November 8, 2007 12:17 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 12, 2010 7:09 AM
Points: 4, Visits: 35
Large volume refers to the number of rows and the data size of the source(s) that must be processed into the target system while applying a substantial amount of complex validation, filtration, and transformation logic. What counts as "large" depends on the hardware capacity and the ETL time window available to each individual system. Before adopting the idea discussed in the article, the project should be studied for facts such as current and future hardware capacity, how much data parallelism the current hardware can support, the current and future time window available for ETL processing, future complexity in the validation logic, and so on.
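The data parallelism described above can be sketched roughly as follows. This is an illustrative sketch only, not from the article: `transform` is a placeholder for the real validation/filtration/transformation logic, and a thread pool stands in for the separately scheduled child ETL jobs a real system would run.

```python
from concurrent.futures import ThreadPoolExecutor

def transform(row):
    # Placeholder for the real validation/filtration/transformation logic.
    return row * 2

def process_partition(rows):
    # Each worker applies the full transform to its own slice of the source.
    return [transform(r) for r in rows]

def run_parallel(rows, workers=4):
    # Split the source into roughly equal partitions, one per worker.
    size = max(1, (len(rows) + workers - 1) // workers)
    partitions = [rows[i:i + size] for i in range(0, len(rows), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process_partition, partitions)
    # Recombine the per-partition results in source order.
    return [r for part in results for r in part]

print(run_parallel(list(range(10))))
```

The key point is that the per-partition work is independent, so the degree of parallelism can be tuned to the hardware and the ETL time window.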
Post #419935
Posted Thursday, November 8, 2007 10:44 AM
Forum Newbie


Group: General Forum Members
Last Login: Monday, November 12, 2007 6:09 PM
Points: 1, Visits: 4
I think the information you have attempted to communicate is valuable. However, the repeated use of slashes and the fragmented, run-on sentences made the article almost impossible to read.


Post #420196
Posted Thursday, November 8, 2007 12:38 PM
SSC Rookie


Group: General Forum Members
Last Login: Monday, July 6, 2009 1:33 PM
Points: 31, Visits: 36
Ohh, and you teased me. You mentioned SQL set theory twice in your introduction and then never addressed it in the body. Now I have to go Google SQL set theory.
Post #420234
Posted Thursday, November 8, 2007 12:56 PM


Mr or Mrs. 500


Group: General Forum Members
Last Login: Friday, July 6, 2012 9:12 AM
Points: 500, Visits: 183
This is great timing; we were just discussing this in regard to using Hadoop or SQL Server to do some very large (close to a terabyte) imports and transformations.

I have a couple of questions, and I realize you didn't want to include code, but a little start on it would help if possible:

- How do you physically split up the large files and track them in each thread?
- How is the master process receiving its messages from the child processes?
- In a SQL Server implementation, would you most likely use CLR code to break up the files and bulk load to import it?
- Any suggestions on foreign keys and indexes?

This is a great topic, it may give us a lead on how to proceed with our new data project.
Very much appreciated!


Skål - jh
Post #420243
Posted Thursday, November 8, 2007 10:15 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 3:27 PM
Points: 35,216, Visits: 31,673
RAGHAVENDRA NARAYANA (11/8/2007)
Large volume refers to the number of rows and the data size of the source(s) that must be processed into the target system while applying a substantial amount of complex validation, filtration, and transformation logic. What counts as "large" depends on the hardware capacity and the ETL time window available to each individual system. Before adopting the idea discussed in the article, the project should be studied for facts such as current and future hardware capacity, how much data parallelism the current hardware can support, the current and future time window available for ETL processing, future complexity in the validation logic, and so on.


C'mon... How many rows and columns per row are you talking about?


Post #420357
Posted Saturday, November 10, 2007 4:32 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 12, 2010 7:09 AM
Points: 4, Visits: 35
We can think of an ETL system where we expect around 5 million transactions coming in, each with around 25 columns.

I would ask any of the technical architects who have worked on database sizing and hardware configuration to share their thoughts on the article, including parallelism and processors, as well as on this question, please.
Post #420774
Posted Saturday, November 10, 2007 7:37 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 3:27 PM
Points: 35,216, Visits: 31,673
Thanks for the rowcount and line size... I've got an experiment or two to do... I'll be back...

Post #420778
Posted Monday, November 12, 2007 2:03 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, May 12, 2010 7:09 AM
Points: 4, Visits: 35
- How do you physically split up the large files and track them in each thread?
- How is the master process receiving its messages from the child processes?
For these two questions, I have given a detailed explanation in the article.

- In a SQL Server implementation, would you most likely use CLR code to break up the files and bulk load to import it?
I am not sure about this; I would ask the experts in this particular area to address it.

- Any suggestions on foreign keys and indexes?
Nothing much that I can think of right now. A few things: keep the main-table and related-table inserts/updates in one commit/rollback block. Take care with identity columns in the case of rollback or failure, since we lose the identity numbers when that happens. Insert/update the records in bulk, in chunks of thousands, preferably in the target table's primary index order. In target tables, be strict about which indexes you include; try composite indexes instead of multiple individual indexes.
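The "insert in chunks, in primary index order" advice above can be sketched as follows. This is an illustrative sketch only: sqlite3 stands in for SQL Server, and the table name and chunk size are made up for the example. The point it shows is sorting by the target's primary key and committing one bounded batch at a time instead of row by row.

```python
import sqlite3

def bulk_insert_in_chunks(conn, rows, chunk_size=1000):
    # Sort by the target's primary key so pages fill sequentially,
    # then commit one bounded chunk at a time rather than row by row.
    rows = sorted(rows, key=lambda r: r[0])
    cur = conn.cursor()
    for i in range(0, len(rows), chunk_size):
        chunk = rows[i:i + chunk_size]
        cur.executemany("INSERT INTO target (id, val) VALUES (?, ?)", chunk)
        conn.commit()  # each chunk succeeds or fails as a unit

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
bulk_insert_in_chunks(conn, [(3, "c"), (1, "a"), (2, "b")], chunk_size=2)
```

Committing per chunk bounds the transaction log impact of any one batch, at the cost of needing chunk-level tracking for restartability.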
Post #420944