SQLServerCentral is supported by Redgate
Enhance current process
NewBornDBA2017
Hall of Fame
Group: General Forum Members
Points: 3312 Visits: 838
Experts, I am wondering if I can get some advice. I will try to explain the situation as best as I can. On a daily basis, end users give us files containing ID, first name, last name, title, address, company, company address, phone number, etc. A source file can contain anywhere between 20,000 and 100,000 records. We load those values into a DB table, then run a match process against our main table and create 4 staging tables: the 1st for a tight match and the 2nd for a loose match on individual name + address, and the 3rd for a tight match and the 4th for a loose match on company + address. After that, we create one table containing all the data plus additional columns stating whether each record was a tight match or a loose match and how many records from the source exist in our main table. Then we send the file back to the end users.

What I am trying to do is make this process faster. For example, one SSIS package with one file of 20,000 records takes about 20-30 minutes, and if I am processing 100,000 records, it takes 2-3 hours. It takes that long because we use the Melissa Data component for contact verification, which takes some time. So I just need some suggestions on how I can improve this process. Could I create multiple temp tables, and how would that work in this situation? I also thought about processing the records in batches, but I assumed that won't work either, because the batches would all be writing to the same table. Any suggestion would be helpful.
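For readers following along, the tight/loose split described above can be sketched as follows. This is a minimal illustration only, using Python's built-in SQLite in place of SQL Server; the table and column names are assumptions, not the poster's actual schema.

```python
# Hypothetical sketch of the tight/loose match split: a "tight" match agrees
# on every key column, a "loose" match agrees on name only. SQLite stands in
# for SQL Server; all names here are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.executescript("""
CREATE TABLE main_contacts (first_name TEXT, last_name TEXT, address TEXT);
CREATE TABLE source_file   (id INTEGER, first_name TEXT, last_name TEXT, address TEXT);
INSERT INTO main_contacts VALUES ('John', 'Smith', '1 Main St'),
                                 ('Jane', 'Doe',   '2 Oak Ave');
INSERT INTO source_file VALUES (1, 'John', 'Smith', '1 Main St'),  -- tight
                               (2, 'Jane', 'Doe',   '99 Elm Rd'),  -- loose (name only)
                               (3, 'Bob',  'Jones', '5 Pine Ln');  -- no match
""")
# Tight match: every key column agrees.
tight = c.execute("""
    SELECT s.id FROM source_file s
    JOIN main_contacts m
      ON s.first_name = m.first_name
     AND s.last_name  = m.last_name
     AND s.address    = m.address
""").fetchall()
# Loose match: name matches but the record is not a tight match.
loose = c.execute("""
    SELECT s.id FROM source_file s
    JOIN main_contacts m
      ON s.first_name = m.first_name
     AND s.last_name  = m.last_name
    WHERE s.address <> m.address
""").fetchall()
print(tight, loose)  # -> [(1,)] [(2,)]
```

In practice each of these queries would populate one of the four staging tables; the company + address pair would follow the same pattern with different join columns.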
frederico_fonseca
SSCertifiable
Group: General Forum Members
Points: 7176 Visits: 3876
It's always hard to help with third-party tools - the maker should be your first point of contact to see whether performance can be improved with settings or workarounds for their components.

In many cases situations like this are constrained by the server specs, so can you please tell us what they are, including available memory and the normal CPU/IO load while the packages are executing?

As for the SSIS flow: at what point(s) does Melissa get used? You mentioned 4 tables, which could mean that part of the process is done even before passing through Melissa (which is probably advisable if not done already).
If you could either post the package here, or post images of the flow including the exact points that are slow, it may help.

General things to look at are:
- buffer size
- rows per buffer
- data types (are they bigger than required? If so, change them, as oversized types affect memory allocation by SSIS)
- lookup tables in SSIS - avoid where possible if volumes are big; better to process on SQL

You can also look at splitting the data flow into multiple outputs into Melissa and then into the same table or multiple staging tables; either can work if the table/database is set up for it.
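The "split into multiple outputs" idea above can be sketched roughly as below. This is an assumption-laden stand-in, not SSIS itself: `enrich()` is a placeholder for the per-row Melissa call, Python lists stand in for staging tables, and threads stand in for parallel data-flow paths.

```python
# Rough sketch: partition the rows, run the slow enrichment step on each
# partition in parallel, land each partition in its own staging list
# (standing in for separate staging tables), then merge.
from concurrent.futures import ThreadPoolExecutor

def enrich(row):
    # Placeholder for the per-row contact-verification call (the slow step).
    return {**row, "verified": True}

def process_partition(rows):
    return [enrich(r) for r in rows]

rows = [{"id": i} for i in range(100)]
n_parts = 4
partitions = [rows[i::n_parts] for i in range(n_parts)]  # round-robin split

with ThreadPoolExecutor(max_workers=n_parts) as pool:
    staging = list(pool.map(process_partition, partitions))

merged = [r for part in staging for r in part]  # final combined result
print(len(merged))
```

The win comes from overlapping the latency of the slow external calls; whether it helps in SSIS depends on the Melissa component and on the target table/database being set up for concurrent loads, as the reply notes.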
Joe Torre
SSCertifiable
Group: General Forum Members
Points: 6220 Visits: 1142
The issue here is the latency of web service requests processed RBAR (row by agonizing row). If you could upload the data to Melissa so they could import and enhance it all on one server, it would be far faster. Using the Melissa web service in a data-entry scenario, that latency is acceptable because only one row is being enhanced at a time.
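A quick back-of-the-envelope check supports this: the timings in the original post imply a roughly constant per-row cost, which is exactly what per-row web-service latency looks like.

```python
# Implied per-row cost from the posted numbers: ~25 min for 20,000 rows.
per_row_seconds = 25 * 60 / 20_000
# Extrapolate to 100,000 rows at the same per-row cost.
est_100k_hours = 100_000 * per_row_seconds / 3600
print(f"{per_row_seconds * 1000:.0f} ms per row, "
      f"~{est_100k_hours:.1f} h for 100,000 rows")
```

That works out to about 75 ms per row and roughly 2.1 hours for 100,000 rows, which matches the reported 2-3 hours and suggests the time is dominated by round-trip latency rather than by anything the SSIS buffers are doing.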
