September 7, 2018 at 9:55 am
Hi guys, I inherited a process which extracts changes from 214 tables in a SQL database into 8 flat files, using numbers to mark each table. The files then get imported into a data warehouse via an SSIS package that uses a conditional split on each flat file to route rows to individual data conversion tasks for each table, followed by Task Factory upserts into the database. The problem is the package is over 18 MB and takes over 7 seconds to initialize after execution starts on the SSIS server, so even if the actual import of data only takes 3 seconds, the total execution time is 10. I am trying to either find a better way to import the files or completely change the extract and import process. I am currently looking at bcp, but it looks pretty complicated and requires a format file for each table. Any suggestions?
September 7, 2018 at 10:38 am
jbecker 27503 - Friday, September 7, 2018 9:55 AM (quoted above)
Neither BCP nor BULK INSERT requires a format file in most cases.
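For instance, a format-file-free import can be done with character-mode BULK INSERT, one statement per table. A minimal sketch of generating those statements, assuming hypothetical table names, file paths, and a pipe delimiter (none of these come from the thread; adjust to match the actual extract):

```python
# Hypothetical sketch: build a character-mode BULK INSERT statement per
# staging table. No format file is needed when the flat file's column
# order and count match the target table and the data is character data.
def bulk_insert_sql(table, path):
    return (
        f"BULK INSERT {table} FROM '{path}' "
        "WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\\n', TABLOCK);"
    )

# Example usage with made-up table names and an assumed extract folder:
for t in ["dbo.Customers", "dbo.Orders"]:
    print(bulk_insert_sql(t, rf"D:\extracts\{t.split('.')[-1]}.txt"))
```

This assumes the extract is split into one file per table first; with the current 8 combined files, the rows would need to be separated by their table markers before loading.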
--Jeff Moden
Change is inevitable... Change for the better is not.