Dynamic Connections
Posted Tuesday, May 14, 2013 8:55 AM


SSC Journeyman
Hello All,

I have a package that downloads a lot of csv files and transfers their contents into a database. For one of the tasks in this package, I have a Precedence Constraint that evaluates the following expression:
@Table_Number == "3290056" || @Table_Number == "3290057" || @Table_Number == "3290058" || @Table_Number == "3290059" || @Table_Number == "3290060" || @Table_Number == "3290061" || @Table_Number == "3290062" || @Table_Number == "3290063" || @Table_Number == "3290064" || @Table_Number == "3290065" || @Table_Number == "3290066" || @Table_Number == "3290067" || @Table_Number == "3290068"

Basically, what happens up to this point is:

1. Download data based on a tblRelease_Dates query (if the release date is today, download).
2. Unzip the file (extract the CSV).
3. Set the final destination path to archive the zipped files.
4. Start the transform and load processes.

My problem is when the process flow gets to the above precedence constraints. The expression I gave checks which table is being transferred and loaded. However, I need to extract from the file in the folder for whichever Table_Number is called. In the next step, I need to connect to the CSV files and then load the values into a staging table. In the folder, the file names look like 3290056-eng.csv, 3290057-eng.csv, ..., and 3290068-eng.csv.

If I do this the way I've learned, which is to create a connection using the connection manager, it will not be a dynamic connection, so no matter which table number I'm on, it will only connect to the one connection I created. How can I create a dynamic connection based on the table numbers? I'm going to play around in the Script Task sandbox to try to figure this out, but I thought in the meantime I would ask first.
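One way to get a dynamic connection in SSIS is a property expression on the flat file connection manager. As a minimal sketch, assuming the Table_Number variable above plus a hypothetical string variable @[User::CSV_Folder] holding the download folder (with a trailing backslash), the connection manager's ConnectionString property could be driven by an expression like:

    @[User::CSV_Folder] + @[User::Table_Number] + "-eng.csv"

Because the expression is evaluated at runtime, the same connection manager then points at whichever file the current Table_Number names.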


Regards:
Mordred
Keep on Coding in the Free World
Post #1452649
Posted Tuesday, May 14, 2013 11:56 AM
Old Hand
My understanding is that there is no ability to have dynamic connections. I have split a package into multiple packages to get around it.
Post #1452750
Posted Tuesday, May 14, 2013 12:49 PM


SSC Journeyman
Seriously? Darn! Now I'll have to make 11 different packages? Darn!

Regards:
Mordred
Keep on Coding in the Free World
Post #1452778
Posted Wednesday, May 15, 2013 12:58 AM


SSCertifiable
Can you answer a couple more questions about your set-up:

1) Is the file format the same for all files?
2) Is this a common staging table, or is there one staging table per file to be imported?



Help us to help you. For better, quicker and more-focused answers to your questions, consider following the advice in this link.

When you ask a question (and please do ask a question: "My T-SQL does not work" just doesn't cut it), please provide enough information for us to understand its context.
Post #1452943
Posted Wednesday, May 15, 2013 8:43 AM


SSC Journeyman
If by format you mean the type of file, then yes, they are all CSVs. If you mean the format within the CSVs, then no, they all differ a bit (column names may differ).

The staging table is a common one that is set up to accommodate all the CSVs (at least so far it is).


Regards:
Mordred
Keep on Coding in the Free World
Post #1453142
Posted Wednesday, May 15, 2013 8:46 AM


SSC Journeyman
I've been working on a workaround for this. I've created individual connections for each CSV file. At a certain point in my package, and while using precedence constraints for each table, I have been able to connect to each file individually for my ETL processes. The attachment sort of shows this.

Regards:
Mordred
Keep on Coding in the Free World


Post Attachments
My Process.JPG
Post #1453146
Posted Thursday, May 16, 2013 1:06 AM


SSCertifiable
Mordred (5/15/2013)
I've been working on a workaround for this. I've created individual connections for each CSV file. At a certain point in my package, and while using precedence constraints for each table, I have been able to connect to each file individually for my ETL processes. The attachment sort of shows this.


I think you're working along the right lines: check what sort of file it is and then execute the appropriate data flow, though I would have expected to see more happening within the Foreach container.



Help us to help you. For better, quicker and more-focused answers to your questions, consider following the advice in this link.

When you ask a question (and please do ask a question: "My T-SQL does not work" just doesn't cut it), please provide enough information for us to understand its context.
Post #1453343
Posted Thursday, May 16, 2013 8:07 AM


SSC Journeyman
I only showed a small part of the For Loop container. Check out the new image to see that I'm doing a fair bit in there, and I'm still adding.

Also, and this still has to do with connections, I got the following error when I started this up this morning:
Error 1 Error loading DownloadCSVPackage.dtsx: The connection "" is not found. This error is thrown by Connections collection when the specific connection element is not found. C:\Projects\EconAnalysisStatsCan\EconAnalysisStatsCan\DownloadCSVPackage.dtsx
I don't know how to find this blank connection; any ideas? I've scrolled through my Connection Manager pane and don't see a connection named "".

EDIT: I've finished the workflow, but I'm still seeing issues regarding the connection named "".


Regards:
Mordred
Keep on Coding in the Free World


Post Attachments
Workflow.JPG
Post #1453564
Posted Friday, May 17, 2013 2:14 AM
Mr or Mrs. 500
I've had that error when there's no value in the connection string.
You'll need a default connection even if you are setting the actual connection with an expression that evaluates at runtime.

I have a similar-sounding job: a set of packages that imports about four hundred CSV files daily from four different business-section zip file imports (from our outsourced business system), and the same file name and format appear in each section, i.e. the file salsorders.csv will have the same format in each business section.
I use Script Tasks at the start to set up the correct import, error log, and zip archive folders in variables, and then a loop over each section number does the import into combined staging tables, which the next package converts and loads to live.
The connection string of the flat file is then an expression such as: import folder variable plus file name plus section number variable plus extension.
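To make that concrete, a rough sketch of such a ConnectionString expression (all of these variable names are hypothetical) might look like:

    @[User::ImportFolder] + @[User::FileName] + @[User::SectionNumber] + ".csv"

with the connection manager still holding a valid design-time path as its default value, so the package validates before the expression takes over at runtime.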

I split the import section into sub-packages due to the slowness of working on each package, and the whole thing is far more manageable. The dtutil batch script to deploy the packages is my most time-saving utility!
Post #1453890
Posted Tuesday, May 21, 2013 6:32 AM


SSC Journeyman
You'll need a default connection even if you are setting the actual connection with an expression that evaluates at runtime.
Where do I set the default connection? I'm going to start looking now and will post an answer when I find it, but if someone posts before I do, I'll keep checking here.


Regards:
Mordred
Keep on Coding in the Free World
Post #1454969