SSIS Mapping (Audit)



Author
Message
Jonathan Marshall
SSC-Addicted

Group: General Forum Members
Points: 428 Visits: 373
I have a question about how SSIS handles mappings when the number of columns in a file increases over time.
A client has a file that started off with, say, 10 columns, and the columns have increased over time to, say, 100.
The package is mapped to the maximum number of columns. If a file comes in with fewer than 100 columns, should everything still work properly?

Basically rows are being ingested incorrectly.

Marshall
Kenneth Fisher
SSCertifiable

Group: General Forum Members
Points: 6345 Visits: 2059
Jonathan Marshall (8/27/2012)
I have a question about how SSIS handles mappings when the number of columns in a file increases over time.
A client has a file that started off with, say, 10 columns, and the columns have increased over time to, say, 100.
The package is mapped to the maximum number of columns. If a file comes in with fewer than 100 columns, should everything still work properly?

Basically rows are being ingested incorrectly.

Marshall


Are the missing columns off the end of the file?

i.e.
Field1, Field2, Field3, Field4, Field5
to
Field1, Field2, Field3
instead of
Field1, Field3, Field5

If it's the second, then you will have problems. The mapping will end up as Field1 -> Field1, Field3 -> Field2, Field5 -> Field3. The only solutions I can think of in that case are to create multiple connections for the file and branch depending on which version comes in, or to load the file into a single-column table (varchar(max)) and parse from there.
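Just to sketch out that last idea (rough example only -- the table name and file path here are placeholders, not anything from your environment):

-- Stage the whole file into a single wide column; nothing is split yet.
CREATE TABLE dbo.FileStaging
(
    RawLine varchar(max) NULL
);

-- FIELDTERMINATOR is set to a character that never appears in the data,
-- so the tabs stay inside RawLine instead of being split into fields.
BULK INSERT dbo.FileStaging
FROM 'C:\Import\File_06302012.txt'
WITH (FIELDTERMINATOR = '\0', ROWTERMINATOR = '\n');

-- From here you can parse RawLine with CHARINDEX/SUBSTRING (or a split
-- function) once you know which layout the file actually uses.
SELECT TOP (10) RawLine
FROM dbo.FileStaging;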

I'm not 100% certain what will happen in the first case but my guess is that it will work fine, leaving NULLs in the affected columns.

Kenneth Fisher
I strive to live in a world where a chicken can cross the road without being questioned about its motives.
--------------------------------------------------------------------------------
For better, quicker answers on T-SQL questions, click on the following... http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following... http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
Link to my Blog Post --> www.SQLStudies.com
Jonathan Marshall
SSC-Addicted

Group: General Forum Members
Points: 428 Visits: 373
I will look into it deeper.
Basically, it seems as if columns have been added to the same file over the years.
I mapped to the latest file (e.g. File_03312012.txt contains 90 columns, File_06302012.txt contains 99).
The file with only 90 columns is importing only half of its rows.
This pattern occurs throughout the process. I will do deeper analysis of where exactly the rows are falling out.
Quite tedious at times. Is there any logic for handling incoming columns, or for checking the file before SSIS ingests it?
I would have expected the process to fail, but it is not failing. The files are tab-delimited.
Jonathan Marshall
SSC-Addicted

Group: General Forum Members
Points: 428 Visits: 373
It's definitely the second scenario.
When loading the data manually through the SSMS Import Wizard, all the rows and columns line up.
So yes, it's definitely mapping incorrectly.
Is there any way of auditing files through SSIS?
Kenneth Fisher
SSCertifiable

Group: General Forum Members
Points: 6345 Visits: 2059
By any chance, is the first row the column headings? If not, is there any chance it can be made to be? If so, there is a setting on the connection manager that tells it to use the first row as column headings, and that should resolve your problem. If that won't do it, then you will need some way for the package to tell the difference between the files, either by the name of the file or by the contents of one of the columns. I've dealt with a file that had 6 different types of rows in it, and the first column was lettered a-f to tell me which type each row was. That isn't too hard to deal with.

If you can't get the column headings into the file, then see if you can tell me how you determine which columns are in the file, and we will see what we can do :-).
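If it helps, here is a rough idea of how you could peek at the header row and count the columns before deciding which branch to load. The file path is just a placeholder, and it assumes the account running it has permission to read the file with OPENROWSET(BULK ...):

DECLARE @FileText varchar(max),
        @HeaderLine varchar(max),
        @ColumnCount int;

-- Read the whole file as one blob, then keep just the first (header) line.
SELECT @FileText = BulkColumn
FROM OPENROWSET(BULK 'C:\Import\File_06302012.txt', SINGLE_CLOB) AS f;

SET @HeaderLine = REPLACE(LEFT(@FileText, CHARINDEX(CHAR(10), @FileText + CHAR(10)) - 1), CHAR(13), '');

-- For a tab-delimited file, columns = tabs in the header + 1.
SET @ColumnCount = LEN(@HeaderLine) - LEN(REPLACE(@HeaderLine, CHAR(9), '')) + 1;

SELECT @ColumnCount AS ColumnsInFile;
-- The package (or a procedure it calls) can branch on this value, e.g.
-- send 99-column files down the current mapping and 90-column files down an older one.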

Kenneth Fisher
I strive to live in a world where a chicken can cross the road without being questioned about its motives.
--------------------------------------------------------------------------------
For better, quicker answers on T-SQL questions, click on the following... http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following... http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
Link to my Blog Post --> www.SQLStudies.com
Jonathan Marshall
SSC-Addicted

Group: General Forum Members
Points: 428 Visits: 373
Yes, each file has column headers.
I'm retrieving the name of the file.
Based on the filename, it populates the correct table.
The mapping is based on the most recent file, which has more columns than the older files.
Even the order of the columns changes within the file.
The package isn't failing, but data is not loading correctly.
I came across this article regarding loading changing files.
I'm not the strongest programmer, but it seems I can give it a try to make the package more scalable.

Let me know your thoughts:

http://blog.quasarinc.com/ssis/best-solution-to-load-dynamically-change-csv-file-in-ssis-etl-package/