Null records being Inserted during Import of CSV file
Sqlraider
SSCrazy (2.7K reputation)
Group: General Forum Members
Points: 2717 Visits: 2312
I know very little about SSIS, so bear with me.

I have an SSIS package that inserts records into a 'staging' table from a CSV file (loading one CSV file per package execution).

The 1st production file loaded just fine - no issues. The 2nd file loaded an additional 494,720 NULL rows (all columns NULL in every row) on top of the 18,272 records I wanted to import.

I have a flat file connection (CSV) with these settings: Code Page: 20127 (US-ASCII), Format: Delimited, Text Qualifier: ", Header Row Delimiter: {CR}{LF}, Header Rows to Skip: 2, Column Delimiter: Comma {,}, Data Type: Unicode string [DT_WSTR] (for ALL columns).

I have an OLE DB connection to my staging table and I'm using 'fast load'.

What could be different about the 2nd file to cause this? What file editor would I need to use to see the differences? Is there a setting I could use to avoid loading the NULL rows?

Any ideas of what could be causing this?

Thanks,
Sqlraider
Phil Parkin
SSC Guru (52K reputation)
Group: General Forum Members
Points: 52436 Visits: 21180
Sqlraider (9/6/2013) [quoted above]


If you open the two files in Notepad++, you should get an idea of what the difference is.

As for avoiding the import of this dodgy data, I would use a Conditional Split component in the data flow to redirect all of the rubbish rows to an unused output - that will filter it out.
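A minimal sketch of such a condition, assuming the key column is named AccountNumber (a hypothetical name - use whichever of your columns must always hold data). Since all columns arrive as DT_WSTR, an empty field may come through as NULL or as an empty string, so the Conditional Split condition for the rows to keep would test for both:

!ISNULL(AccountNumber) && LTRIM(RTRIM(AccountNumber)) != ""

Rows failing this test fall through to the split's default output, which can be left unconnected or pointed at a flat file for auditing.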

Or just filter it out when you process the data in staging.
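A rough T-SQL sketch of the clean-up-in-staging alternative (table and column names are hypothetical):

-- The all-NULL rows have no value in the key column,
-- so deleting on that one column is enough.
DELETE FROM dbo.Staging
WHERE AccountNumber IS NULL;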


Golden_eye
SSC Rookie (43 reputation)
Group: General Forum Members
Points: 43 Visits: 228
Just a small hint - check whether the columns in your destination table allow NULLs, and change that.
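A hedged T-SQL sketch of that change (table, column and type are hypothetical). Note that with fast load a NOT NULL column makes the batch fail on the bad rows rather than silently skipping them, so treat this as a safety net rather than a filter:

-- Any existing NULLs must be removed first, or the ALTER will fail.
ALTER TABLE dbo.Staging
ALTER COLUMN AccountNumber nvarchar(50) NOT NULL;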
Sqlraider
SSCrazy (2.7K reputation)
Group: General Forum Members
Points: 2717 Visits: 2312
Phil Parkin (9/7/2013) [quoted above]


I was focused on the first row of data after the 2nd header record, thinking there was something not right, when in fact there are 494,720 NULL records (each row is all commas) after the last 'good' data record.

I'm going to use your suggestion of a Conditional Split for those records I don't load into the staging table. That way, if for some reason a record doesn't load, I'll at least still have it.

Thanks,
Sqlraider
aaron.reese
SSCrazy (2.5K reputation)
Group: General Forum Members
Points: 2463 Visits: 907
I am going to hazard a guess that the CSV is generated from an Excel file and that there were 400K empty lines at the end of the CSV file. Any time Excel is involved anywhere near an SSIS package, SSIS seems to wander off in a huff.

Excel and SSIS do not play nicely together in any combination I have found.
Sqlraider
SSCrazy (2.7K reputation)
Group: General Forum Members
Points: 2717 Visits: 2312
aaron.reese (9/16/2013) [quoted above]


A third party creates the file (how, I don't know) and sends it to us as CSV. I didn't see the 400K NULL lines until I opened the file in Notepad++.
aaron.reese
SSCrazy (2.5K reputation)
Group: General Forum Members
Points: 2463 Visits: 907
So is the source file missing data (i.e. are the lines of empty fields supposed to have data), or are they just extra lines?

If you are happy that the data is complete, I would:

1) raise a defect against the file source to get the null rows removed
2) apply the conditional split as per Phil's suggestion to ignore them - the exact rules for the split will be down to you and the nature of the data (I would find a field or combination of fields that CANNOT be null and validate against them; a sketch follows below)
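As a rough illustration of such a rule, assuming CustomerID and InvoiceDate are the fields that can never be empty in a genuine record (both names hypothetical), the Conditional Split condition for the rows to keep might be:

!ISNULL(CustomerID) && LTRIM(RTRIM(CustomerID)) != "" && !ISNULL(InvoiceDate) && LTRIM(RTRIM(InvoiceDate)) != ""

Everything failing the test drops through to the default output and never reaches the staging table.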
Sqlraider
SSCrazy (2.7K reputation)
Group: General Forum Members
Points: 2717 Visits: 2312
aaron.reese (9/17/2013) [quoted above]


It's just extra lines (lines with just commas and no data between them, e.g. ,,,,,,). I did apply a conditional split off of the ONE field that cannot be null. This is a monthly file and only occasionally does it have extra lines.