[Flat File Source [1]] Warning: There is a partial row at the end of the file (SQL Server 2008 R2)

  • Does anyone know the fix?

    I have read this article and tried its suggestions: http://stackoverflow.com/questions/10888616/why-is-ssis-complaining-that-there-is-a-partial-row-at-the-end-of-the-file

    I have also read this KB article, which states it is a known bug with a service pack fix for 2005, but I am on 2008:

    https://support.microsoft.com/en-us/kb/937969/

    The source file I'm trying to import has just one header row and one data row. (Yes, I could insert it manually into the destination table, but I need a long-term fix.) I added a CRLF to the end of the data row, resaved, and reran the data flow task; it still didn't import the row.

    Does anyone know about this problem and a fix in 2008 R2?

    --Quote me

  • polkadot (5/1/2015)


    Does anyone know the fix?

    I have read this article and tried its suggestions: http://stackoverflow.com/questions/10888616/why-is-ssis-complaining-that-there-is-a-partial-row-at-the-end-of-the-file

    I have also read this KB article, which states it is a known bug with a service pack fix for 2005, but I am on 2008:

    https://support.microsoft.com/en-us/kb/937969/

    The source file I'm trying to import has just one header row and one data row. (Yes, I could insert it manually into the destination table, but I need a long-term fix.) I added a CRLF to the end of the data row, resaved, and reran the data flow task; it still didn't import the row.

    Does anyone know about this problem and a fix in 2008 R2?

    Quick suggestions:

    😎

    1) Import the file line by line into a staging table and parse it from there. This method is common when importing multi-document-type files.

    2) Read the file line by line into a Script transformation and do the splitting there; this works similarly to the one above.

    3) Read the file and count the delimiters in each line; append any that are missing, based on the count in the first line.
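Suggestion 3 amounts to padding short rows up to the header row's delimiter count. A minimal sketch of that logic, in Python for illustration rather than an SSIS Script component (the function name, paths, and "þ" delimiter are placeholders):

```python
def pad_short_lines(in_path, out_path, delim="þ"):
    """Make every row carry as many delimiters as the header row."""
    # Universal-newline text mode strips the original \r\n terminators.
    with open(in_path, encoding="utf-8") as src:
        lines = src.read().splitlines()
    expected = lines[0].count(delim)  # delimiter count in the header row
    # newline="" prevents Python from translating our explicit CRLFs.
    with open(out_path, "w", encoding="utf-8", newline="") as dst:
        for line in lines:
            missing = expected - line.count(delim)
            # Append delimiters so the row matches the header's field count,
            # then terminate the row with an explicit CRLF.
            dst.write(line + delim * max(missing, 0) + "\r\n")
```

This also guarantees the last row ends with a full CRLF, which is exactly the condition the "partial row at the end of the file" warning complains about.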

  • 1) Import the file line by line into a staging table and parse it from there. This method is common when importing multi-document-type files.

    The file uses the thorn character (þ, non-standard) as a delimiter, so I have to scrub the file first before loading to SQL, and the warning is given during the scrub process (comprising a flat file connection, derived column, and flat file destination). The task looks like it succeeds, but it doesn't write the rows to the subsequent flat file, which I need to upload to staging.

    2) Read the file line by line into a Script transformation and do the splitting there; this works similarly to the one above.

    Does this require Visual Basic or C# code?

    3) Read the file and count the delimiters in each line, add any if missing from the count in the first line.

    No delimiters are missing.

    --Quote me

  • A few points:

    😎

    polkadot (5/2/2015)


    1) Import the file line by line into a staging table and parse it from there. This method is common when importing multi-document-type files.

    The file uses the thorn character (þ, non-standard) as a delimiter, so I have to scrub the file first before loading to SQL, and the warning is given during the scrub process (comprising a flat file connection, derived column, and flat file destination). The task looks like it succeeds, but it doesn't write the rows to the subsequent flat file, which I need to upload to staging.

    Suspect this could be part of the problem; add data viewers on both sides of the derived column and check whether the output matches the full input. There should be no need to clean/replace the thorn character; just configure it as the delimiter for the source.

    2) Read the file line by line into a Script transformation and do the splitting there; this works similarly to the one above.

    Does this require Visual Basic or C# code?

    Yes, you can use either VB or C#; in fact, this would replace the scrubbing step.

    3) Read the file and count the delimiters in each line, add any if missing from the count in the first line.

    No delimiters are missing.

    Then my suspicion is that the scrubbing step is the culprit. Have you tried appending a single empty line to the file after the scrub?
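Whether the file already ends with a full row terminator can be checked (and fixed) mechanically. A small sketch of that check, in Python for illustration (in SSIS this would live in a Script task; the function name and path are placeholders):

```python
def ensure_trailing_crlf(path):
    """Append a CRLF if the file does not already end with one,
    so the last data row is not treated as a partial row."""
    # Binary mode: inspect the raw bytes without newline translation.
    with open(path, "rb") as f:
        data = f.read()
    if data and not data.endswith(b"\r\n"):
        with open(path, "ab") as f:
            f.write(b"\r\n")
```

The check is idempotent, so it is safe to run on every file before the data flow task, whether or not the terminator is already present.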

  • sorry, I don't know C#...

    There should be no need to clean / replace the thorn character, just configure it as a delimiter for the source.

    FF Source doesn't offer thorn as a possible alternative to comma (in the header row delimiter drop-down), so I use a Derived Column to replace the thorn character with | in this way:

    REPLACE([creative_idþgroup_idþdateþimpressionsþclicksþpublisher_costþavg_posþavg_cpcþconversion_type_idþconversionsþrevenueþlast_modified],"þ","|")

    ...and it works in all 43 flat file imports except for this one. How would you do this?
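For what it's worth, the Derived Column expression above performs a plain character substitution. The same scrub step, sketched outside SSIS as a sanity check (Python for illustration; the function name and file paths are placeholders):

```python
def scrub_thorn(in_path, out_path):
    """Replace the thorn delimiter (þ) with a pipe (|), line by line."""
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            # Streaming line by line keeps memory flat for large files.
            dst.write(line.replace("þ", "|"))
```

Running the raw file through this and comparing row counts against the SSIS output would show whether the scrub itself is dropping the final row.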

    --Quote me

  • Then my suspicion is that the scrubbing step is the culprit. Have you tried appending a single empty line to the file after the scrub?

    And yes, two days ago I tried a blank line (13 commas on the line following the last line in the file),

    thanks to Andrew at this blog: http://blog.andrewjudd.ca/category/sql/ssis/

    and that seemed to fix the problem, until I reran the SSIS package. I don't know what else I might have done afterward, but the next time I reran it I got the same issue, even though the 13 commas were still in the raw, pre-scrubbed file. So I decided to pull the files freshly from the FTP server, unzip them, add the 13 commas again, and apply my scrub, but no more success with the blank line.

    If you have a better way to scrub the file I would like to try it; it otherwise works for all 43 of my raw files (all using my derived column solution).

    --Quote me

  • Eirikur, this problem is gone at this time. I don't really know why. I reimported the files from the FTP server, re-unzipped them, and rebuilt from scratch the data flow task handling of that particular file, and this time the source file wasn't problematic for scrubbing. No use in looking into it further... but I'll let you know if it comes up again. Such a show stopper.

    --Quote me

  • polkadot (5/3/2015)


    Eirikur, this problem is gone at this time. I don't really know why. I reimported the files from the FTP server, re-unzipped them, and rebuilt from scratch the data flow task handling of that particular file, and this time the source file wasn't problematic for scrubbing. No use in looking into it further... but I'll let you know if it comes up again. Such a show stopper.

    Thanks for the feedback, and good thing it's working; ping back if you run into this problem again and we will tackle it then.

    😎

  • Since it's just a warning, my package doesn't fail, though the consequences are pretty serious. Does anyone know how to fail a package on this partial-row-at-end-of-file warning (or any warning, for that matter)?

    --Quote me
