• Steve Jones - SSC Editor (7/7/2015)


    ZZartin (7/7/2015)


    Well, in all fairness, any flat-file format that isn't properly generated on the source side is going to break your data flow. As for why CSVs are more popular: they were a more consistent standard. Tabs may be a standard now, but on older systems some tools interpreted a tab as 4 spaces, others as 8, and so on. Line breaks have a similar issue: for whatever reason MS decided to go with CR+LF for line endings while *nix systems use just LF, I believe, and that difference can make you cry if you're working on both platforms. The fun part is trying to figure out, at 3 in the morning, why the password you're copying and pasting from Excel into a Java program isn't working.
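    To make the line-ending problem concrete, here is a minimal sketch (in Python, with a made-up password value) of how a stray carriage return from a Windows-generated file survives naive stripping and silently corrupts a pasted value:

    ```python
    # Windows ends lines with CR+LF ("\r\n"); Unix uses LF ("\n") alone.
    windows_line = "s3cret\r\n"   # hypothetical value copied from Excel on Windows
    unix_line = "s3cret\n"

    # Stripping only "\n" leaves a trailing CR behind, so the "password"
    # silently gains an invisible character and the comparison fails.
    assert windows_line.rstrip("\n") == "s3cret\r"   # stray CR survives
    assert windows_line.rstrip("\n") != "s3cret"     # looks right, isn't

    # Stripping both characters explicitly fixes it on either platform.
    assert windows_line.rstrip("\r\n") == "s3cret"
    assert unix_line.rstrip("\r\n") == "s3cret"
    ```

    The invisible `\r` is exactly the kind of thing that is impossible to see on screen at 3 a.m.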

    Completely agree, but when the file opens fine in Excel, opens in text editors as CSV, and imports into PostgreSQL fine, why can't SSIS handle it? A very poor implementation.

    You'd think that parsing records and columns in a CSV file would be a function of the provider library, the same library used by SSIS, Excel, and Java. I would expect the Data Flow in SSIS to provide only generic plumbing for moving data from one task to another. Perhaps the solution for getting SSIS to properly handle embedded commas in a CSV file is a matter of configuring something under the advanced properties of the Flat File data source.
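    For reference, here is a sketch (not SSIS itself, just Python's standard `csv` module with sample data I made up) of how a standards-aware parser treats a comma inside a quoted field as data rather than a delimiter; in SSIS terms, this is roughly what setting the text qualifier to `"` in the Flat File connection manager is supposed to achieve:

    ```python
    import csv
    import io

    # RFC 4180-style input: the second field contains an embedded comma,
    # protected by double-quote text qualifiers.
    raw = 'id,name,city\n1,"Smith, John",Boston\n'

    # csv.reader honors the quoting, so the embedded comma stays inside
    # the field instead of splitting it into two columns.
    rows = list(csv.reader(io.StringIO(raw)))
    assert rows[0] == ['id', 'name', 'city']
    assert rows[1] == ['1', 'Smith, John', 'Boston']  # comma preserved as data
    ```

    Any parser that splits blindly on every comma, without honoring the qualifier, would break that row into four columns instead of three.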

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho