Is there a way in SSIS to add a comma to the end of each row of a csv file?

  • Is there a way in SSIS to add a comma to the end of each row of a csv file? A C# script? It's part of a package that loads csv files to a SQL Server table... Thanks!

  • Yes, a C# script could definitely do it.

  • Something like this? I know this as written will not work:

    // StringWriter lives in System.IO
    StringWriter csv = new StringWriter();

    foreach (var item in YourListData)
    {
        // write each row with a trailing comma; WriteLine adds the line break
        csv.WriteLine(item + ",");
    }

    return File...
  • Have a look at the first answer here: https://stackoverflow.com/questions/36116795/how-to-update-a-record-in-text-file

    As long as your file is not too huge, that gets you pretty close.
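
    For the specific task in this thread (adding a trailing comma to every row), a minimal sketch along the same lines might look like the following. It assumes the file is small enough to read fully into memory; the method name AddTrailingCommas is just an illustration, not part of any existing package.

        using System.IO;

        // Append a trailing comma to every row of a csv file.
        public static void AddTrailingCommas(string path)
        {
            string[] lines = File.ReadAllLines(path);
            for (int i = 0; i < lines.Length; i++)
            {
                // only add the comma if the row does not already end with one
                if (!lines[i].EndsWith(","))
                {
                    lines[i] += ",";
                }
            }
            File.WriteAllLines(path, lines);
        }

    Called with something like AddTrailingCommas(@"C:\SourceFiles\MyFile.csv") - the path here is only a placeholder.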

  • This line has to reference the file name as a literal: string[] lines = File.ReadAllLines("items.txt"). Can that be set up as a variable instead?

        public void UpdateInFile(string modified, int id)
        {
            string[] lines = File.ReadAllLines("items.txt");
            for (int i = 0; i < lines.Length; i++)
            {
                string[] parts = lines[i].Split(',');   // split the current line, not the whole array
                if (Convert.ToInt32(parts[0]) == id)
                {
                    lines[i] = modified;                // replace the matching line
                }
            }
            File.WriteAllLines("items.txt", lines);
        }
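
    If this is running inside an SSIS Script Task, one way to avoid the hard-coded literal - a sketch only, assuming a package variable named User::FilePath (the name is illustrative) has been added to the task's ReadOnlyVariables list - is to read the path from the Dts.Variables collection inside Main():

        // Inside the Script Task's Main() method.
        // User::FilePath is an assumed variable name holding the full path to the csv.
        string filePath = Dts.Variables["User::FilePath"].Value.ToString();

        string[] lines = File.ReadAllLines(filePath);
        // ...modify lines as needed (e.g. append the trailing comma)...
        File.WriteAllLines(filePath, lines);

        Dts.TaskResult = (int)ScriptResults.Success;

    The same variable can be the one the Foreach Loop Container maps the current file name into, so each file in the source folder gets picked up in turn.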
  • Maybe I'm approaching this all wrong - the problem is that my csv files will not load unless there's a comma at the end of each row. So maybe there's a way of manipulating my package, perhaps in the data flow inside the Foreach Loop Container, to force the package to ignore the missing trailing comma?

  • Why is the final comma needed? My CSV files don't need one.

  • I'm not sure but the package fails if the comma is not there.

  • Sounds like it is expecting (n+1) columns rather than the (n) that are there. Why not fix that, rather than modifying the input file?

  • I checked the column count in SQL Server table and it matches the number of csv columns.

  • Are there any unmapped columns in your OLE DB or ADO NET destination? If so, that might get you pointed in the right direction.

  • DaveBriCam wrote:

    I checked the column count in SQL Server table and it matches the number of csv columns.

    That's not the important thing. You need to open the package in Visual Studio and check how the file connection has been set up.

  • There probably are now, as my original package got corrupted and I had to make a copy. I'm trying to create a flat file connection manager but have forgotten how to set it up so that I don't have to pick a specific file, since I have 14 csv files in my source folder that I'm looping through.

  • Crisis avoided - I restored a previous version.

  • DaveBriCam wrote:

    I checked the column count in SQL Server table and it matches the number of csv columns.

    So - where is this failing and what is the failure message?  I suspect it probably is failing when trying to read the file - most likely because the file specification is set to a specific number of columns and without that extra comma it isn't seeing the last field.

    But - without any information on the actual error or where it is occurring, this is all just a guess.  And as Phil pointed out - instead of trying to work around the issue it is much better (and often much easier) to fix the underlying problem.

    Jeffrey Williams
