Script (C#) to add row to Object Variable (SSIS 2008 R2)

  • Hi,

    Is it possible to...

    1) Create an empty (no rows) Object variable from a table structure, possibly with a SELECT * FROM XYZ WHERE 1 = 2?

    2) Add records to the Object variable in multiple places throughout a package?

    Basically, the goal is to gather issues (not errors) that have occurred during the package run, importing data from a 3rd party, into an Object variable. The contents of the variable would be written to a table, and subsequently a 'Run Report' would be sent back to the 3rd party, either saying the report ran successfully with no issues or that it had issues; see attached.

    Thanks

    Steve

  • SteveD SQL (8/28/2016)


    Hi,

    Is it possible to...

    1) Create an empty (no rows) Object variable from a table structure, possibly with a SELECT * FROM XYZ WHERE 1 = 2?

    2) Add records to the Object variable in multiple places throughout a package?

    Basically, the goal is to gather issues (not errors) that have occurred during the package run, importing data from a 3rd party, into an Object variable. The contents of the variable would be written to a table, and subsequently a 'Run Report' would be sent back to the 3rd party, either saying the report ran successfully with no issues or that it had issues; see attached.

    Thanks

    Steve

    I've certainly written C# in the past which will read from an object variable which is already populated, so I think it should be possible.

    Outside of C#, you'll struggle to do this, so if you're not into scripting, I suggest you consider alternatives. Maybe a raw file could be used to store the results ...
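
    To make the idea concrete, here is a rough sketch of the Script Task pattern. The variable name User::IssueList and the column layout are my own assumptions for illustration, not anything from the thread; the variable must appear in each task's ReadWriteVariables. One Script Task seeds the Object variable with an empty DataTable, and a later Script Task appends rows to it:

    ```csharp
    // Script Task A: seed User::IssueList with an empty table.
    using System.Data;

    public void Main()
    {
        DataTable issues = new DataTable("Issues");
        issues.Columns.Add("Issue", typeof(string));
        issues.Columns.Add("RowKey", typeof(int));
        Dts.Variables["User::IssueList"].Value = issues;
        Dts.TaskResult = (int)ScriptResults.Success;
    }

    // Script Task B, elsewhere in the package: append an issue row.
    public void Main()
    {
        DataTable issues = (DataTable)Dts.Variables["User::IssueList"].Value;
        issues.Rows.Add("Missing Product Code", 3);
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    ```

    This sidesteps the SELECT * FROM XYZ WHERE 1 = 2 trick entirely; if you do want the column structure to come from the table, you could instead fill the variable from an Execute SQL Task returning a full result set with that WHERE clause.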

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Hi Phil,

    Thanks for the input. Yes, I think I am going to use a RAW file, mainly because I discovered that while there is a Recordset Destination, there is no Recordset Source for the data flow. For now, I am about to use a Foreach loop to do x individual inserts to eventually populate a table.

    Wonder if this is any different in 2012, 2014 or 2016.

    Thanks Again

    Steve

  • SteveD SQL (8/28/2016)


    Hi Phil,

    Thanks for the input. Yes, I think I am going to use a RAW file, mainly because I discovered that while there is a Recordset Destination, there is no Recordset Source for the data flow. For now, I am about to use a Foreach loop to do x individual inserts to eventually populate a table.

    Wonder if this is any different in 2012, 2014 or 2016.

    Thanks Again

    Steve

    If you are talking about using an object variable as a data source in a data flow, that can be achieved by creating a Script Component Source, defining appropriate outputs and writing some code.

    Nothing comes to mind regarding your final question ... and I've used all versions.

    But this all sounds more complicated than it needs to be. I'd suggest creating a logging proc and calling that from your package using one or more ExecuteSQL tasks.
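
    For what it's worth, the CreateNewOutputRows override in such a Script Component source might look something like this. The output name "IssueOutput", its column names and the User::IssueList variable are illustrative assumptions; the variable must be listed in the component's ReadOnlyVariables and the output columns defined in its metadata:

    ```csharp
    using System.Data;

    public override void CreateNewOutputRows()
    {
        // Pull the DataTable that earlier tasks stored in the object variable.
        DataTable issues = (DataTable)Variables.IssueList;

        foreach (DataRow row in issues.Rows)
        {
            // One buffer row per DataTable row; the buffer and column
            // properties are generated from the output you define.
            IssueOutputBuffer.AddRow();
            IssueOutputBuffer.Issue = (string)row["Issue"];
            IssueOutputBuffer.RowKey = (int)row["RowKey"];
        }
    }
    ```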


  • Hi Phil

    The scenario I am trying to deal with is importing data from a flat file. The end result is that if any of the data in the flat file is invalid (e.g. a required column is missing, or a value is invalid, like an invalid product code), I want to report that to whoever provided the file. I will be storing each 'issue' in a run-log issue table with a structure like:

    Issue                 RowKey
    --------------------  ------
    Missing Product Code       3
    Invalid Product Code      25
    Invalid Quantity          50

    Since there would be multiple columns and various other validations, I wanted to gather all the issues before inserting into the destination log table; then I would call a report and include it in an email to the provider.

    Anyway that's the plan

    Thanks

    Steve

  • SteveD SQL (8/28/2016)


    Hi Phil

    The scenario I am trying to deal with is importing data from a flat file. The end result is that if any of the data in the flat file is invalid (e.g. a required column is missing, or a value is invalid, like an invalid product code), I want to report that to whoever provided the file. I will be storing each 'issue' in a run-log issue table with a structure like:

    Issue                 RowKey
    --------------------  ------
    Missing Product Code       3
    Invalid Product Code      25
    Invalid Quantity          50

    Since there would be multiple columns and various other validations, I wanted to gather all the issues before inserting into the destination log table; then I would call a report and include it in an email to the provider.

    Anyway that's the plan

    Thanks

    Steve

    Good luck. I am working on a similar project right now ... though somewhat wider in scope. It allows for the definition of an 'expected file format' in metadata, allowing Access, Excel and txt files to be defined. I then check the things you mention (number of columns, column names, column order, mandatory columns, expected values).

    As I find errors or warnings, I write them to a logging table. I do this via redirects in data flows (in 2016, you get access to the column name which caused the error in a data flow, making this a lot easier than it was) and directly from C# in Script Tasks.

    At the end of the job, I produce a report showing run details, along with a list of errors and warnings.

    My point is this: as long as you collect the data in an organised way, you can make the report do the work of presenting the results nicely.
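
    For the "directly from C# in Script Tasks" part, a minimal sketch of writing one issue row might look like this. The ADO.NET connection manager name "LogDb" and the dbo.RunLogIssue table are assumptions for illustration only:

    ```csharp
    using System.Data.SqlClient;

    public void Main()
    {
        // Borrow the connection from an ADO.NET (SqlClient) connection manager.
        SqlConnection conn = (SqlConnection)Dts.Connections["LogDb"]
            .AcquireConnection(Dts.Transaction);
        try
        {
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO dbo.RunLogIssue (Issue, RowKey) VALUES (@Issue, @RowKey)",
                conn))
            {
                cmd.Parameters.AddWithValue("@Issue", "Invalid Product Code");
                cmd.Parameters.AddWithValue("@RowKey", 25);
                cmd.ExecuteNonQuery();
            }
        }
        finally
        {
            // Hand the connection back so the manager can pool/reuse it.
            Dts.Connections["LogDb"].ReleaseConnection(conn);
        }
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    ```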


  • Your project sounds cool. An 'expected format' validation has been on my wishlist for a while, and while I try to live by

    there is always enough time to do it right

    ... well, there just never seems to be enough time :w00t:

    Good luck with your project!

    Steve

  • The destination table, having check constraints I assume, does the logical checks for you. In the data flow you can redirect error rows to another table. From there you could run a T-SQL script to do the checks you are looking for, to identify the reason for the error. Note that there could be more than one reason (like more than one column missing a value), so the total for the structure you presented could be more than the number of redirected rows.


  • Hi MMartin1

    Thanks for your response. While my tables do have check and RI constraints, I also use surrogate keys, and therefore most of my data-import packages perform some kind of lookup, using the alternate keys, to get the surrogate primary key to be used as a foreign key... hopefully that made sense.

    So I often have a list of dodgy values (e.g. they were not found) before attempting an insert/update. It is these values that I want to capture and eventually store in a table for reporting purposes.

    Thanks

    Steve

  • SteveD SQL (8/30/2016)


    Hi MMartin1

    Thanks for your response. While my tables do have check and RI constraints, I also use surrogate keys, and therefore most of my data-import packages perform some kind of lookup, using the alternate keys, to get the surrogate primary key to be used as a foreign key... hopefully that made sense.

    So I often have a list of dodgy values (e.g. they were not found) before attempting an insert/update. It is these values that I want to capture and eventually store in a table for reporting purposes.

    Thanks

    Steve

    Hi Steve;

    Without knowing more about the specifics of your situation... I can say in general that the Lookup transform in SSIS can be arranged in a waterfall-like structure. If you don't find the value you need in the main place, you redirect to another Lookup transform to try to make the match somewhere else. From there, if it is still not found, you can even employ a Fuzzy Lookup transform.

    However, my first thought when I read that there are IDs with no matching keys is that those may be new additions that did not exist before and need a new key generated for them. But I am confident that you would have known this.

    Good luck. 😎

