Is there a way to exclude records from being loaded when using BULK INSERT?

  • Hi everyone

    I have one record in the source CSV file that is preventing BULK INSERT from loading the file.  If I remove the record from the CSV, the file loads.  However, manually removing the record each time the code runs is not efficient.  Ideally, I would like to instruct BULK INSERT to ignore that record.

    Code:

    EXEC ('BULK INSERT #SECURITY_NEW
           FROM ''' + @FILEPATH + '''
           WITH (FIRSTROW = 2, FORMAT = ''CSV'', MAXERRORS = 0);'
         );

    Temp table:

    CREATE TABLE #SECURITY_NEW
    (
        [CONTRACT] [varchar](MAX) NOT NULL,
        -- more fields
    );

    I would like to have BULK INSERT ignore all records where [CONTRACT] LIKE '%ABC%'.  Is there a way to do this with BULK INSERT?  If not, what are my options?

    Thank you
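If the problematic row parses and is merely unwanted, BULK INSERT itself has no row-filtering option, but deleting straight after the load achieves the same result. A sketch (the file path is a placeholder for illustration):

```sql
-- BULK INSERT cannot filter rows by value; one workaround is to
-- load everything and then delete the unwanted rows.
BULK INSERT #SECURITY_NEW
FROM 'C:\data\security.csv'   -- placeholder path
WITH (FIRSTROW = 2, FORMAT = 'CSV');

-- LIKE (not =) is needed for the %ABC% wildcard pattern
DELETE FROM #SECURITY_NEW
WHERE [CONTRACT] LIKE '%ABC%';
```

Note this only helps if the row loads at all; it does not fix a structurally malformed line.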

  • I came across OPENROWSET.  It looks promising, but I have never used it before.  Can it be used for my purpose?  If yes, how would I use it?  I am not a strong coder, so I need some help.

    Thank you
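For what it's worth, OPENROWSET(BULK ...) exposes the file as a rowset that can be filtered with WHERE before anything is inserted. A sketch, assuming SQL Server 2017+ (for FORMAT = 'CSV') and a hypothetical format file security.fmt describing the columns; both file paths are placeholders:

```sql
-- OPENROWSET(BULK ...) returns rows, so unwanted ones can be
-- filtered out before they ever reach the temp table.
INSERT INTO #SECURITY_NEW ([CONTRACT] /* , more fields */)
SELECT s.[CONTRACT] /* , more fields */
FROM OPENROWSET(
        BULK 'C:\data\security.csv',        -- placeholder path
        FORMATFILE = 'C:\data\security.fmt', -- hypothetical format file
        FORMAT = 'CSV',
        FIRSTROW = 2
     ) AS s
WHERE s.[CONTRACT] NOT LIKE '%ABC%';
```

Caveat: if the problem row is structurally malformed (e.g. an unbalanced quote), OPENROWSET hits the same parse error as BULK INSERT; the WHERE filter only helps once the row can be parsed.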

  • What if you change MAXERRORS = 0 to MAXERRORS = 1 (or whatever your upper limit is)?

  • Ensure the file is a correctly built CSV file.  If it is, then you can load it using the FORMAT = 'CSV' option above; if it is not, then get whoever generates the file to do it properly.

  • pietlinden wrote:

    What if you change MAXERRORS = 0 to MAXERRORS = 1 (or whatever your upper limit is)

    I tried a few values and the outcome is the same: the file won't load.  What else do you suggest I try?
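Alongside MAXERRORS, BULK INSERT also accepts an ERRORFILE option that diverts rejected rows to a file so the rest of the load can continue. A sketch with placeholder paths:

```sql
-- MAXERRORS allows up to N rejected rows; ERRORFILE captures them
-- so the remaining rows can still load. Paths are placeholders.
BULK INSERT #SECURITY_NEW
FROM 'C:\data\security.csv'
WITH (
    FIRSTROW  = 2,
    FORMAT    = 'CSV',
    MAXERRORS = 10,
    ERRORFILE = 'C:\data\security_rejects.txt'
);
```

This works for rows SQL Server can reject cleanly; some structural CSV errors (e.g. an unbalanced quote that swallows the following lines) can still abort the whole load.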

  • frederico_fonseca wrote:

    Ensure the file is a correctly built CSV file.  If it is, then you can load it using the FORMAT = 'CSV' option above; if it is not, then get whoever generates the file to do it properly.

    The file comes from a data provider.  They are a big company so it will take a while for them to look, investigate, and update their code.  I don't want to keep waiting for them.  That is why I am looking at excluding that problem record from even loading in the first place.  Do you know how I can do that?

  • water490 wrote:


    The file comes from a data provider.  They are a big company so it will take a while for them to look, investigate, and update their code.  I don't want to keep waiting for them.  That is why I am looking at excluding that problem record from even loading in the first place.  Do you know how I can do that?

    This is not really a SQL question.  If you do not know how to do it, try asking one of the AI engines something like:

    With PowerShell, please read a CSV one line at a time and, excluding lines with too many commas, write the results to another CSV file.

    The result may not be perfect but it should get you started.
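A minimal command-line equivalent of that preprocessing idea, sketched with awk rather than PowerShell. The file names, sample data, and expected field count of 2 are assumptions for illustration:

```shell
# Build a small sample file with one malformed line (for illustration).
cat > security.csv <<'EOF'
CONTRACT,PRICE
AAA,100
BAD,"unbalanced,1,2,3
BBB,200
EOF

# Naive filter: keep only lines with exactly 2 comma-separated fields.
# It does not understand quoted commas, so a legitimate row like
# AAA,"x,y" would also be dropped -- acceptable for a quick workaround.
awk -F',' 'NF == 2' security.csv > security_clean.csv

cat security_clean.csv
```

As noted above, the result may not be perfect, but it strips the malformed line before BULK INSERT ever sees the file.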

  • Ken McKelvey wrote:

    water490 wrote:


    The file comes from a data provider.  They are a big company so it will take a while for them to look, investigate, and update their code.  I don't want to keep waiting for them.  That is why I am looking at excluding that problem record from even loading in the first place.  Do you know how I can do that?

    This is not really a SQL question.  If you do not know how to do it, try asking one of the AI engines something like:

    With PowerShell, please read a CSV one line at a time and, excluding lines with too many commas, write the results to another CSV file.

    The result may not be perfect but it should get you started.

    It is a SQL question.  Isn't there a way for SQL Server to exclude the record when importing a CSV file?

  • water490 wrote:

    Isn't there a way for SQL Server to exclude the record when importing a CSV file?

    You can write a script component in C#, but the logic to preprocess the CSV will be the same.
