June 8, 2025 at 11:03 pm
Hi everyone
I have one record in the source CSV file that is not letting BULK INSERT load the file. If I remove the record from the CSV, the file loads. However, manually removing the record each time the code runs is not efficient. Ideally, I would like to instruct BULK INSERT to ignore that record.
Code:
EXEC ('BULK INSERT #SECURITY_NEW
       FROM ''' + @FILEPATH + '''
       WITH (FIRSTROW = 2, FORMAT = ''CSV'', MAXERRORS = 0);'
);
Temp table:
CREATE TABLE #SECURITY_NEW
(
[CONTRACT] [varchar](MAX) NOT NULL,
-- more fields
)
I would like to have BULK INSERT ignore all records where [CONTRACT] LIKE '%ABC%'. Is there a way to do this with BULK INSERT? If not, what are my options?
Thank you
June 8, 2025 at 11:15 pm
I came across OPENROWSET. This looks promising but I have never used it before. Can this be used for my purpose? If yes, how would I use it? I am not a strong coder so I need some help.
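A rough sketch of how OPENROWSET(BULK ...) could be used here. Note that the rowset form of OPENROWSET requires a format file describing the columns; the paths and the format file name below are hypothetical placeholders, and the WHERE clause assumes you want to skip rows whose CONTRACT contains ABC:

```sql
-- Sketch only: OPENROWSET(BULK ...) returns the file as a rowset,
-- so unwanted rows can be filtered out in a WHERE clause.
-- 'C:\data\security.csv' and 'C:\data\security.fmt' are placeholder
-- paths; adjust them (or splice in @FILEPATH via dynamic SQL).
INSERT INTO #SECURITY_NEW ([CONTRACT] /* , more fields */)
SELECT src.[CONTRACT] /* , more fields */
FROM OPENROWSET(
         BULK 'C:\data\security.csv',
         FORMAT     = 'CSV',
         FORMATFILE = 'C:\data\security.fmt',
         FIRSTROW   = 2
     ) AS src
WHERE src.[CONTRACT] NOT LIKE '%ABC%';
```

One caveat: if the problem row is malformed (e.g. extra commas), it may still fail to parse before the WHERE clause is ever applied, since OPENROWSET uses the same bulk-load parser as BULK INSERT.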
Thank you
June 9, 2025 at 6:26 am
What if you change MAXERRORS = 0 to MAXERRORS = 1 (or whatever your upper limit is)?
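For reference, that change would look like this in the dynamic SQL from the first post (only the MAXERRORS value differs):

```sql
-- Same BULK INSERT, but tolerating up to 1 bad row instead of 0.
EXEC ('BULK INSERT #SECURITY_NEW
       FROM ''' + @FILEPATH + '''
       WITH (FIRSTROW = 2, FORMAT = ''CSV'', MAXERRORS = 1);'
);
```

Note that MAXERRORS only skips rows that fail to load; it will not filter out rows that parse cleanly but match some value pattern.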
June 9, 2025 at 2:48 pm
Ensure the file is a correctly built CSV file. If it is, then you can load it using the FORMAT = 'CSV' option above; if it is not, get whoever generates the file to do it properly.
June 9, 2025 at 3:42 pm
frederico_fonseca wrote: Ensure the file is a correctly built CSV file. If it is, then you can load it using the FORMAT = 'CSV' option above; if it is not, get whoever generates the file to do it properly.
The file comes from a data provider. They are a big company so it will take a while for them to look, investigate, and update their code. I don't want to keep waiting for them. That is why I am looking at excluding that problem record from even loading in the first place. Do you know how I can do that?
June 9, 2025 at 8:46 pm
frederico_fonseca wrote: Ensure the file is a correctly built CSV file. If it is, then you can load it using the FORMAT = 'CSV' option above; if it is not, get whoever generates the file to do it properly.
The file comes from a data provider. They are a big company so it will take a while for them to look, investigate, and update their code. I don't want to keep waiting for them. That is why I am looking at excluding that problem record from even loading in the first place. Do you know how I can do that?
This is not really a SQL question. If you do not know how to do it, try asking one of the AI engines something like:
With PowerShell, please read a CSV one line at a time and, excluding lines with too many commas, write the results to another CSV file.
The result may not be perfect but it should get you started.
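Something along these lines is roughly what such a prompt might produce. This is a sketch only: 'in.csv', 'out.csv', and the expected comma count of 5 are placeholders you would adjust to your file.

```powershell
# Sketch: copy a CSV, skipping lines with more commas than expected.
# 'in.csv', 'out.csv', and the threshold (5) are placeholders.
# Note: a naive split miscounts commas inside quoted fields.
$expected = 5
Get-Content 'in.csv' |
    Where-Object { ($_ -split ',').Count - 1 -le $expected } |
    Set-Content 'out.csv'
```

You would then point BULK INSERT at the cleaned file instead of the original.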
June 10, 2025 at 4:23 am
water490 wrote: frederico_fonseca wrote: Ensure the file is a correctly built CSV file. If it is, then you can load it using the FORMAT = 'CSV' option above; if it is not, get whoever generates the file to do it properly.
The file comes from a data provider. They are a big company so it will take a while for them to look, investigate, and update their code. I don't want to keep waiting for them. That is why I am looking at excluding that problem record from even loading in the first place. Do you know how I can do that?
This is not really a SQL question. If you do not know how to do it, try asking one of the AI engines something like:
With PowerShell, please read a CSV one line at a time and, excluding lines with too many commas, write the results to another CSV file.
The result may not be perfect but it should get you started.
It is a SQL question. Isn't there a way for SQL Server to exclude the record when importing a CSV file?