BULK INSERT T-SQL command

  • When using BULK INSERT with the ERRORFILE parameter specified, the error file is deleted when an error is encountered in the data file. Has anyone come across this before?

  • Can you provide the syntax you're using, including the BULK INSERT statement?

  • BULK INSERT <table name> FROM 'e:\datafile.bcp'

    WITH (FORMATFILE = 'e:\formatfile.xml', MAXERRORS = 0, ERRORFILE = 'E:\import_error.txt')

  • Using FORMATFILE: per BOL, you must meet one of these criteria...

    The data file contains more or fewer columns than the table or view.

    The columns are in a different order.

    The column delimiters vary.

    There are other changes in the data format. Format files are typically created by using the bcp utility and modified with a text editor as needed. For more information, see bcp Utility.

    Do you meet one of these criteria for using FORMATFILE? Just a shot.
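    (Side note: as BOL says, format files are typically generated with the bcp utility. A hypothetical example, run through xp_cmdshell so it stays in T-SQL; the table name and settings are made up:)

    -- Generate an XML format file (-x) from the table's current definition.
    -- Assumes xp_cmdshell is enabled and a trusted connection (-T) is available.
    EXEC master.dbo.xp_cmdshell 'bcp dbo.YourTable format nul -c -x -f e:\formatfile.xml -T';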

  • I think you misunderstood; the format file matches the table definition. The question is why BULK INSERT creates an error file and then deletes it from disk once it has finished running.

    For instance, imagine a table with one column that has a data type of DECIMAL(11,2), but one of the rows in the file contains a much larger value, such as 999999999999999999999999999.99999, which won't fit. Running BULK INSERT on this row will fail. While the BULK INSERT command runs, two files are created, but after the transaction is rolled back the files are automatically deleted, making it impossible to see what the error message was and which rows failed.
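    (A minimal repro sketch of this scenario; the table name, file name, and contents are hypothetical:)

    -- Hypothetical repro: DECIMAL(11,2) column, one row in the file overflows it.
    CREATE TABLE dbo.Amounts (Amount DECIMAL(11,2));

    -- e:\amounts.txt contains, one value per line:
    --   123.45
    --   999999999999999999999999999.99999   (overflows DECIMAL(11,2))

    BEGIN TRY
        BEGIN TRANSACTION;
        BULK INSERT dbo.Amounts
        FROM 'e:\amounts.txt'
        WITH (ROWTERMINATOR = '\n', MAXERRORS = 0, ERRORFILE = 'e:\amounts_error.txt');
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        -- Rolling back undoes the transaction's work; per this thread, the
        -- error file created during the load disappears along with it.
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    END CATCH;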

  • This is from BOL

    MAXERRORS = max_errors

    Specifies the maximum number of syntax errors allowed in the data before the bulk-import operation is canceled. Each row that cannot be imported by the bulk-import operation is ignored and counted as one error. If max_errors is not specified, the default is 10.

    You have this option set to 0, so you are saying that if an error occurs, the insert should be canceled. This may delete the file because the insert was canceled with no errors written to disk. Try bumping this number up; the default is 10.
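    (For instance, the earlier statement with MAXERRORS raised; the value is illustrative:)

    BULK INSERT <table name> FROM 'e:\datafile.bcp'
    WITH (FORMATFILE = 'e:\formatfile.xml', MAXERRORS = 50, ERRORFILE = 'E:\import_error.txt');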

  • I did, but it still deletes the file.

  • Does 'E:\import_error.txt' exist prior to running the bulk insert? The file should not exist before running the insert.
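    (If leftovers from an earlier run are a possibility, one hedged way to clear them first, assuming xp_cmdshell is enabled:)

    -- Remove any leftover error files from a previous run; BULK INSERT also
    -- writes a companion control file with an .Error.Txt suffix.
    EXEC master.dbo.xp_cmdshell 'del E:\import_error.txt';
    EXEC master.dbo.xp_cmdshell 'del E:\import_error.txt.Error.Txt';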

  • No, it doesn't. It gets created and then deleted when the transaction is rolled back.

  • How are you running the procedure: from an SSMS query window or from the command line? And do you have the insert wrapped in a transaction with a TRY/CATCH or XACT_ABORT?

  • I get a similar problem, but in the exact opposite way. When I run:

    BULK INSERT Import
    FROM 'C:\Import.txt'
    WITH (FIELDTERMINATOR = '","', ROWTERMINATOR = '"',
          FIRSTROW = 3522350, LASTROW = 3522360,
          MAXERRORS = 100, ERRORFILE = 'C:\Errors.txt')

    I get this message:

    Msg 4861, Level 16, State 1, Line 2
    Cannot bulk load because the file "C:\Errors.txt" could not be opened. Operating system error code 80(The file exists.).

    Msg 4861, Level 16, State 1, Line 2
    Cannot bulk load because the file "C:\Errors.txt.Error.Txt" could not be opened. Operating system error code 80(The file exists.).

    ...but the file doesn't exist prior to running. It's as if it first creates the error files, finds an error record, and then tries to create the error files a second time. I'm running SQL Express v9.0 sp1.

    I tried it again on a range that I know has no errors:

    BULK INSERT Import
    FROM 'C:\Import.txt'
    WITH (FIELDTERMINATOR = '","', ROWTERMINATOR = '"',
          FIRSTROW = 1, LASTROW = 10,
          MAXERRORS = 100, ERRORFILE = 'C:\Errors.txt')

    and got the same message.

    Any help is appreciated...

    -Jesse

  • Luk (1/28/2008)


    No, it doesn't. It gets created and then deleted when the transaction is rolled back.

    Heh... listen to what you just said... "it gets created... and then deleted WHEN THE TRANSACTION IS ROLLED BACK". That's the whole purpose of rolling back... undo everything that was done by the transaction.

    If you have an error log, why are you rolling a Bulk Insert back? You should be doing a BULK INSERT into a STAGING TABLE and making the decision as to whether to copy that data from the staging table to the final table long before you ever start a transaction.

    Recommend you change your process to use a staging table...
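    (For illustration, a rough sketch of that staging-table pattern; the table, column, and file names are made up, and TRY_CONVERT requires SQL Server 2012 or later:)

    -- Load into a loosely typed staging table first, outside any transaction,
    -- so the error file survives and the bad rows can be inspected.
    CREATE TABLE dbo.Amounts_Staging (Amount VARCHAR(50));

    BULK INSERT dbo.Amounts_Staging
    FROM 'e:\datafile.bcp'
    WITH (MAXERRORS = 50, ERRORFILE = 'E:\import_error.txt');

    -- Decide what to keep, then move only the clean rows in their own transaction.
    BEGIN TRANSACTION;
    INSERT INTO dbo.Amounts (Amount)
    SELECT CONVERT(DECIMAL(11, 2), Amount)
    FROM dbo.Amounts_Staging
    WHERE TRY_CONVERT(DECIMAL(11, 2), Amount) IS NOT NULL;  -- 2012+; older versions need ISNUMERIC or similar checks
    COMMIT TRANSACTION;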

    --Jeff Moden



  • Jesse McLain (2/11/2008)


    I get a similar problem, but in the exact opposite way. When I run:

    BULK INSERT Import
    FROM 'C:\Import.txt'
    WITH (FIELDTERMINATOR = '","', ROWTERMINATOR = '"',
          FIRSTROW = 3522350, LASTROW = 3522360,
          MAXERRORS = 100, ERRORFILE = 'C:\Errors.txt')

    Field terminator cannot be more than 1 character long without a format file...

    Looks like you're trying to import a "real" CSV file... if you provide the table schema (as a runnable CREATE TABLE command), the record layout, and attach 50 or so rows of data, perhaps we can help a bit...
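    (Pending those details, a hedged workaround sketch, different from the format-file route: load each raw line into one wide staging column and strip the quotes in T-SQL. The table name and two-column layout are guesses:)

    -- Stage each raw line, then parse in T-SQL instead of fighting
    -- multi-character terminators in BULK INSERT itself.
    CREATE TABLE dbo.Import_Raw (RawLine VARCHAR(8000));

    BULK INSERT dbo.Import_Raw
    FROM 'C:\Import.txt'
    WITH (ROWTERMINATOR = '\n', MAXERRORS = 100, ERRORFILE = 'C:\Errors.txt');

    -- Example cleanup for a hypothetical two-column row like "a","b":
    SELECT LEFT(s, CHARINDEX('","', s) - 1)            AS Col1,
           SUBSTRING(s, CHARINDEX('","', s) + 3, 8000) AS Col2
    FROM (SELECT SUBSTRING(RawLine, 2, LEN(RawLine) - 2) AS s  -- strip the outer quotes
          FROM dbo.Import_Raw) d
    WHERE CHARINDEX('","', s) > 0;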

    --Jeff Moden



  • I didn't realize that your (Jeff Moden's) first response referred to the original poster. My bad. :ermm:
