BULK INSERT - Limitation?

  • Bipin Kadam

    Right there with Babe

    Points: 760

    Hi all,

    Steps to reproduce

    I have created a sample table “CSVTest” in a database, with fields id, first name, and last name.
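    For reference, a minimal sketch of such a table (the column names and types here are my assumption, not from the original post):

    -- Hypothetical definition of the sample table described above.
    CREATE TABLE CSVTest
    (
        id         INT,
        first_name VARCHAR(50),
        last_name  VARCHAR(50)
    );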

    To import the data using BULK INSERT, use the following SQL. I have shared “sample.txt”.

    BULK INSERT CSVTest
    FROM '\\mymachine\import\sample.txt'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    )

    --- Data in sample.txt (NOTICE: \n at the end of the file) ---
    1,James,
    2,BK,
    ------- End of data ----

    It inserts data properly with 2 rows.

    Try the same query, but change the data to:

    --- Data in sample.txt (NOTICE: NO \n at the end of the file & NO data in the last row's last column) ---
    1,James,
    2,BK,
    ------- End of data ----

    PROBLEM: Only the first row gets inserted; the second row is not.

    Try the same query, but change the data to:

    --- Data in sample.txt (NOTICE: NO \n at the end of the file & data EXISTS in the last row's last column) ---
    1,James,
    2,BK,SD
    ------- End of data ----

    It inserts 2 rows 🙂

    One way to solve the problem is to add '\n' at the end of the file, but this is a workaround, not a solution.

    Are there any other ways to solve this?
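    For reference, a quick way to check up front whether a file ends with a newline (a sketch only; it assumes the account has bulk-load permissions and the UNC path is reachable from the server):

    -- Read the raw file as one string and warn if it lacks a trailing newline.
    DECLARE @raw VARCHAR(MAX);
    SELECT @raw = BulkColumn
    FROM OPENROWSET(BULK '\\mymachine\import\sample.txt', SINGLE_CLOB) AS f;

    IF RIGHT(@raw, 1) NOT IN (CHAR(10), CHAR(13))
        PRINT 'No trailing newline - BULK INSERT may drop the last row.';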

  • Devendra (Dev) Shirbad

    SSC-Forever

    Points: 42493

    It’s not a limitation. You are not setting the row terminator properly, and that’s why you are missing a few rows (or resorting to workarounds).

    Try '\r\n' (or ',\r\n') as the row terminator. It should work for your data.

    Let me know if you still face any issue.
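    Spelled out against your query, the suggestion is (only the row terminator changes):

    BULK INSERT CSVTest
    FROM '\\mymachine\import\sample.txt'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\r\n'
    );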

  • Bipin Kadam

    Right there with Babe

    Points: 760

    Hi Dev,

    thanks for the reply.

    I have already tried the '\r\n' approach as the row terminator.

    But it does not work.

    Thanks again!

  • Devendra (Dev) Shirbad

    SSC-Forever

    Points: 42493

    Are you populating an error file? It will give you a hint about the reason.

    BULK INSERT
        [ database_name . [ schema_name ] . | schema_name . ] [ table_name | view_name ]
    FROM 'data_file'
    [ WITH
        (
        [ [ , ] BATCHSIZE = batch_size ]
        [ [ , ] CHECK_CONSTRAINTS ]
        [ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]
        [ [ , ] DATAFILETYPE = { 'char' | 'native' | 'widechar' | 'widenative' } ]
        [ [ , ] FIELDTERMINATOR = 'field_terminator' ]
        [ [ , ] FIRSTROW = first_row ]
        [ [ , ] FIRE_TRIGGERS ]
        [ [ , ] FORMATFILE = 'format_file_path' ]
        [ [ , ] KEEPIDENTITY ]
        [ [ , ] KEEPNULLS ]
        [ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
        [ [ , ] LASTROW = last_row ]
        [ [ , ] MAXERRORS = max_errors ]
        [ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
        [ [ , ] ROWS_PER_BATCH = rows_per_batch ]
        [ [ , ] ROWTERMINATOR = 'row_terminator' ]
        [ [ , ] TABLOCK ]
        [ [ , ] ERRORFILE = 'file_name' ]
        ) ]

    Per BOL:

    ERRORFILE ='file_name'

    Specifies the file used to collect rows that have formatting errors and cannot be converted to an OLE DB rowset. These rows are copied into this error file from the data file "as is."

    The error file is created when the command is executed. An error occurs if the file already exists. Additionally, a control file that has the extension .ERROR.txt is created. This references each row in the error file and provides error diagnostics. As soon as the errors have been corrected, the data can be loaded.
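    A minimal sketch of adding ERRORFILE to your statement (the error-file path is my assumption; note from the BOL text above that the file must not already exist when the command runs):

    BULK INSERT CSVTest
    FROM '\\mymachine\import\sample.txt'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        ERRORFILE = '\\mymachine\import\Error.txt'  -- must not already exist
    );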

  • Bipin Kadam

    Right there with Babe

    Points: 760

    Hi Dev,

    Thanks for the reply.

    I have tried your ERRORFILE suggestion in the SQL query.

    But as the import executes successfully, the file does not get created.

    The file only gets created when there is an error while importing the data.

    Thanks !

  • Devendra (Dev) Shirbad

    SSC-Forever

    Points: 42493

    It will create error files if it rejects any row for inconsistency. If your data is clean, it won’t create them. Verify it against the scenarios where you lose some of the rows; the details of those rows will be in the error files.

  • Bipin Kadam

    Right there with Babe

    Points: 760

    Yep, I verified whether the error file is created.

    Case 1:

    Added incorrectly formatted data to the file.

    The error file got created.

    Case 2:

    Valid data, as follows:

    BULK INSERT CSVTest
    FROM '\\mymachine\import\sample.txt'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        ERRORFILE = '\\vmvertex11\import\Error.txt'
    )

    --- Data in sample.txt (NOTICE: \n at the end of the file) ---
    1,James,
    2,BK,
    ------- End of data ----

    The error file does not get created this time, but it inserts only 2 rows.

  • Devendra (Dev) Shirbad

    SSC-Forever

    Points: 42493

    Can you try your code with the following data?

    --- Data in sample.txt ---
    1,James,
    2,BK,
    3,
    ------- End of data ----

  • Bipin Kadam

    Right there with Babe

    Points: 760

    --- File Data ---
    1,James,
    2,bk,
    3,
    -----------------

    Query:

    BULK INSERT CSVTest

    FROM '\\vmvertex11\import\sample.txt'

    WITH

    (

    FIELDTERMINATOR = ',',

    ROWTERMINATOR = '',

    ERRORFILE = '\\vmvertex11\import\Error'

    )

    ---

    2 files were generated:

    File 1: "Error"
    File 2: "Error.Error.txt"

    File 1 contents:
    3,3,

    File 2 contents:
    Row 3 File Offset 17 ErrorFile Offset 0 - HRESULT 0x80004005

  • Devendra (Dev) Shirbad

    SSC-Forever

    Points: 42493

    Did you conclude anything based on this output?

    2 files were generated:
    File 1: "Error"
    File 2: "Error.Error.txt"

    These are the expected error files.

    File 1 contents:
    3,3,

    It indicates the error is in row #3 and gives the complete rejected row...

    File 2 contents:
    Row 3 File Offset 17 ErrorFile Offset 0 - HRESULT 0x80004005

    It explains where in the file row #3's error occurred: assuming CRLF line endings, "1,James," plus CRLF is 10 bytes and "2,bk," plus CRLF is 7 bytes, so file offset 17 is exactly where row 3 starts.

    Now back to your question. When the file ends with a blank line (a trailing row terminator), BULK INSERT doesn’t consider that blank line a valid row at all. It reads the last data row's column terminator, then the row terminator (then EOF), and finishes the operation with a success message.

    When the last line contains some data but no row terminator (as in my sample data), it errors out if it finds an inconsistency.

    If you need any more clarification, please feel free to come back.

  • Bipin Kadam

    Right there with Babe

    Points: 760

    Yep, understood your comments.

    Is there any solution for:

    --- Data in sample.txt (NOTICE: NO \n at the end of the file & NO data in the last row's last column) ---
    1,James,
    2,BK,
    ------- End of data ----

    BULK INSERT imports only 1 row from it.

    As per this discussion, to import 2 rows we have to add a blank line, i.e. the data should be:

    --- sample.txt (NOTICE: there is a \n at the end of the file & NO data in the last row's last column) ---
    1,James,
    2,BK,
    ------- End of data ----

    It inserts 2 rows.
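    To make the difference concrete, the final bytes of the two files (assuming \n line endings) are:

    2,BK,\n  ->  32 2C 42 4B 2C 0A   (terminator present: both rows load)
    2,BK,    ->  32 2C 42 4B 2C      (no terminator: the last row is dropped)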

  • Devendra (Dev) Shirbad

    SSC-Forever

    Points: 42493

    Try the following in Notepad...

    Insert some text (any text) and hit the Enter key (new line). What’s the output?

    Until you hit the Enter key, the line can run to EOF. But it’s not a valid row or line (per the BULK INSERT format) because BULK INSERT is not able to find a row terminator.

  • Bipin Kadam

    Right there with Babe

    Points: 760

    Yes, it is not able to find the end of the line.

    But I can't give this reason to the user who is importing the file.

    There should be some way to read this too, shouldn't there?

    OR

    This is a limitation of BULK INSERT.

  • Devendra (Dev) Shirbad

    SSC-Forever

    Points: 42493

    I believe it’s an exported CSV (or tab-delimited) file, so by default it will have a row terminator; you don’t need to force it.

    Otherwise, you can put this case, with explanations, in front of your customer / management. And if you must load such files anyway, one possible workaround is sketched below.
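    A hedged sketch of that workaround (not something covered above): read the whole file as a single string, append the missing terminator, and split it yourself. It assumes SQL Server 2016+ (for STRING_SPLIT), bulk-load permissions, exactly two commas per line as in the sample data, and the column names id / first_name / last_name, which are my assumption:

    -- Load the entire file into one variable.
    DECLARE @raw VARCHAR(MAX);
    SELECT @raw = BulkColumn
    FROM OPENROWSET(BULK '\\mymachine\import\sample.txt', SINGLE_CLOB) AS f;

    -- Append the newline the exporter forgot, so the last row is not lost.
    IF RIGHT(@raw, 1) <> CHAR(10)
        SET @raw = @raw + CHAR(10);

    -- Split into lines, then into the three comma-separated fields.
    INSERT INTO CSVTest (id, first_name, last_name)
    SELECT
        LEFT(line, p1 - 1),
        SUBSTRING(line, p1 + 1, p2 - p1 - 1),
        SUBSTRING(line, p2 + 1, LEN(line) - p2)
    FROM (
        SELECT REPLACE(value, CHAR(13), '') AS line  -- strip stray carriage returns
        FROM STRING_SPLIT(@raw, CHAR(10))
    ) AS lines
    CROSS APPLY (SELECT CHARINDEX(',', line) AS p1) AS a
    CROSS APPLY (SELECT CHARINDEX(',', line, p1 + 1) AS p2) AS b
    WHERE LEN(line) > 0;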

  • Bipin Kadam

    Right there with Babe

    Points: 760

    Yep, currently I am going with the second approach you suggested, that is:

    "Put the case, with explanations, in front of your customer / management."

    Let's see how it goes 🙂

    Thanks for the help and time!!!
