Tab-Delimited Bulk Insert Gave Odd Results

  • I created a table to import a tab-delimited text file and erroneously left out a field. It took me a while to realize that, though, because SQL Server slurped up the data without throwing any errors. I found the problem when I saw that the last field in every record had an odd gap at the second position of what should have been a variable-length ID with no spaces. The last two fields of the table should have been:

    INDICATOR int

    ID varchar(10)

    I used this to insert the data:

    BULK INSERT junk FROM 'G:\file.txt' WITH

    (ROWTERMINATOR = '0x0a', FIELDTERMINATOR = '\t');

    I left off the INDICATOR field by mistake when I created the table. When I noticed the gap, I used the ASCII function to figure out what it was:

    SELECT ASCII(SUBSTRING(ID, 2, 1)) FROM junk;

    It came back 9, which is a tab. So when BULK INSERT ran out of table columns, it just put the last two fields of data into the last column, complete with the tab in between them. I tested this, and it will do that without throwing any errors for as many extra fields as I tried to jam into a single, last column (as long as that column was large enough to hold them).
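    Here's a minimal sketch of what I'm describing, with made-up file contents (assumes a single-digit INDICATOR, so the tab lands at position 2 of ID, matching the gap I saw):

    ```sql
    -- Hypothetical repro: the table is missing the INDICATOR column,
    -- so each file row has one more tab-delimited field than the table has columns.
    CREATE TABLE junk (
        SOMEVAL int,
        ID      varchar(10)  -- wide enough to absorb INDICATOR + tab + the real ID
    );

    -- Assume G:\file.txt holds rows like: 42<TAB>7<TAB>ABC123  (0x0a row terminator)
    BULK INSERT junk FROM 'G:\file.txt' WITH
        (ROWTERMINATOR = '0x0a', FIELDTERMINATOR = '\t');

    -- Per the behavior above, ID now holds '7<TAB>ABC123', so the
    -- second character of ID comes back as 9 (tab):
    SELECT ID, ASCII(SUBSTRING(ID, 2, 1)) AS second_char
    FROM junk;
    ```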

    Given how finicky SQL is about accepting data, this really surprises me. Does anyone have any insights to offer on this phenomenon?

    Thanks in advance.

    Tom
