bulk insert fails

  • I am trying to bulk insert from 3 text files.  They all have the same format.  Two of them work fine, but the third one gives me this:

    Server: Msg 4865, Level 16, State 1, Line 1
    Could not bulk insert because the maximum number of errors (10) was exceeded.

    Server: Msg 7399, Level 16, State 1, Line 1
    OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
    OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows returned 0x80004005:  The provider did not give any information about the error.].

    The statement has been terminated.

    I deleted all but one record from the offending text file, but it still gives me the error.  My code looks like this:

    use finance_history

    drop table history

    create table history (
      FUND VARCHAR(3) NOT NULL,
      DEPT VARCHAR(4) NOT NULL,
      OBJ VARCHAR(5) NOT NULL,
      SEQDA DATETIME NOT NULL,
      APPR DECIMAL NOT NULL,
      ADJM DECIMAL NOT NULL,
      ENCUMB DECIMAL NOT NULL,
      RECD DECIMAL NOT NULL,
      PAY DECIMAL NOT NULL,
      VENDOR VARCHAR(5) NOT NULL,
      INVOICE_NUMBER VARCHAR(40),
      CHECK_NUMBER VARCHAR(6),
      BANK VARCHAR(4) NOT NULL,
      JE_NUMBER VARCHAR(6) NOT NULL,
      REC_NUMBER VARCHAR(6) NOT NULL,
      DESCRIPTN VARCHAR(40) NOT NULL,
      PO_NUMBER VARCHAR(5) NOT NULL,
      LINE VARCHAR(3) NOT NULL,
      SOURCE VARCHAR(5) NOT NULL,
      BATCH VARCHAR(6) NOT NULL,
      POSTDA DATETIME NOT NULL,
      TRID VARCHAR(4) NOT NULL,
      FISCAL_YEAR VARCHAR(2) NOT NULL )

    bulk insert history from '\\rosensan\downloads\fgl2004.txt'
     with (fieldterminator = '*')

    BULK INSERT history from '\\Rosensan\DOWNLOADS\EXP2004.TXT'
     WITH (FIELDTERMINATOR = '*')

    bulk insert history from '\\Rosensan\downloads\rev2004.txt'
     With (fieldterminator = '*')

    The first two bulk inserts are fine - the third one gives the error.  If I only do the third one, I still get the error.  I have looked at the text file, and I don't see any unusual characters in it.  The version with one record looks like this (note: this is all one line - the wrapping is from the editor):

    001*0000*41010*31 Jul 2004*        .00*        .00*        .00* 4406756.82*        .00*00000*                                         *0     *    *000000*000000*JULY TAX POSTING FROM QD                *0    *0  *CRR  *TXJULY*14 Sep 2004*001 *04

    Can anyone tell me what might be causing the error?
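
    One way to narrow an error like this down (a sketch, not part of the original post; history_staging and the 100-character widths are just placeholders) is to load the problem file into a loose all-varchar staging table, so that no datetime/decimal conversion or length limit can fail, and then look at what actually arrived:

    -- every column is a wide, nullable varchar, so a row loads even when a
    -- value is too long or is not a valid datetime/decimal
    create table history_staging (
      FUND VARCHAR(100), DEPT VARCHAR(100), OBJ VARCHAR(100), SEQDA VARCHAR(100),
      APPR VARCHAR(100), ADJM VARCHAR(100), ENCUMB VARCHAR(100), RECD VARCHAR(100),
      PAY VARCHAR(100), VENDOR VARCHAR(100), INVOICE_NUMBER VARCHAR(100),
      CHECK_NUMBER VARCHAR(100), BANK VARCHAR(100), JE_NUMBER VARCHAR(100),
      REC_NUMBER VARCHAR(100), DESCRIPTN VARCHAR(100), PO_NUMBER VARCHAR(100),
      LINE VARCHAR(100), SOURCE VARCHAR(100), BATCH VARCHAR(100),
      POSTDA VARCHAR(100), TRID VARCHAR(100), FISCAL_YEAR VARCHAR(100) )

    bulk insert history_staging from '\\Rosensan\downloads\rev2004.txt'
     with (fieldterminator = '*', maxerrors = 1000)

    -- then look for values the real history table would reject
    select max(len(INVOICE_NUMBER)) as max_invoice_len,
           max(len(DESCRIPTN)) as max_descriptn_len,
           sum(case when isdate(SEQDA) = 0 then 1 else 0 end) as bad_seqda_rows
    from history_staging

    If even the staging load fails the same way, the likely suspects are the number of '*'-separated fields per line or the row terminator; adding an explicit rowterminator to the WITH clause (for example rowterminator = '\n') rules out a mismatch with the file's line endings.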

  • I tested your code with that one sample row you posted, and I get a problem with the field for INVOICE_NUMBER, because the data is 41 chars, but the definition is 40:

    Server: Msg 4863, Level 16, State 1, Line 1
    Bulk insert data conversion error (truncation) for row 1, column 11 (INVOICE_NUMBER).

     

    Obviously it could be an issue with converting HTML to raw text and such, but could that be one of the issues?

    When I changed the definition to a varchar(50), I could import that row (a sketch of that change follows below).

    Lowell


    -- Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement with representative data. With your description of the problem, the volunteers here can provide a tested, verifiable solution. Asking the question the right way gets you a tested answer the fastest way possible!
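
    A minimal sketch of the change Lowell describes, assuming the history table from the first post already exists (the approach in the thread was to edit the CREATE TABLE before reloading; widening the column in place does the same thing):

    -- widen INVOICE_NUMBER so a 41-character value fits; varchar(50) is the
    -- width Lowell used, and the column stays nullable as originally defined
    alter table history alter column INVOICE_NUMBER VARCHAR(50)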

  • Thanks for looking at this for me.

    The length of the field is not the problem - if I change it to 50, I still get the error, so I guess that you weren't able to reproduce the problem. 

    I just copied the line out of my original message, and I was able to import it as well.  I guess what I need to do now is see what is different between the file that works and the one that doesn't.  Thank you - I think that this will help.  
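
    To see what is different between the working and failing files, one option (a sketch; raw_lines is an illustrative name, and the delimiter count assumes '*' never appears inside a field) is to pull each file in as whole lines and compare the length and number of field separators per line:

    create table raw_lines ( line VARCHAR(8000) )

    -- no fieldterminator, so each line of the file lands in the single column
    -- (this assumes the data contains no tab characters, the default terminator)
    bulk insert raw_lines from '\\Rosensan\downloads\rev2004.txt'
     with (rowterminator = '\n')

    -- 23 columns in the target table means every good line should contain 22 '*'
    select len(line) - len(replace(line, '*', '')) as star_count,
           count(*) as line_count
    from raw_lines
    group by len(line) - len(replace(line, '*', ''))

    Running the same load against fgl2004.txt or exp2004.txt and comparing the star_count distributions should show quickly whether the failing file has lines with a different number of fields.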
