Error importing large Excel columns

  • Hi,

    While importing large Excel files with large columns (over 1020 characters!), I get the following error:

    Error Description: Data for source column 18 ('DESCRIPTION') is too large for the specified buffer size.

    The destination table column is varchar(1500), so there is enough space there.

    The source column data has some carriage returns in it.

    • Is there any limitation on the size of an Excel column to be imported?
    • I've set the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\3.5\Engines\Excel\TypeGuessRows to 0, which has solved several other issues in the past (a scripted version of that change is sketched after this list).
    • Do you know whether the carriage returns cause this problem? (When I save the Excel file as tab-delimited text, all those CR characters prevent me from importing my data.)
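
    For what it's worth, a minimal sketch of that registry change in Python (an assumption on my part: the standard winreg module and administrative rights, since the value lives under HKEY_LOCAL_MACHINE):

        # Minimal sketch: set the Jet 3.5 TypeGuessRows value to 0 so that
        # Jet scans all rows (up to 16384) when guessing column types.
        # Requires rights to write to HKEY_LOCAL_MACHINE.
        import winreg

        KEY_PATH = r"SOFTWARE\Microsoft\Jet\3.5\Engines\Excel"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "TypeGuessRows", 0, winreg.REG_DWORD, 0)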

    Any idea?



    Bye
    Gabor

  • Did you read this: http://support.microsoft.com/default.aspx?scid=kb;en-us;281517?

    By default, the value for this key is 8. Hence, the provider scans the first 8 rows of the source data to determine the data types for the columns. If any field looks like text and the length of data is more than 255 characters, the column is typed as a memo field. So, if there is no data with a length greater than 255 characters in the first 8 rows of the source, Jet cannot accurately determine the nature of the data type.

    What I remember from my days as a developer is that even though that setting exists, it doesn't entirely work. I think even setting it to 0 only made it scan the first 16 rows.

    A second way to work around this problem (without modifying the registry) is to make sure that rows whose fields contain data of 255 characters or more are present in the first 8 rows of the source data file.
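
    If it helps, a minimal sketch of that rearrangement in Python (file names are hypothetical; it assumes a tab-delimited export with a header row and quoted long fields, since unquoted embedded carriage returns would break any line-based parser):

        # Move every row that contains a field longer than 255 characters to
        # the top of the file, so Jet sees long data within the first 8 rows
        # and types those columns as memo.
        import csv

        with open("export.txt", newline="", encoding="latin-1") as f:
            reader = csv.reader(f, delimiter="\t")
            header = next(reader)
            rows = list(reader)

        long_rows, short_rows = [], []
        for row in rows:
            target = long_rows if any(len(f) > 255 for f in row) else short_rows
            target.append(row)

        with open("export_reordered.txt", "w", newline="", encoding="latin-1") as f:
            writer = csv.writer(f, delimiter="\t")
            writer.writerow(header)
            writer.writerows(long_rows + short_rows)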

    Don't know if it helps but...


    //Hanslindgren

  • I know about this problem.

    That's why I have specified in my problem description that I HAVE MODIFIED the registry entry to 0, which, based on the MS Knowledge Base article, should make Jet scan the first 16384 rows of the Excel file.

    So this should not be the problem, AFAIK.



    Bye
    Gabor

  • I understood that; that's why I stated that the setting doesn't entirely work.

    One of the reasons I have for stating this comes from Microsoft (http://support.microsoft.com/default.aspx?scid=kb;EN-US;257819):

    However, due to a bug in the ODBC driver, specifying the Rows to Scan (MaxScanRows) setting currently has no effect. In other words, the Excel ODBC driver (MDAC 2.1 and later) always scans the first 8 rows in the specified data source in order to determine each column's datatype.

    //Hans

  • You're great, Hans!

    I have rearranged the data and it works now!

    Of course I ran into some problems, because I have 2 columns where the data can be over 1000 characters long, and of course I fell into the trap. But I've solved that issue.

    It has solved my problem for the big load of 51K+ rows, but my daily 100-row load is not solved. And a bigger mess is that the daily load is done by administrative staff who are not really able to handle data manipulation in Excel (without any warranty against data loss!).

    So I'll have to find an elegant solution to the problem.

    Any idea?



    Bye
    Gabor

  • Thanks! Always glad if I can help!

    Maybe your second problem can be solved too. Is this about loading data into SQL Server, or still an Excel export? How is it being done right now, and is it failing in the same way as the data export?

    I'll be back tomorrow.

    Ciao.

    //Hans

  • The data is originally exported from a DB2 database somewhere in Germany and sent to our site during the night. The export itself should not have any issues.

    The operators have to load this file into our SQL Server database for reporting purposes.

    So I expect that some rows are not imported into our reporting database, as the max error count is set to 9999. The rows containing errors are, I suppose, simply skipped rather than imported.

    Therefore I will have to find a solution. Maybe I will have to write a small program which reads and loads the data, but I don't really want to do that if there is already a tool like DTS which should do the job.
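
    To show what such a small program could look like, a minimal sketch in Python (the connection string, file, table, and column names are all hypothetical; it assumes the pyodbc module and a tab-delimited export with quoted long fields):

        # Read the nightly export and insert it into SQL Server directly,
        # bypassing Jet's type guessing and its 255-character trap.
        import csv
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=reportsrv;DATABASE=reporting;"
            "Trusted_Connection=yes;"
        )
        cur = conn.cursor()

        with open("db2_export.txt", newline="", encoding="latin-1") as f:
            reader = csv.reader(f, delimiter="\t")
            next(reader)  # skip the header row
            for row in reader:
                # The destination column is varchar(1500), so long values and
                # embedded carriage returns pass through without truncation.
                cur.execute(
                    "INSERT INTO dbo.DailyReport (ID, DESCRIPTION) VALUES (?, ?)",
                    row[0], row[17],  # column 18 holds the long DESCRIPTION
                )

        conn.commit()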



    Bye
    Gabor
