[Flat File Source [2]] Error: Data conversion failed. The data conversion for column "Produce" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

  • I keep getting this error. I have set the "Produce" column to nvarchar(max), but still no luck. Some records are getting loaded, but not all of them.
    Source is a CSV file. I have changed the data type for all columns by going to Show Advanced Editor > Input and Output Properties > Flat File Source Output > Output Columns and changing each to Unicode string [DT_WSTR], length 4000.
    Destination is SQL Server.

    Destination table:

    CREATE TABLE [PMG].[InitialData20180605](
        [CompanyNids] [nvarchar](500) NULL,
        [CompanyName] [nvarchar](500) NULL,
        [RIBITCompanyId] [nvarchar](500) NULL,
        [PMGPIN] [nvarchar](500) NULL,
        [Active] [nvarchar](500) NULL,
        [E-mail] [nvarchar](500) NULL,
        [Website] [nvarchar](500) NULL,
        [BusinesAddressCountryCode] [nvarchar](500) NULL,
        [BusinessAddressStateCode] [nvarchar](500) NULL,
        [BusinessAddressLocality] [nvarchar](500) NULL,
        [BusinessPostalCode] [nvarchar](500) NULL,
        [BussinessAddressLine1] [nvarchar](500) NULL,
        [MailingAddressCountryCode] [nvarchar](500) NULL,
        [MailingAddressStateCode] [nvarchar](500) NULL,
        [MailingAddressLocality] [nvarchar](500) NULL,
        [MailingPostalCode] [nvarchar](500) NULL,
        [MailingAddressLine1] [nvarchar](500) NULL,
        [Fax] [nvarchar](500) NULL,
        [Phone] [nvarchar](500) NULL,
        [Commodity] [nvarchar](max) NULL,
        [PrimaryOperations] [nvarchar](500) NULL,
        [SecondaryOperations] [nvarchar](200) NULL,
        [Logo] [nvarchar](500) NULL,
        [AboutUs] [nvarchar](max) NULL
    )

    What information can I provide here to get some help?
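    A quick way to narrow down truncation failures before the load is to measure the longest value in every column of the file. A minimal sketch, assuming a semicolon-delimited file in UTF-8 (both assumptions; adjust to match the connection manager):

```python
import csv

def longest_fields(path, delimiter=";"):
    """Return {column index: (max length, row number)} for a delimited file,
    so over-length values can be found before SSIS truncates them."""
    longest = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row_num, row in enumerate(csv.reader(f, delimiter=delimiter), start=1):
            for col, value in enumerate(row):
                if len(value) > longest.get(col, (0, 0))[0]:
                    longest[col] = (len(value), row_num)
    return longest

# Flag any column whose longest value would not fit in nvarchar(500)
# (the file name here is a placeholder for the real source file):
# for col, (length, row) in sorted(longest_fields("InitialData20180605.csv").items()):
#     if length > 500:
#         print(f"column {col}: {length} chars at row {row}")
```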

  • The SQL Server table is not the problem here.
    Check the Advanced properties for your Flat File Source component and check the input and output columns.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Phil Parkin - Tuesday, June 5, 2018 12:24 PM

    Check the Advanced properties for your Flat File Source component and check the input and output columns.

    I did and I changed all columns to the same data type.

  • NewBornDBA2017 - Tuesday, June 5, 2018 12:42 PM

    I did and I changed all columns to the same data type.

    You say that some rows get loaded.
    SSIS loads rows from top to bottom, so does the failure always occur in the same place?
    If so, you should be able to work towards identifying the rows in the source file which are failing the load. Viewing these rows in Notepad++ and comparing them with 'good' rows is likely to give you an idea what the problem is.
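    Comparing rows in Notepad++ can miss invisible characters. As a sketch, a small helper that returns the repr() of the lines around a given 1-based line number, so tabs, non-breaking spaces, and stray quotes show up:

```python
def rows_around(path, center, radius=1, encoding="utf-8"):
    """Return (line number, repr of line) pairs for the lines within
    `radius` of a 1-based line number; repr() exposes characters that
    look like plain spaces in a text editor."""
    found = []
    with open(path, encoding=encoding) as f:
        for line_num, line in enumerate(f, start=1):
            if abs(line_num - center) <= radius:
                found.append((line_num, repr(line.rstrip("\n"))))
    return found

# Example: inspect the row that appears to fail and its neighbours
# ("source.csv" is a placeholder for the real file):
# for num, text in rows_around("source.csv", 6929):
#     print(num, text)
```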


  • Phil Parkin - Tuesday, June 5, 2018 12:49 PM

    You say that some rows get loaded.
    SSIS loads rows from top to bottom, so does the failure always occur in the same place?
    If so, you should be able to work towards identifying the rows in the source file which are failing the load. Viewing these rows in Notepad++ and comparing them with 'good' rows is likely to give you an idea what the problem is.

    Yes, the failure occurs in the same place, and that's what I have been looking for, but I don't see anything. For example, the process stops on 6929, so I don't see how different 6930 is from 6929.
    6928 ; ;500532 ;500532 ;inactive ; ; ;US ;PA ;Philadelphia           ;19103 ; ;US ;PA ;Philadelphia         ;19103 ; ;NULL ;NULL ;Berries,Citrus,Deciduous Fruit,Melons,Stone Fruit,Tropical Fruit,Cooking Vegetables,Salad Vegetables,Pineapple ; ;Wholesaler ;No ;
    6929 ; ;161569 ;161569 ;inactive ; ; ;US ;OH ;Columbus             ;43220 ; ;US ;OH ;Columbus             ;43220 ; ;NULL ;NULL ; ; ;Other ;No ;
    6930 ; ;500534 ;500534 ;inactive ; ; ;US ;CA ;Hacienda Heights ;91745 ; ;US ;CA ;Hacienda Heights ;91745 ; ;NULL ;NULL ;Berries,Citrus,Deciduous Fruit,Melons,Stone Fruit,Tropical Fruit,Cooking Vegetables,Salad Vegetables ; ;Wholesaler,Retailer ;No ;
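    The other half of the error message, "no match in the target code page", comes from characters the destination code page cannot represent, and those can be invisible in a plain listing like the rows above. A sketch that reports them, assuming the target is Windows-1252 (an assumption; check the destination's collation/code page):

```python
def unmappable_chars(text, codepage="cp1252"):
    """Return (position, character) pairs in `text` that cannot be
    encoded in the given code page -- the usual cause of the
    'no match in the target code page' part of the SSIS error."""
    bad = []
    for pos, ch in enumerate(text):
        try:
            ch.encode(codepage)
        except UnicodeEncodeError:
            bad.append((pos, ch))
    return bad
```

    Running each suspect line through this will point at the exact character, if code-page mismatch (rather than length) is the cause.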

  • Ok, so what I did was I deleted record 6930 from the flat file and ran the job again, and it still stopped at 6929. Then I deleted 6929 and it processed 6930 and then stopped. From the way I see the results in the table, it is only processing 1000 records at a time and then stopping. How do I fix this new issue?

  • NewBornDBA2017 - Tuesday, June 5, 2018 1:11 PM

    Ok, so what I did was I deleted record 6930 from the flat file and ran the job again, and it still stopped at 6929. Then I deleted 6929 and it processed 6930 and then stopped. From the way I see the results in the table, it is only processing 1000 records at a time and then stopping. How do I fix this new issue?

    It processes the rows in batches, so just because x rows get inserted successfully does not necessarily mean that there is an error on row x + 1. It means that there is a problem in the next batch of rows.

    Try importing and parsing the file into Excel and looking there too.
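    Because rows are committed in batches, the offending row can sit anywhere in the failed batch, not at the exact row count where the load stopped. One option is to validate the whole file up front and list every suspect field; a sketch, again assuming a semicolon delimiter and a Windows-1252 target (both assumptions):

```python
import csv

def find_bad_fields(path, max_len=500, delimiter=";", codepage="cp1252"):
    """Yield (row number, column index, reason) for every field that is
    longer than `max_len` or contains a character outside the target
    code page, across the entire file."""
    with open(path, newline="", encoding="utf-8") as f:
        for row_num, row in enumerate(csv.reader(f, delimiter=delimiter), start=1):
            for col, value in enumerate(row):
                if len(value) > max_len:
                    yield row_num, col, "too long"
                else:
                    try:
                        value.encode(codepage)
                    except UnicodeEncodeError:
                        yield row_num, col, "not in code page"
```

    Running this over the source file lists every row that could fail, regardless of which batch it lands in.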

