The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

  • Hi,

    I am trying to transfer data from a flat file to SQL Server. When I run the package on the local server (network server) it works fine. But when I use it to transfer the data to the online server, it starts, shows 2771 rows transferred, and then just stays there. When I stop the execution I get the following errors:


    [DTS.Pipeline] Error: The pipeline received a request to cancel and is shutting down.


    [Loose Diamond File [1]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.


    [DTS.Pipeline] Error: The PrimeOutput method on component "Loose Diamond File" (1) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.


    [DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.


    Can anyone help me figure out what the problem is?

    Thanks in advance.

  • I wonder if we should check the input row buffer for a null value and set a flag to redirect the row to an error table of some sort.
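
    At the SQL layer, the same idea can be sketched as a staging-table split: rows with a NULL key go to an error table, and only clean rows go on to the destination. All table and column names below are hypothetical:

    -- Redirect rows whose key is NULL to an error table.
    INSERT INTO dbo.load_errors (col1, col2, col3)
    SELECT col1, col2, col3
    FROM dbo.stg_flat_file
    WHERE key_col IS NULL;

    -- Load only the rows with a valid key into the destination.
    INSERT INTO dbo.destination (col1, col2, col3)
    SELECT col1, col2, col3
    FROM dbo.stg_flat_file
    WHERE key_col IS NOT NULL;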

  • This is not a solution but another example of the same error.

    I'm writing a package that compares two tables of the same name in different databases. If the data exists in both tables I want to update the destination table with changes from the source table.

    If the data does not exist in the destination table, I want to insert it. I've written a query to select the correct data (SSIS Lookup Query.jpg) and tested that it works (SSIS Lookup Preview.jpg).

    I'm new to this, so I don't know whether the lookup should be a SELECT or an UPDATE, but other than that I would have thought this should work?
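
    For reference, the update-if-exists / insert-if-missing logic described here can be sketched in plain T-SQL; the database, table, and column names below are hypothetical:

    -- Update rows that already exist in the destination.
    UPDATE dst
    SET dst.col1 = src.col1,
        dst.col2 = src.col2
    FROM DestDB.dbo.MyTable AS dst
    INNER JOIN SourceDB.dbo.MyTable AS src
        ON dst.key_PK = src.key_PK;

    -- Insert rows that are missing from the destination.
    INSERT INTO DestDB.dbo.MyTable (key_PK, col1, col2)
    SELECT src.key_PK, src.col1, src.col2
    FROM SourceDB.dbo.MyTable AS src
    LEFT JOIN DestDB.dbo.MyTable AS dst
        ON src.key_PK = dst.key_PK
    WHERE dst.key_PK IS NULL;

    A later reply in this thread uses the same LEFT JOIN ... IS NULL pattern for the insert half.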

  • Does anyone have any clue? I'm stuck on this with no idea what's wrong. I'm having the same problem as Grosshopper described.

  • Hello, I just had the exact same problem. I stopped using the "fast load" mode and it is working just fine now. Hope it helps.

    Regards,

  • I did the same thing, just changing the data access mode from 'Table or view - fast load' to 'Table or view'. It's working fine now.

  • Hey...

    I was also facing the same problem for the same task, so I finally tried this query in an Execute SQL Task in SSIS and it works properly:

    -- Insert only the rows from table1 that do not already exist in
    -- table2 (anti-join on the primary key).
    INSERT INTO table2 (col1, col2, col3)
    SELECT A.col1, A.col2, A.col3
    FROM table1 A
    LEFT JOIN table2 B ON A.key_PK = B.key_PK
    WHERE B.key_PK IS NULL

  • This happens when you auto-generate a Data Flow step with the Import/Export Wizard. The wizard doesn't distinguish between a view and a table, so it may script a view as if it were a table. Check your source and destination steps for any that are trying to copy a view to a view.
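
    If you want to verify this from T-SQL, you can list the views in the source database and spot any object the wizard scripted as a table; a minimal sketch, run against the source database:

    -- List every view in the current database.
    SELECT TABLE_SCHEMA, TABLE_NAME
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'VIEW';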

  • I also had the same problem.

    Has anyone found a solution?

    I've tried changing the data access mode of the OLE DB Destination from "Table or view - fast load" to "Table or view", but I still got the same error. :crazy:

    At the beginning of execution my package runs well; I get the error in the middle of execution, after some records have already been inserted successfully into the destination table.

    Thanks a lot.

  • In my case the problem occurred only on the items in the data flow task where the "Check Constraints" option was selected for the OLE DB Destination. Once I changed that, I had no issues. In my case I did not want the check, but your case might be different; a query to list the destination's check constraints is sketched below.

    Regards,

    Derald
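
    If you would rather keep constraint checking on and find out what is rejecting rows instead, the destination table's check constraints can be listed from T-SQL; a minimal sketch, with a hypothetical table name:

    -- List the check constraints defined on the destination table.
    SELECT name, definition
    FROM sys.check_constraints
    WHERE parent_object_id = OBJECT_ID('dbo.MyDestinationTable');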

  • In my case this was caused by the following properties:

    [Data Flow Task]

    DefaultBufferMaxRows

    DefaultBufferSize

    [SQLServerDestination]

    MaxInsertCommitSize

    You need to play around with those three parameters to find the correct values for your hardware/software/database configuration. First try the MaxInsertCommitSize parameter: it is set to 0 by default, which means the whole bulk insert is performed as a single transaction (one COMMIT at the end). This can cause buffer issues if you don't have enough memory for a large dataset. In my case it was a transfer of 11 million rows that failed, and I have now set the parameter to 250,000. DefaultBufferMaxRows is set to 10,000 and DefaultBufferSize is set to 104857600 (100 MB). As the articles below advise, try to avoid swapping to disk at all costs. (A rough T-SQL analogy for the commit-size idea is sketched after the links below.)

    Read these articles for some very useful tips:

    Top 10 SQL Server Integration Services Best Practices

    Improving the Performance of the Data Flow
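
    As a rough T-SQL analogy for what MaxInsertCommitSize does, BULK INSERT's BATCHSIZE option commits in batches rather than in one huge transaction; the file path and table name below are hypothetical:

    -- Commit every 250,000 rows instead of holding the whole load
    -- in a single transaction.
    BULK INSERT dbo.MyDestinationTable
    FROM 'C:\data\flatfile.txt'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        BATCHSIZE = 250000,
        TABLOCK
    );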

  • Isn't it the case that Excel is full? (64000 rows exceeded?)

  • Sander Stuurwold-385722 (2/21/2012)

    Isn't it the case that Excel is full? (64000 rows exceeded?)

    Remember that the latest post in this thread is already a few months old.

    Furthermore, Excel is nowhere mentioned in this thread - the OP is talking about a flat file - so no, it probably isn't the case that Excel is full.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • I can't see how this can be an Excel error when Excel is not mentioned anywhere by the OP, and the OP is importing data into SQL Server.

    As I said in my earlier reply, what worked for me was to play around with those three settings. A few months after that post I started getting another related error on the largest of the source tables. In the end I had to change the insert from a SQL Server Destination (BULK insert) to an OLE DB Destination, and the source became a SELECT statement (from the same table as before) with SET ANSI_WARNINGS OFF at the start; the shape of that query is sketched below. For some reason, although the error is in fact a warning, SSIS stops executing the task and logs it as a task error.
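
    A minimal sketch, assuming a hypothetical table name:

    -- Suppress ANSI warnings (such as "Null value is eliminated by an
    -- aggregate or other SET operation") that SSIS would otherwise
    -- surface as a task error.
    SET ANSI_WARNINGS OFF;

    SELECT col1, col2, col3
    FROM dbo.MyLargeSourceTable;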
