Well, an implicit conversion changed column99's data type to NText automatically in the source editor, so I had to change the destination data type to NText to match it. ...
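For what it's worth, here is a sketch of one way to sidestep the implicit NText conversion, assuming the ERP source accepts a standard CAST in the select list (the column names and the length are placeholders):

SELECT
    CAST(column99 AS NVARCHAR(4000)) AS column99,  -- a bounded NVARCHAR generally maps to DT_WSTR in SSIS rather than DT_NTEXT
    column1,
    column2
FROM table1

Whether that is safe depends on how long column99's values can get; anything beyond the cast length would be truncated.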
December 19, 2013 at 12:06 am
My destination is a table in SQL Server. My query is against the source, which is our ERP server. This package pulls data from the ERP source into SQL Server.
The column...
December 18, 2013 at 3:29 am
Sorry, that's my destination.
December 17, 2013 at 4:00 am
Steve, I'm not sure how to provide the table schema.
My table structure is:
[dbo].[table1](
[column1] [nvarchar](3) NULL,
[column2] [nvarchar](4) NULL,
[column3] [nvarchar](10) NULL,
[column4] [datetime] NULL,
[column5] [nvarchar](10) NULL,
[column6] [nvarchar](4) NULL,
[column7] [nvarchar](10) NULL,
[column8] [int] NULL,
[column9] [datetime]...
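One way to pull the exact column definitions from the SQL Server side is to query INFORMATION_SCHEMA.COLUMNS; a minimal sketch, assuming the table sits in the dbo schema:

SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME = 'table1'
ORDER BY ORDINAL_POSITION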
December 17, 2013 at 12:17 am
The source is a data connection on the PC from which the ETL is performed.
I tried various other things, like deleting the mapping between external and output columns in the ADO NET...
December 16, 2013 at 5:33 am
Sorry for the delay... but I followed what you'd said and it returned this:
[ADO NET Source [2]] Error: The ADO NET Source was unable to process the data. Field table-column99 is...
December 14, 2013 at 6:54 pm
Description: The ADO NET Source was unable to process the data. Field table1-column99 is missing an escape character for a quote. Unable to update PK WHERE clause. Error processing data batch. ...
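The wording suggests the provider is hitting an unescaped quote character inside column99's data. Purely as an illustration, and assuming the ERP source supports a REPLACE function, the embedded quotes could be doubled (or stripped) in the select list; the exact escaping the provider expects may differ:

SELECT
    REPLACE(column99, '"', '""') AS column99  -- double embedded double quotes; use REPLACE(column99, '"', '') to strip them instead
FROM table1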
December 12, 2013 at 11:52 pm
I tried doing that too, excluding column99 from the select list in the ADO NET Source, but it still throws the same error.
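Roughly the shape of the select without column99, just to illustrate (placeholder names):

SELECT column1, column2, column3, column4, column5, column6, column7, column8
FROM table1
-- column99 deliberately omitted, yet the ADO NET Source still raises the same error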
December 12, 2013 at 11:05 pm