Insert bulk failed due to a schema change of the target table.

  • Hello Experts.

    This is the first time I have used SqlBulkCopy, and its performance is impressive. However, my application threw a SQL exception after processing about 500,000 records. The error stated "Insert bulk failed due to a schema change of the target table.", which I don't understand: the data is being inserted into the same table the whole time, and I can't see why it processed 500,000 records without issue but not the rest.

    Here is the chunk of VB.Net code that does the bulk copy. Note that MyDestTable has a couple more fields that I did not want mapped; all fields in MySourceDataTable are specified in the mapping.

    ' Bulk copy with explicit column mappings; destination columns
    ' not listed here are left unmapped.
    Dim sqlBulk As New SqlBulkCopy(_connectionString)
    sqlBulk.DestinationTableName = MyDestTable
    sqlBulk.ColumnMappings.Add("DataOwnerID", "DataOwnerID")
    sqlBulk.ColumnMappings.Add("ObjectID", "ObjectID")
    sqlBulk.ColumnMappings.Add("ObjectTypeID", "ObjectTypeID")
    sqlBulk.ColumnMappings.Add("PropertyName", "PropertyName")
    sqlBulk.ColumnMappings.Add("Pvalue", "Pvalue")
    sqlBulk.ColumnMappings.Add("UpdateIDRefreshed", "UpdateIDRefreshed")
    sqlBulk.ColumnMappings.Add("UpdateIDModified", "UpdateIDModified")
    sqlBulk.ColumnMappings.Add("Attribute", "Attribute")
    sqlBulk.ColumnMappings.Add("POrder", "POrder")
    sqlBulk.WriteToServer(MySourceDataTable)
    sqlBulk.Close()

    Has anyone run into this type of error while using SqlBulkCopy? I did a lot of googling but didn't find anything pointing in the right direction.

    Thanks in advance.

  • Here is the detailed error:

    System.Data.SqlClient.SqlException: Insert bulk failed due to a schema change of the target table.
       at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
       at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
       at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)

  • Does this happen repeatedly? That is, if you scratch the table and start over, does it happen again?

    I found http://blogs.msdn.com/b/sqlserverfaq/archive/2009/11/04/s.aspx, although that article is not entirely easy to understand. But it may be worth trying to disable autostats (see the sketch below).
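    (A sketch of how to do that per table; dbo.YourTable is a placeholder, since the real table name appears later in the thread.)

    -- Show the current auto-update-statistics settings for the table,
    -- then turn them off. The database-wide alternative is
    -- ALTER DATABASE ... SET AUTO_UPDATE_STATISTICS OFF.
    EXEC sp_autostats 'dbo.YourTable'
    EXEC sp_autostats 'dbo.YourTable', 'OFF'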

    What does "SELECT @@version" return?

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • Hi there,

    My application is still at the development stage; I am switching from regular INSERT statements to SQL bulk import, and yes, so far the issue occurs consistently: I couldn't get it to run to the end. I deleted the constraint on the table. There are a bit more than 1 million records to import, and it always errors out somewhere around 500K records. Here is the version of SQL Server I am running.

    Microsoft SQL Server 2008 R2 (RTM) - 10.50.1617.0 (X64) Apr 22 2011 19:23:43 Copyright (c) Microsoft Corporation Developer Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)

    Next I am going to remove all the indexes on the table and run the import again, and I will let you know if that helps.

    Thanks for looking at this.

    ***UPDATED: it looks like the indexes were causing this issue; my application was able to process past the 1 million records for the first time. With that, I am looking into disabling the indexes on the affected table prior to doing the bulk copy and rebuilding them afterward (see the sketch below). Hope this will solve the issue.***
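    (A minimal sketch of that disable/rebuild pattern; dbo.YourTable and IX_YourIndex are placeholders. Note that only nonclustered indexes can safely be disabled: disabling the clustered index makes the whole table inaccessible.)

    -- Before the bulk copy: disable each nonclustered index.
    ALTER INDEX [IX_YourIndex] ON [dbo].[YourTable] DISABLE

    -- ... run the bulk copy ...

    -- Afterwards: REBUILD both rebuilds and re-enables the indexes.
    ALTER INDEX ALL ON [dbo].[YourTable] REBUILD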

  • There is a Service Pack 2 out for SQL 2008 R2. I don't know whether that will resolve the issue, but you should be on the latest service pack anyway.

    Disabling indexes also means that you disable some of the autostats, which I suspect is the reason.

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • Erland Sommarskog (7/11/2013)


    There is a Service Pack 2 out for SQL 2008 R2. I don't know whether that will resolve the issue, but you should be on the latest service pack anyway.

    Disabling indexes also means that you disable some of the autostats, which I suspect is the reason.

    My previous test was done by dropping all the indexes, and that worked. Before that I tested the bulk copy with autostats disabled, and that didn't work.

    My next test, which I am working on now: instead of dropping all the indexes and recreating them, I will disable the indexes, run the bulk copy, then re-enable the indexes... will see if this works.

    ***UPDATED: as expected, I am getting this error if I just disable the indexes. It seems they have to be dropped and recreated after the import.***

    System.InvalidOperationException: Cannot access destination table 'xyz'. ---> System.Data.SqlClient.SqlException: The query processor is unable to produce a plan because the index 'IX_xyz' on table or view 'xyz' is disabled.
       at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
       at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
       at System.Data.SqlClient.SqlBulkCopy.CreateAndExecuteInitialQuery()
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
       --- End of inner exception stack trace ---
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
       at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)
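    (An aside on the error above: this is what SQL Server returns when a table cannot be read because an index it depends on, typically the clustered index, is disabled; disabled nonclustered indexes alone don't block reads, but SqlBulkCopy's initial metadata query does read the target table, which is why it fails up front. A quick way to see what is disabled, using the anonymized names from the error:)

    SELECT name, type_desc, is_disabled
    FROM sys.indexes
    WHERE object_id = OBJECT_ID('dbo.xyz')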

  • It would be interesting to try to repro this. Can you share the table and index definitions? I would have use for the data file as well, but I understand if you don't want to share it, in which case I could try to cook one up myself. And, oh, I would also like the BCP command you use.

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • Not a problem; here are the table schema and indexes.

    CREATE TABLE [dbo].[TempTable](
        [DataID] [int] NOT NULL,
        [ID] [bigint] IDENTITY(1,1) NOT NULL,
        [OID] [uniqueidentifier] NOT NULL,
        [TypeID] [uniqueidentifier] NOT NULL,
        [PropertyName] [nvarchar](100) NOT NULL,
        [PropertyValue] [nvarchar](3900) NULL,
        [RefID] [uniqueidentifier] NULL,
        [UpdateID] [uniqueidentifier] NULL,
        [Attribute] [smallint] NULL,
        [SortOrder] [int] NULL,
        [Comment] [tinyint] NULL,
        CONSTRAINT [PK_TempTable] PRIMARY KEY NONCLUSTERED ([DataID] ASC, [ID] ASC)
            WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    ALTER TABLE [dbo].[TempTable] WITH NOCHECK ADD CONSTRAINT [CK_TempTable] CHECK (([DataID]=(1)))
    GO

    ALTER TABLE [dbo].[TempTable] CHECK CONSTRAINT [CK_TempTable]
    GO

    CREATE CLUSTERED INDEX [IX_TempTable] ON [dbo].[TempTable]
        ([TypeID] ASC, [OID] ASC, [Attribute] ASC, [SortOrder] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    CREATE NONCLUSTERED INDEX [IX_TempTable_1] ON [dbo].[TempTable]
        ([OID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    CREATE NONCLUSTERED INDEX [IX_TempTable_2] ON [dbo].[TempTable]
        ([PropertyName] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    CREATE UNIQUE NONCLUSTERED INDEX [IX_TempTable_3] ON [dbo].[TempTable]
        ([ID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    ALTER TABLE [dbo].[TempTable] ADD CONSTRAINT [PK_TempTable] PRIMARY KEY NONCLUSTERED
        ([DataID] ASC, [ID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    Here is my VB.Net code that does the bulk copy. NewDataTable is a DataTable that has the same structure as TempTable, minus the ID and Comment columns.

    Dim sqlBulk As New SqlBulkCopy(connectionString)
    sqlBulk.DestinationTableName = "TempTable"
    sqlBulk.ColumnMappings.Add("DataID", "DataID")
    sqlBulk.ColumnMappings.Add("OID", "OID")
    sqlBulk.ColumnMappings.Add("TypeID", "TypeID")
    sqlBulk.ColumnMappings.Add("PropertyName", "PropertyName")
    sqlBulk.ColumnMappings.Add("Propertyvalue", "Propertyvalue")
    sqlBulk.ColumnMappings.Add("RefID", "RefID")
    sqlBulk.ColumnMappings.Add("UpdateID", "UpdateID")
    sqlBulk.ColumnMappings.Add("Attribute", "Attribute")
    sqlBulk.ColumnMappings.Add("SortOrder", "SortOrder")
    sqlBulk.WriteToServer(NewDataTable)
    sqlBulk.Close()
    NewDataTable.Clear()

  • Didn't you say that you had also tried BCP? I'd rather work with BCP than write a .Net program. So if you could post the BCP command (and any format file you are using), that would be great. I will not be able to look at the case right now anyway, as it is past midnight where I sit.
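    (For reference, since no BCP command has been posted in this thread yet: a minimal character-mode load of a tab-delimited file into this table could look like the line below. Server, database, and file names are placeholders; -T is integrated security, and -c defaults to tab as the field terminator.)

    bcp MyDb.dbo.TempTable in TempTable.tsv -S MyServer -T -c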

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • Erland Sommarskog (7/11/2013)


    Didn't you say that you had also tried BCP? I'd rather work with BCP than write a .Net program. So if you could post the BCP command (and any format file you are using), that would be great. I will not be able to look at the case right now anyway, as it is past midnight where I sit.

    That is OK. I have never written a BCP command before; this was the first time I used bulk copy at all, and only from a .NET program. Due to the complexity of the feed, it needs to process a lot of things before importing into SQL Server.

    FYI: last night I disabled only the nonclustered indexes and let my program run throughout the night. It stopped at 82%, having processed around 6,189,170 records, then all of a sudden threw this error again. I don't understand what the hell is going on; I will have to remove all the indexes, let it run again, and see whether it runs to completion or not.

    System.Data.SqlClient.SqlException: Insert bulk failed due to a schema change of the target table.

  • Any chance that you could write a small console-mode program that reads data from a tab-delimited file and then sends the data to SQL Server with SqlBulkCopy?

    Yes, I could write one myself, but I'm not really fluent in .Net programming, and it looks like tomorrow will be another sunny day (which means that I will not sit inside and look at SQL problems a lot). So if you could help me on that part, I could get a head start. You don't have to verify that the program repros the problem, but I'd be glad if you test it with a short file with a few records.
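    (A minimal sketch of such a console loader, assuming a header-less tab-delimited file containing just the required TempTable columns in this order: DataID, OID, TypeID, PropertyName, PropertyValue. The connection string and file path come from the command line; everything here is illustrative, not tested against the poster's data.)

    Imports System.Data
    Imports System.Data.SqlClient
    Imports System.IO

    Module BulkLoader
        Sub Main(args As String())
            Dim connStr As String = args(0)   ' e.g. "Server=.;Database=MyDb;Trusted_Connection=True"
            Dim filePath As String = args(1)  ' path to the tab-delimited file

            ' Stage the file into a DataTable whose columns match the mappings below.
            Dim tbl As New DataTable()
            tbl.Columns.Add("DataID", GetType(Integer))
            tbl.Columns.Add("OID", GetType(Guid))
            tbl.Columns.Add("TypeID", GetType(Guid))
            tbl.Columns.Add("PropertyName", GetType(String))
            tbl.Columns.Add("PropertyValue", GetType(String))

            ' ReadAllLines is fine for a short test file.
            For Each row As String In File.ReadAllLines(filePath)
                Dim f() As String = row.Split(ControlChars.Tab)
                tbl.Rows.Add(CInt(f(0)), New Guid(f(1)), New Guid(f(2)), f(3), f(4))
            Next

            ' SqlBulkCopy implements IDisposable, so Using disposes the connection for us.
            Using sqlBulk As New SqlBulkCopy(connStr)
                sqlBulk.DestinationTableName = "dbo.TempTable"
                For Each col As DataColumn In tbl.Columns
                    sqlBulk.ColumnMappings.Add(col.ColumnName, col.ColumnName)
                Next
                sqlBulk.WriteToServer(tbl)
            End Using
        End Sub
    End Module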

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • After dropping all indexes and successfully bulk copying the data in, are the index rebuilds successful?

  • So I tried to repro the problem by filling up the table you posted with 10 million rows, which I then exported to a file in native format. Then I used BCP to load that file into the table (which I had dropped and recreated).

    The load has actually not completed yet, but with 8.94 million rows loaded, the error you got has yet to appear. And note that I have all indexes defined on the table.

    The different outcome from what you experience could be due to a number of reasons:

    1) The behaviour is different with BCP than with SqlBulkCopy.

    2) The profile of my test data is different.

    3) You have some different configuration when you run your load.

    4) There are concurrent queries against the table when you load the data. (This could trigger autostats; see the query sketched after this list.)

    5) I have SQL 2008 SP3, you have SQL 2008 R2 RTM. That is, you may be experiencing a problem that has been solved in a service pack, which is why I again encourage you to install SQL 2008 R2 SP2.
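    (A sketch of such a check, run right after a failed load: it lists when each statistics object on the table was last updated, which shows whether autostats fired during the load. The table name is taken from the schema posted above.)

    SELECT s.name, STATS_DATE(s.object_id, s.stats_id) AS last_updated
    FROM sys.stats AS s
    WHERE s.object_id = OBJECT_ID('dbo.TempTable')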

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • Erland Sommarskog (7/14/2013)


    So I tried to repro the problem by filling up the table you posted with 10 million rows, which I then exported to a file in native format. Then I used BCP to load that file into the table (which I had dropped and recreated).

    The load has actually not completed yet, but with 8.94 million rows loaded, the error you got has yet to appear. And note that I have all indexes defined on the table.

    The different outcome from what you experience could be due to a number of reasons:

    1) The behaviour is different with BCP than with SqlBulkCopy.

    2) The profile of my test data is different.

    3) You have some different configuration when you run your load.

    4) There are concurrent queries against the table when you load the data. (This could trigger autostats.)

    5) I have SQL 2008 SP3, you have SQL 2008 R2 RTM. That is, you may be experiencing a problem that has been solved in a service pack, which is why I again encourage you to install SQL 2008 R2 SP2.

    Thanks for looking into this.

    I think you are absolutely right, especially on #4. I know for a fact that our application has a built-in auto-index feature that checks the table periodically and rebuilds the indexes if it finds it necessary, and a concurrent index rebuild on the target would be exactly the kind of schema change the error complains about (see the query sketched below for one way to confirm it). But it is a very complex app, and disabling the auto-reindex may not be an option, as it would impact other functions of the app. I am still investigating how to handle this.
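    (One way to confirm that suspicion: the default trace captures Object:Altered events, so a query along these lines, run after a failed load, should show whether anything altered the table while the bulk copy was running. EventClass 164 is Object:Altered; this is a sketch, not tested on the affected server.)

    SELECT t.StartTime, t.EventClass, t.ObjectName, t.ApplicationName
    FROM sys.fn_trace_gettable(
             (SELECT path FROM sys.traces WHERE is_default = 1), DEFAULT) AS t
    WHERE t.EventClass = 164
      AND t.ObjectName = 'TempTable'
    ORDER BY t.StartTime DESC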

    Again, thank you for looking at this.

  • arnipetursson (7/12/2013)


    After dropping all indexes and successfully bulk copying the data in, are the index rebuilds successful?

    I have not tried to recreate the indexes yet; at this point I am focusing on getting the import process to work. Once it works, I will take care of recreating the indexes, but I don't think that will be an issue. Testing this is painful: this import is very data intensive, and one scheduled import can take a few days to complete, depending on the server's CPU and available memory.

    Thanks
