June 12, 2008 at 12:08 pm
This is what I do:
1) Import data from an Excel file into a DataTable using OdbcConnection, OdbcDataAdapter, and OdbcCommand.
2) Examine the DataTable's list of columns to see which columns it has (see the sketch just after this list).
3) Call a T-SQL procedure that alters two staging tables, adding and/or dropping columns to make the tables match the Excel data.
4) Use SqlBulkCopy to move the data from the C# DataTable into the SQL Server database.
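For step 2, the column inspection boils down to walking regionPricing.Columns and building the comma-delimited lists I later hand to the proc. A simplified sketch (the real code also decides which columns are weight columns; regionKgColumns is one of the lists passed to the proc below):
// Collect the DataTable's column names into a comma-delimited list.
List<string> names = new List<string>();
foreach (DataColumn col in regionPricing.Columns)
    names.Add(col.ColumnName);
string regionKgColumns = String.Join(",", names.ToArray());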
The problem:
Say my columns are k1, k2, k3, k4, k5, k6, k7. I drop k3 and add k8. After the data is transferred, some of it lands in the wrong column! It is as if the bulk copy is using the old column ordinals from before the ALTER TABLE: the k4 data goes into k5, the k5 data into k6, the k6 data into k7, and so on. I am wondering if I am supposed to force recompilation of stored procedures, refresh some cached metadata, or do some other clever thing to get this to work...
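If stale metadata really is the culprit, I suppose I could mark the staging table for recompilation right after the ALTERs with something like the following (dbo.RegionPricing is a stand-in for my real staging table name):
// Speculative fix: ask SQL Server to recompile anything that references
// the staging table the next time it runs.
// "dbo.RegionPricing" is a placeholder for the real staging table name.
using (SqlConnection conn = new SqlConnection(dbConnectionString))
using (SqlCommand recompile = new SqlCommand("sp_recompile", conn))
{
    recompile.CommandType = CommandType.StoredProcedure;
    recompile.Parameters.AddWithValue("@objname", "dbo.RegionPricing");
    conn.Open();
    recompile.ExecuteNonQuery();
}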
Here is a trimmed-down version of the code I wrote to accomplish steps 1 and 4 above:
using System;
using System.Data;
using System.Data.Odbc;
using System.Data.SqlClient;

DataTable regionPricing = new DataTable();

// Build the ODBC connection string for the Excel file.
string excelDriver = "{Microsoft Excel Driver (*.xls)}";
string excelConnectionString = String.Format(
    "Driver={0};DriverId=790;Dbq={1};DefaultDir={2};",
    excelDriver, excelFile, defaultDir);

// Load the Excel data into the DataTable.
using (OdbcConnection cnExcel = new OdbcConnection(excelConnectionString))
using (OdbcCommand cmdSelectRegion = new OdbcCommand(@"SELECT * FROM [By_Region]", cnExcel))
using (OdbcDataAdapter daExcel = new OdbcDataAdapter(cmdSelectRegion))
{
    daExcel.Fill(regionPricing);   // Fill opens and closes the connection itself.
}

// Copy the DataTable's rows into the SQL Server staging table.
using (SqlBulkCopy sbc = new SqlBulkCopy(dbConnectionString))
{
    sbc.DestinationTableName = regionPricingTableName;
    sbc.WriteToServer(regionPricing);
}
I tried creating explicit column mappings (in a loop over all columns):
sbc.ColumnMappings.Add(new SqlBulkCopyColumnMapping(columnName, columnName));
No change.
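For completeness, here is roughly what that loop looks like (col.ColumnName comes straight from the DataTable, so the source and destination names are identical):
// Map each source column to the destination column of the same name,
// so SqlBulkCopy should match by name instead of by ordinal.
foreach (DataColumn col in regionPricing.Columns)
{
    sbc.ColumnMappings.Add(new SqlBulkCopyColumnMapping(col.ColumnName, col.ColumnName));
}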
Here is how I called the T-SQL procedure (prior to the SqlBulkCopy step) to alter the tables from C#:
// Call the proc that ALTERs the staging tables to match the Excel columns.
// Signature: alter_tables(@countryWeightColumns varchar(max), @regionWeightColumns varchar(max))
using (SqlConnection conn = new SqlConnection(this.dbRegularConnectionString))
using (SqlCommand command = new SqlCommand("alter_tables", conn))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@countryWeightColumns", countryKgColumns);
    command.Parameters.AddWithValue("@regionWeightColumns", regionKgColumns);
    conn.Open();
    command.ExecuteNonQuery();
}
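To rule out the ALTER itself, I can sanity-check the destination schema right before the bulk copy. A quick diagnostic along those lines (assuming regionPricingTableName holds the bare table name, without the dbo. prefix):
// Diagnostic: print the destination table's columns in ordinal order
// to confirm the ALTERs actually took effect before bulk copying.
using (SqlConnection conn = new SqlConnection(dbConnectionString))
using (SqlCommand check = new SqlCommand(
    "SELECT COLUMN_NAME, ORDINAL_POSITION FROM INFORMATION_SCHEMA.COLUMNS " +
    "WHERE TABLE_NAME = @table ORDER BY ORDINAL_POSITION", conn))
{
    check.Parameters.AddWithValue("@table", regionPricingTableName);
    conn.Open();
    using (SqlDataReader reader = check.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0}: {1}", reader.GetInt32(1), reader.GetString(0));
    }
}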
The T-SQL procedure builds dynamic queries to add and drop columns. I'll spare you the details of parsing the comma-delimited list of column names and comparing it to INFORMATION_SCHEMA. Once that is done, this is what I do to ALTER the TABLEs:
DECLARE @SchemaChange TABLE (TableName varchar(50), ColumnName varchar(50), Action varchar(50));
... Omit: Parsing happens ...
DECLARE @TableName varchar(50), @ColumnName varchar(50), @Action varchar(50);
DECLARE @sql nvarchar(max);

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT TableName, ColumnName, Action
    FROM @SchemaChange
    ORDER BY TableName, ColumnName;

PRINT 'Schema changes:';
OPEN cur;
FETCH NEXT FROM cur INTO @TableName, @ColumnName, @Action;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build the ALTER statement for this row. QUOTENAME guards against
    -- names that contain spaces or other odd characters.
    SET @sql = NULL;
    IF (@Action = 'DROP')
        SET @sql = N'ALTER TABLE dbo.' + QUOTENAME(@TableName)
                 + N' DROP COLUMN ' + QUOTENAME(@ColumnName);
    ELSE IF (@Action = 'ADD')
        SET @sql = N'ALTER TABLE dbo.' + QUOTENAME(@TableName)
                 + N' ADD ' + QUOTENAME(@ColumnName) + N' float NULL';

    -- Only DROP and ADD set @sql; anything else (i.e. KEEP) leaves the column alone.
    IF (@sql IS NOT NULL)
    BEGIN
        PRINT '    ' + @sql;
        EXEC sp_executesql @sql;
    END
    ELSE
        PRINT '    -- Keep ' + @TableName + '.' + @ColumnName;

    FETCH NEXT FROM cur INTO @TableName, @ColumnName, @Action;
END;
CLOSE cur;
DEALLOCATE cur;