I have a requirement where I need to copy around 70 tables from an Oracle database to SQL Server 2008, and the total number of tables may eventually grow to around 300.
Instead of building 70 separate Data Flow Tasks, I thought of implementing this dynamically using ADO.NET in a Script Task. As a proof of concept, I have developed a package that copies data from a table on one SQL Server to a table in a different database; both source and destination are SQL Server for this POC. Below is the code.
using System.Data;
using System.Data.SqlClient;

public void Main()
{
    string Src_connectionString = @"Data Source=XXX;Initial Catalog=Test;Integrated Security=True";
    string Dst_connectionString = @"Data Source=XXX;Initial Catalog=Movies;Integrated Security=True";

    using (SqlConnection sourceConnection = new SqlConnection(Src_connectionString))
    {
        sourceConnection.Open();
        SqlCommand myCommand = new SqlCommand("SELECT * FROM Customers_1", sourceConnection);
        IDataReader reader = myCommand.ExecuteReader();
        using (SqlConnection destinationConnection = new SqlConnection(Dst_connectionString))
        {
            destinationConnection.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
            {
                bulkCopy.BatchSize = 500;
                bulkCopy.DestinationTableName = "Customers_Dest";
                bulkCopy.WriteToServer(reader); // stream rows from source to destination
            }
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
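Note that WriteToServer takes the IDataReader directly, so the rows stream from source to destination without being buffered inside the package, and Dts.TaskResult is set to Success only after the copy finishes (in the real package I would wrap this in a try/catch and return Failure on error).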
Here I'm handling a single source table, and both source and destination have the same table structure. Going forward I'll have a control table storing all 70 table names, their source connections and queries, and the corresponding destination tables, and the Script Task will loop over those rows; a rough sketch of that loop is shown below.
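For illustration, here is a minimal sketch of that loop. It assumes a hypothetical control table dbo.TableList(SourceQuery, DestinationTable) on the destination server and the System.Data.OracleClient provider (ODP.NET's OracleConnection works the same way); the method name, table name, and column names are all placeholders:

using System.Collections.Generic;
using System.Data;
using System.Data.OracleClient; // deprecated but built in; ODP.NET is a drop-in alternative
using System.Data.SqlClient;

// Sketch: copy every table listed in a control table.
// dbo.TableList(SourceQuery, DestinationTable) is a placeholder schema.
public void CopyAllTables(string oracleConnStr, string sqlConnStr)
{
    var work = new List<KeyValuePair<string, string>>();

    // 1. Read the list of source queries and destination tables from the control table.
    using (var ctrl = new SqlConnection(sqlConnStr))
    {
        ctrl.Open();
        var cmd = new SqlCommand("SELECT SourceQuery, DestinationTable FROM dbo.TableList", ctrl);
        using (var rdr = cmd.ExecuteReader())
            while (rdr.Read())
                work.Add(new KeyValuePair<string, string>(rdr.GetString(0), rdr.GetString(1)));
    }

    // 2. Copy each table: OracleDataReader in, SqlBulkCopy out.
    foreach (var item in work)
    {
        using (var src = new OracleConnection(oracleConnStr))
        using (var dst = new SqlConnection(sqlConnStr))
        {
            src.Open();
            dst.Open();
            var select = new OracleCommand(item.Key, src);
            using (IDataReader reader = select.ExecuteReader())
            using (var bulkCopy = new SqlBulkCopy(dst))
            {
                bulkCopy.BulkCopyTimeout = 0;   // no timeout for large tables
                bulkCopy.BatchSize = 500;
                bulkCopy.DestinationTableName = item.Value;
                bulkCopy.WriteToServer(reader); // stream rows table by table
            }
        }
    }
}

With this shape, growing from 70 to 300 tables would just mean inserting more control-table rows rather than editing the package.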
Please suggest/advise whether to go with this design, or whether there is a better alternative approach to handle the requirement.