SQLServerCentral is supported by Red Gate Software Ltd.
 

Data Migration Options
Posted Thursday, November 20, 2008 7:57 AM
We have SQL Server databases located on different servers for our hosted web application.

Each database can house multiple 'customers' along with their data.

We need to be able to 'move' data for one or many customers from one server to another.

Moving the data from one server to another will require copying a 'slice' of data for a selected customer to a destination SQL Server that will already house other customers.

Upon moving the data, primary and foreign keys from the source data cannot be directly copied to the destination server, since these most likely will be already generated for another customer on the destination server.

We have been looking into several approaches to accomplishing what we want including:

1) Writing a Windows application that takes in user selections for source and destination databases and connection strings, then reads the data from the source database into a dataset and sends it to the destination database in chunks.

2) Investigating using SSIS to write a package that will be able to migrate a customer's data from the source to the destination, also handling the keys. Is this possible?

3) Having linked servers on each of our database servers and using cross-database queries to 'INSERT ... SELECT' the data from the source to the destination. Can the linked server name be set dynamically in SQL so as to accommodate a user's selection of server?

I thought maybe some of the SQL experts here have already worked through this type of situation, and I would appreciate any recommendations.
Thanks in advance for your suggestions.
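On option 3: the linked server name cannot be passed as a query parameter in T-SQL, so the usual pattern is to validate the user's server selection against a known list and then splice it into dynamic SQL text. Here is a minimal sketch of that validation-and-splicing step, written in Python for illustration; the server names, whitelist, and table names are all assumptions, not from this thread.

```python
# Sketch: building a cross-server INSERT ... SELECT for a user-chosen
# linked server. Identifiers cannot be parameterized, so the name is
# checked against a whitelist before being spliced into the SQL text.
KNOWN_SERVERS = {"SQLPROD01", "SQLPROD02"}  # assumed set of configured linked servers

def build_copy_statement(server: str, customer_id: int) -> str:
    if server not in KNOWN_SERVERS:
        # Never splice an unvalidated identifier into dynamic SQL.
        raise ValueError(f"unknown linked server: {server}")
    return (
        f"INSERT INTO dbo.Customers (Name) "
        f"SELECT Name FROM [{server}].SourceDB.dbo.Customers "
        f"WHERE CustomerID = {int(customer_id)}"
    )
```

In T-SQL itself the resulting string would typically be run via sp_executesql; the point of the sketch is only that the server name is an identifier, not a parameter, and must be validated.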
Post #605835
Posted Tuesday, September 11, 2012 11:07 AM
Actually I've got a similar challenge. I couldn't see any replies here. How did you get on?
Post #1357590
Posted Wednesday, September 12, 2012 11:20 PM
I worked on a similar problem where I had to merge 17 departments into a master database, where each department shared a common database schema.

The merges took place sequentially, and each merge simply appended its data to the growing master store.

On each merge, the primary and foreign keys are recalculated as the script runs (using a scratch column on each table to save its old key values so they can be used to compute the new values during the merge).
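The scratch-column approach described above can be sketched end to end. This is a hypothetical reconstruction, not the original script: it uses Python with SQLite in place of SQL Server, and the table and column names (Customers, Orders, OldCustomerID) are illustrative.

```python
# Sketch: sequential merges into a master store, recomputing keys via a
# scratch column that temporarily holds each row's old key value.
import sqlite3

master = sqlite3.connect(":memory:")
master.executescript("""
    CREATE TABLE Customers (CustomerID INTEGER PRIMARY KEY AUTOINCREMENT,
                            Name TEXT, OldCustomerID INTEGER);
    CREATE TABLE Orders    (OrderID INTEGER PRIMARY KEY AUTOINCREMENT,
                            CustomerID INTEGER, Item TEXT, OldOrderID INTEGER);
""")

def merge_department(conn, customers, orders):
    # Copy parent rows, saving the old key in the scratch column.
    conn.executemany(
        "INSERT INTO Customers (Name, OldCustomerID) VALUES (?, ?)",
        [(name, old_id) for old_id, name in customers])
    # Copy child rows still carrying the OLD foreign key value.
    conn.executemany(
        "INSERT INTO Orders (CustomerID, Item, OldOrderID) VALUES (?, ?, ?)",
        [(old_cust, item, old_order) for old_order, old_cust, item in orders])
    # Recompute the foreign key from the saved old parent keys.
    conn.execute("""
        UPDATE Orders
        SET CustomerID = (SELECT c.CustomerID FROM Customers c
                          WHERE c.OldCustomerID = Orders.CustomerID)
        WHERE OldOrderID IS NOT NULL""")
    # Clear the scratch columns so the next department's old ids can't collide.
    conn.execute("UPDATE Customers SET OldCustomerID = NULL")
    conn.execute("UPDATE Orders SET OldOrderID = NULL")

# Two departments whose local ids overlap merge cleanly:
merge_department(master, [(1, "Acme")], [(1, 1, "Widget")])
merge_department(master, [(1, "Globex")], [(1, 1, "Gadget")])
```

Clearing the scratch columns after each merge is what makes the sequential appends safe even though every department reuses the same local key range.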

I've done lots of this stuff, if that helps.

Glen



R Glen Cooper
Post #1358342
Posted Tuesday, September 25, 2012 7:47 PM


I have the same issue and was wondering what you came up with for a solution.

For better, quicker answers, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/

For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
Post #1364397
Posted Sunday, December 29, 2013 4:35 AM
I am currently in the process of migrating data from one server/database to another, and here's a tried and tested method that I have used for certain tables: NEGATIVE IDENTITIES.
NOTE: I do all transformations in a 'StageDB' database, which is a restored copy of the source database (both schema and data), so I'm not doing any updates in the source database at all, or any inserts directly into the destination database until the data is in its final state.
Here are the steps that you need to take:
1) Drop all Foreign Key Constraints that relate to StageDB.dbo.Customers table
2) Delete all records from the StageDB.dbo.Customers table
3) Make the following changes to StageDB.dbo.Customers table:
- Add an additional integer field, 'PreviousCustomerID'
- Seed the CustomerID primary key identity with negative values as follows:
CustomerID INT NOT NULL IDENTITY (-1, -1)
4) Insert data from SourceDB.dbo.Customers to StageDB.dbo.Customers as follows:
INSERT INTO StageDB.dbo.Customers
(
PreviousCustomerID
--add all other Customer table fields here except the CustomerID
)
SELECT
CustomerID
--select all other Customer table fields
FROM
SourceDB.dbo.Customers;
The StageDB.dbo.Customers table is now populated with negative identity integers on the primary key field CustomerID, with the additional field 'PreviousCustomerID' preserving the original CustomerID value (as suggested in the previous post).
We can use this to update any tables related to the Customers table.
5) Update any tables related to the Customers table in the StageDB database as follows -
UPDATE RelatedCustomerTable
SET CustomerID = cust.CustomerID
FROM StageDB.dbo.Customers AS cust
WHERE RelatedCustomerTable.CustomerID = cust.PreviousCustomerID
(The related table still holds the old CustomerID value, so it is matched against cust.PreviousCustomerID and replaced with the new negative key.)
6) Repeat Step 5 for any table that has the CustomerID field, ensuring that any Foreign Key Constraints are dropped before updating
7) Once all updates are complete - Add any Foreign Key Constraints that were originally dropped in the StageDB database. This will act as a data quality check to ensure that all IDs have been updated to match the new PKs
8) Add the new 'PreviousCustomerID' field on the DestinationDB.dbo.Customers table
9) For the final insert from the StageDB database into the DestinationDB database, you must drop the Identity from the Primary Key on the DestinationDB.dbo.Customers table so that the negative Identities are not overwritten
10) Insert data directly from the StageDB.dbo.Customers into DestinationDB.dbo.Customers
11) Add the Identity back onto the Primary Key of Customers Table
12) Insert the data from all other tables from StageDB database
*****The process is now complete*****
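The steps above can be condensed into a small runnable sketch. SQLite stands in for SQL Server here (it has no IDENTITY(-1, -1), so the negative keys are assigned explicitly rather than by the engine), and all table names and sample rows are illustrative, not from the original procedure.

```python
# Sketch: migrate customers into a destination whose positive ids would
# clash, by assigning negative ids and preserving the old key in
# PreviousCustomerID so child tables can be remapped.
import sqlite3

dest = sqlite3.connect(":memory:")
dest.executescript("""
    CREATE TABLE Customers (CustomerID INTEGER PRIMARY KEY,
                            Name TEXT, PreviousCustomerID INTEGER);
    CREATE TABLE Orders (OrderID INTEGER PRIMARY KEY,
                         CustomerID INTEGER, Item TEXT);
""")
# The destination already houses a customer whose id (10) would clash
# with the incoming data if keys were copied verbatim.
dest.execute("INSERT INTO Customers VALUES (10, 'Existing Co', NULL)")

# Source rows to migrate: (CustomerID, Name) and (OrderID, CustomerID, Item).
src_customers = [(10, "Acme"), (11, "Globex")]
src_orders = [(1, 10, "Widget"), (2, 11, "Gadget")]

# Hand out -1, -2, ... while preserving the old key, then remap the
# child table's foreign keys before inserting.
id_map = {}
for n, (old_id, name) in enumerate(src_customers, start=1):
    id_map[old_id] = -n
    dest.execute("INSERT INTO Customers VALUES (?, ?, ?)", (-n, name, old_id))
for _old_order_id, old_cust_id, item in src_orders:
    dest.execute("INSERT INTO Orders (CustomerID, Item) VALUES (?, ?)",
                 (id_map[old_cust_id], item))
```

Note the incoming 'Acme' row has the same source id (10) as the pre-existing destination customer; the negative keys sidestep the clash exactly as the steps describe.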

Using this method, you can ensure that there will never be any clash in identities, and it is easy to identify which data came from a migration and which did not.
TIP 1: If this data migration is not a one-off procedure, it is best to wrap the stages in a transaction and/or put the steps in an SSIS package; either way, ensure that you implement effective error handling and rollback routines.
TIP 2: Again, if this data migration is not a one-off procedure, reuse the same StageDB.dbo.Customers table each time and clear it with DELETE rather than TRUNCATE; the maximum identity value for CustomerID will then be retained, so you won't duplicate negative identities in the DestinationDB.
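The DELETE-versus-TRUNCATE distinction in TIP 2 can be seen in miniature. SQLite's AUTOINCREMENT keyword behaves like the DELETE case in T-SQL (the counter survives a DELETE of all rows), which is the property the tip relies on; this is just an analogy sketch, not SQL Server itself.

```python
# Sketch: after deleting all rows, the identity/autoincrement counter is
# retained, so newly inserted rows do not reuse earlier key values.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t (id INTEGER PRIMARY KEY AUTOINCREMENT, v TEXT)")
db.execute("INSERT INTO t (v) VALUES ('a')")  # id 1
db.execute("INSERT INTO t (v) VALUES ('b')")  # id 2
db.execute("DELETE FROM t")                   # counter kept, unlike TRUNCATE
db.execute("INSERT INTO t (v) VALUES ('c')")  # id continues from 3
```

In T-SQL, TRUNCATE TABLE resets the identity seed while DELETE does not, which is exactly why the tip prefers DELETE for the reusable staging table.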

I hope this helps,

Jo
Post #1526332