SQLServerCentral
Copying data from one set of tables to another


ErikMN
SSC Rookie (33 reputation)
Group: General Forum Members
Points: 33 Visits: 385
I'm hoping someone can help find an easier, more supportable way of doing the following:

1. Take a subset of data from about 100 tables that have multiple references to other tables in this group of 100 from a first DB.
2. Insert the above data into a second DB, a database that already has data in the 100 tables, while maintaining the correct references.

As a general approach, the best way I can think of doing this is as follows:

1. Create mapping tables for every ID that is referenced in a different table (OldID NewID)
2. Insert the old data into the new table and output the OldID and NewID into the mapping table.
3. Use that mapping data to make sure all tables that use those IDs have the new IDs in DB2.

This approach is extremely labor intensive both on initial implementation and would require a fairly substantial amount of work to maintain going forward. Does anyone know of a better approach to do this?
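For what it's worth, step 2 of the approach above can be sketched in T-SQL with MERGE ... OUTPUT, which (unlike a plain INSERT ... OUTPUT) can reference source columns, so the old and new IDs can be captured in one statement. Table and column names here are illustrative, not from the actual schema:

```sql
-- Hypothetical schema: DB1.dbo.Person(PersonID IDENTITY, Name) -> DB2.dbo.Person.
-- A MERGE whose match condition is always false behaves as an INSERT,
-- but its OUTPUT clause may reference source columns, letting us write
-- the (old ID, new ID) pair straight into the mapping table.
MERGE INTO DB2.dbo.Person AS tgt
USING DB1.dbo.Person AS src
    ON 1 = 0                          -- never matches: every source row is inserted
WHEN NOT MATCHED THEN
    INSERT (Name) VALUES (src.Name)
OUTPUT src.PersonID, inserted.PersonID
    INTO dbo.MapPerson (OldPersonID, NewPersonID);
```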
Eirikur Eiriksson
SSCertifiable (6.7K reputation)
Group: General Forum Members
Points: 6714 Visits: 17699
You could (while protecting your identity ;-)):
1. Script the database
2. Run only the CREATE TABLE part at the destination
3. Use the Import/Export Wizard to transfer the data, enabling identity insert and filtering where needed
4. Run the constraint part of the database script

or

1. Copy the database
2. Script the constraints and drop them
3. Clean the data
4. Re-create the constraints

Hint: look up SET IDENTITY_INSERT.
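The hint above refers to this session setting, which allows explicit values in an IDENTITY column. A minimal sketch, assuming a dbo.Person table with an IDENTITY key and that the current database is the destination:

```sql
-- Only one table per session may have IDENTITY_INSERT ON at a time,
-- and an explicit column list is required on the INSERT.
SET IDENTITY_INSERT dbo.Person ON;

INSERT INTO dbo.Person (PersonID, Name)
SELECT PersonID, Name
FROM DB1.dbo.Person;

SET IDENTITY_INSERT dbo.Person OFF;
```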
ErikMN
SSC Rookie (33 reputation)
Group: General Forum Members
Points: 33 Visits: 385
Thanks for the reply, but I think you're either missing one part of my scenario or I'm not quite understanding your answer. The problem is that I want to move data from one database into another database whose destination tables already contain data.

For example,

DB1 - Table1 has IDs 1-1000

DB2 - Table1 has IDs 1-500

I want to add the records with IDs 1-1000 to the second table while making sure the references in other tables point to the correct ID in Table1.
Eirikur Eiriksson
SSCertifiable (6.7K reputation)
Group: General Forum Members
Points: 6714 Visits: 17699
ErikMN (3/27/2014)
Thanks for the reply, but I think you're either missing one part of my scenario or I'm not quite understanding your answer. The problem is that I want to move data from one database into another database whose destination tables already contain data.

For example,

DB1 - Table1 has IDs 1-1000

DB2 - Table1 has IDs 1-500

I want to add the records with IDs 1-1000 to the second table while making sure the references in other tables point to the correct ID in Table1.

My bad, I misunderstood the question. My thought is to update all references as part of the load; a mapping solution might become very hard to maintain.

Are there overlaps in the referential data?
RBarryYoung
SSCrazy Eights (9.4K reputation)
Group: General Forum Members
Points: 9438 Visits: 9517
ErikMN (3/27/2014)
...I want to take data from one database to another database that already has data in tables that the data is going into.

For example,

DB1 - Table1 has IDs 1-1000

DB2 - Table1 has IDs 1-500

I want to add the records with IDs 1-1000 to the second table while making sure the references in other tables point to the correct ID in Table1.


OK, here's my crazy, shoot-from-the-hip shortcut idea:

1) Make a new schema in your source DB called [Export]

2) For each table to be exported, create a View in [Export] named "v_{tableName}" that just selects every column by name. (This can be automated)

3) Edit these views, changing every ID column and ID-linking column to "{IdName}+20000 As [{IdName}]". (with some effort, this too can be automated)

4) Use Generate Scripts on the source DB, Data-Only and select only the Views in the [Export] schema.

5) Edit the script and add "SET IDENTITY_INSERT ... ON/OFF" statements around every table's INSERT statement block. (I haven't figured out a way to automate this part)

6) Backup the Target DB. Seriously, this is dangerous.

7) Execute the script against the Target DB.

And hopefully that does it.
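Steps 2-3 above might look like this for a hypothetical child table dbo.[Order] that references dbo.Person (the Export schema comes from step 1; all names and columns are made up for illustration):

```sql
-- Every ID and ID-linking column is offset by the same constant,
-- so copied rows cannot collide with IDs already in the target.
CREATE VIEW Export.v_Order
AS
SELECT
    OrderID  + 20000 AS OrderID,   -- the table's own identity key
    PersonID + 20000 AS PersonID,  -- FK into dbo.Person, offset identically
    OrderDate,
    Amount
FROM dbo.[Order];
```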

-- RBarryYoung, (302)375-0451 blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc.
"Performance is our middle name."
ErikMN
SSC Rookie (33 reputation)
Group: General Forum Members
Points: 33 Visits: 385
If I understand what you mean by overlaps in the referential data, then yes, there are. Without going into the specifics, both databases have a table called Person with PersonID as an identity. That identity is referred to in a number of other tables, and it is not unique across the two databases, so there might be a PersonID = 500 in both databases that refers to different persons.

The only approach I can think of (which, looking back at the OP, I didn't explain very well) would be to update the data as it goes in. The mapping tables I referred to would map the ID in the old database to the ID that was generated in the new database, so that when the data of a table that refers to PersonID is inserted, the new PersonID is used.

Example mapping table:

CREATE TABLE dbo.MapPerson (
    OldPersonID INT,
    NewPersonID INT
);

Then when inserting into a new table that has a reference to PersonID, I would join in the MapPerson table to get the new PersonID. As you said, this approach would be very labor intensive and get more so as you add more tables with multiple references to other tables. That's why I'm hoping there might be another approach I haven't thought of...
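The join described above might be sketched like this, again using a made-up child table dbo.[Order] that carries a PersonID foreign key:

```sql
-- Copy child rows from DB1 to DB2, swapping the old PersonID for
-- the one generated when the Person rows were inserted into DB2.
INSERT INTO DB2.dbo.[Order] (PersonID, OrderDate, Amount)
SELECT mp.NewPersonID, o.OrderDate, o.Amount
FROM DB1.dbo.[Order] AS o
JOIN dbo.MapPerson   AS mp
    ON mp.OldPersonID = o.PersonID;
```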
ErikMN
SSC Rookie (33 reputation)
Group: General Forum Members
Points: 33 Visits: 385
RBarryYoung (3/27/2014)

OK, here's my crazy, shoot-from-the-hip shortcut idea:

1) Make a new schema in your source DB called [Export]

2) For each table to be exported, create a View in [Export] named "v_{tableName}" that just selects every column by name. (This can be automated)

3) Edit these views, changing every ID column and ID-linking column to "{IdName}+20000 As [{IdName}]". (with some effort, this too can be automated)

4) Use Generate Scripts on the source DB, Data-Only and select only the Views in the [Export] schema.

5) Edit the script and add "SET IDENTITY_INSERT ... ON/OFF" statements around every table's INSERT statement block. (I haven't figured out a way to automate this part)

6) Backup the Target DB. Seriously, this is dangerous.

7) Execute the script against the Target DB.

And hopefully that does it.


Thank you, this is definitely intriguing. While it would still be quite a bit of work, it'd certainly be less initial work than my original approach. The only thing I'd need to think about is step 3, where I don't think a static value of +20000 would be appropriate: some tables have a few hundred million rows and some have 20, so I'd probably want to take some kind of COUNT or MAX on the receiving table and add that instead.
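Deriving the offset from the target's current maximum key could be sketched as follows (names are illustrative; note a view can't use a variable, so the offset would have to be baked into the generated SELECT):

```sql
-- Compute a per-table offset from the target's highest existing key,
-- instead of a fixed +20000.
DECLARE @Offset int;

SELECT @Offset = ISNULL(MAX(PersonID), 0)
FROM DB2.dbo.Person;

-- The generated export SELECT then applies the computed offset:
SELECT PersonID + @Offset AS PersonID, Name
FROM DB1.dbo.Person;
```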
ScottPletcher
Hall of Fame (3.9K reputation)
Group: General Forum Members
Points: 3944 Visits: 6678
ErikMN (3/27/2014)
Thank you, this is definitely intriguing. While it would still be quite a bit of work, it'd certainly be less initial work than my original approach. The only thing I'd need to think about is step 3, where I don't think a static value of +20000 would be appropriate: some tables have a few hundred million rows and some have 20, so I'd probably want to take some kind of COUNT or MAX on the receiving table and add that instead.


In that case, if the data type is int, add +1000000000 instead of +20000 :-)

SQL DBA,SQL Server MVP('07, '08, '09)

Prosecutor James Blackburn, in closing argument in the "Fatal Vision" murders trial: "If in the future, you should cry a tear, cry one for them [the murder victims]. If in the future, you should say a prayer, say one for them. And if in the future, you should light a candle, light one for them."