
Copy a big table (250G) from one server to another
Posted Monday, March 4, 2013 12:56 PM
Grasshopper
Group: General Forum Members
Last Login: Wednesday, March 6, 2013 8:23 AM
Points: 12, Visits: 34
Hi,

I need to copy a big table (about 250 GB, basically used to store images) from one MSSQL Server to another MSSQL Server. I don't need to transform the data, but I don't know if this command will work:

INSERT INTO Temp SELECT * FROM [SourceServerName].DBName.dbo.Temp
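
For the four-part name above to resolve, a linked server has to exist on the destination instance. A minimal sketch, assuming Windows authentication carries across (the server name is the same placeholder as above):

-- Sketch only: run on the destination server; SourceServerName is a placeholder.
EXEC sp_addlinkedserver @server = N'SourceServerName', @srvproduct = N'SQL Server';
-- Optionally map the current login to itself on the remote side.
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'SourceServerName', @useself = N'TRUE';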

Thanks for your help
Post #1426423
Posted Monday, March 4, 2013 1:05 PM
Hall of Fame
Group: General Forum Members
Last Login: Monday, July 14, 2014 2:06 PM
Points: 3,863, Visits: 7,130
If space isn't an issue, I'd restore a backup to the other server and drop the things you don't need, keeping only that table.

______________________________________________________________________________
"Never argue with an idiot; They'll drag you down to their level and beat you with experience"
Post #1426425
Posted Monday, March 4, 2013 1:07 PM
Grasshopper
Group: General Forum Members
Last Login: Wednesday, March 6, 2013 8:23 AM
Points: 12, Visits: 34
Hi, I need to insert the data into a specific database that already exists. Thanks
Post #1426426
Posted Monday, March 4, 2013 1:13 PM
Hall of Fame
Group: General Forum Members
Last Login: Monday, July 14, 2014 2:06 PM
Points: 3,863, Visits: 7,130
Your command will work; there's just a whole bunch of things to think about. The first thing that comes to mind is that the entire process will be logged and will generate one heck of a large transaction! Is your DB in simple, bulk-logged, or full recovery?
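
If you need to check, the recovery model shows up in sys.databases (the database name below is a placeholder):

-- Sketch only: 'YourDatabase' is a placeholder for the database in question.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'YourDatabase';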


______________________________________________________________________________
"Never argue with an idiot; They'll drag you down to their level and beat you with experience"
Post #1426429
Posted Monday, March 4, 2013 1:19 PM
Grasshopper

GrasshopperGrasshopperGrasshopperGrasshopperGrasshopperGrasshopperGrasshopperGrasshopper

Group: General Forum Members
Last Login: Wednesday, March 6, 2013 8:23 AM
Points: 12, Visits: 34
Hi,

The database is in the SIMPLE recovery model.

Thanks
Post #1426434
Posted Monday, March 4, 2013 3:53 PM
SSC-Dedicated
Group: General Forum Members
Last Login: Today @ 7:48 AM
Points: 36,759, Visits: 31,214
Don't use the insert command as you currently have it. Contrary to popular belief, having the database in the SIMPLE recovery mode does NOT prevent the log file from being used. Since you're trying to transfer all 250GB in a single query, your log file will explode to at least that size.

You could write a WHILE loop to transfer the data in smaller chunks but it'll still take a while and fill the pipe while you're doing it.
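
As a rough sketch of that kind of loop, assuming the table has an ever-increasing integer key (the column name ID here is hypothetical) so each chunk commits on its own and the log space can be reused:

-- Sketch only: assumes dbo.Temp has an increasing key named ID (hypothetical).
DECLARE @MinID BIGINT = 0,
        @Batch BIGINT = 50000,
        @MaxID BIGINT;

SELECT @MaxID = MAX(ID) FROM [SourceServerName].DBName.dbo.Temp;

WHILE @MinID < @MaxID
BEGIN
    INSERT INTO dbo.Temp
    SELECT *
    FROM [SourceServerName].DBName.dbo.Temp
    WHERE ID > @MinID
      AND ID <= @MinID + @Batch;   -- one chunk per implicit transaction

    SET @MinID = @MinID + @Batch;
END;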

It's time for a "carrier pigeon" to do the first load. Before we get into that, are both servers on a SAN, and are both SANs of the same make and model?


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1426498
Posted Monday, March 4, 2013 6:20 PM
Grasshopper
Group: General Forum Members
Last Login: Wednesday, March 6, 2013 8:23 AM
Points: 12, Visits: 34
Hi,

They are on the same SAN. Thanks
Post #1426550
Posted Monday, March 4, 2013 7:17 PM
SSC-Dedicated
Group: General Forum Members
Last Login: Today @ 7:48 AM
Points: 36,759, Visits: 31,214
In that case, you might try what SQL Server itself uses for replication, if you have the disk space: BCP in native format. Export the table to a file, then import it in half-million-row chunks (or fewer, depending on the size of your rows) using the batch size setting. That will keep your log file from exploding.
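
Something along these lines from a command prompt, for example (server names, database name, and path are placeholders; -n is native format, -T is Windows authentication, and -b sets the import batch size):

bcp DBName.dbo.Temp out D:\Transfer\Temp.dat -n -T -S SourceServerName
bcp DBName.dbo.Temp in D:\Transfer\Temp.dat -n -T -S DestServerName -b 500000

If the table has an identity column, adding -E on the import keeps the source values.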

As an alternative, you could have the infrastructure folks do what I believe is called a "SAN Snapshot" to copy the mdf and ldf files and then simply attach the database.
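
A minimal sketch of the attach step once the copied files are in place (database name and paths are placeholders):

CREATE DATABASE DBName_Copy
ON (FILENAME = N'D:\SQLData\DBName.mdf'),
   (FILENAME = N'E:\SQLLogs\DBName_log.ldf')
FOR ATTACH;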


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1426557
Posted Monday, March 4, 2013 7:36 PM
Grasshopper
Group: General Forum Members
Last Login: Wednesday, March 6, 2013 8:23 AM
Points: 12, Visits: 34
Hi,

In the past I had problems with BCP: when I copied the data out to a file, SQL sometimes introduced carriage returns into the data, and when I tried to reconstruct the images I got errors.

The second option is valid, but I need to copy the table into a specific database, so if I move the MDF and LDF I still have to move the table between two local databases.

Thanks
Post #1426562
Posted Monday, March 4, 2013 7:44 PM
Hall of Fame
Group: General Forum Members
Last Login: Monday, July 14, 2014 2:06 PM
Points: 3,863, Visits: 7,130
csalazar-607991 (3/4/2013)
Hi,

In the past I had problems with BCP: when I copied the data out to a file, SQL sometimes introduced carriage returns into the data, and when I tried to reconstruct the images I got errors.
Yes, but if you actually set it up to replicate the data over (meaning physically create a publication/subscriber), the snapshot will create multiple BCP chunks for you automatically and should keep everything in sync.
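
Very roughly, the publication side of a one-off snapshot looks something like this (names are placeholders; a distributor, snapshot agent job, and subscription still have to be set up as well):

-- Sketch only: enable the source database for publishing and add a native-mode snapshot publication.
EXEC sp_replicationdboption @dbname = N'DBName', @optname = N'publish', @value = N'true';
EXEC sp_addpublication @publication = N'Temp_OneTime', @sync_method = N'native', @repl_freq = N'snapshot';
EXEC sp_addarticle @publication = N'Temp_OneTime', @article = N'Temp', @source_owner = N'dbo', @source_object = N'Temp';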

If you can't use snapshot replication as a one-time deal, you really only have two options: 1) have the SAN guys take a snap of the LUN (as Jeff has mentioned), or 2) break it up into a batch-style job, transferring over x number of rows at a time.

I can't think of anything else that hasn't already been mentioned


______________________________________________________________________________
"Never argue with an idiot; They'll drag you down to their level and beat you with experience"
Post #1426566