You need to move a table from one instance to another with the following requirements:
- The instances are on separate domains. In fact, you might be shipping the table to a vendor or client.
- The destination is an actual SQL Server instance.
- Can be automated on either end.
- Quick and easy, please. Assume limited knowledge on the recipient's part.
The separate domains/companies requirement is the big killer. It eliminates options like replication, linked servers, etc. We could dump the output to a CSV, but that has some limitations if you have odd characters, not to mention it requires extra work depending on your creation method. The automation requirement means no GUI. Combine that with quick, easy, and limited knowledge on the recipient's part, and I want a script or two that I can hand to a user.
My go-to solution here is going to be BCP. For those of you who aren't familiar with it, it's a command line tool for moving data in and out of SQL Server. It's fast and, once you have it scripted, easy enough to run. There are a fair number of options, so the scripting can be somewhat complicated, but the help is quite good and with a little practice this isn't a problem. I should point out here that my requirements may seem a bit contrived to point directly to BCP, but I did actually receive this exact request recently. Also, BCP is a nice go-to for a number of data movement requests.
If you have used BCP before, you've probably been annoyed at having to either create a format file or hit Enter for each column to confirm its format. Native format avoids this problem. When you export the data, it's dumped with the table format built in, and when you import the data, it's read using the format of the destination table. This does mean that unless you use a format file, the source and destination table formats have to be identical.
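If you can't guarantee the two table definitions match, bcp can generate a format file for you with the `format nul` option and you can pass that along with the data. This is only a sketch against my example instance; the server name and file paths are placeholders:

```
REM Generate a native-mode format file describing Person.Person.
REM "format nul" tells bcp to produce only the format file, no data.
bcp "Person.Person" format nul -f "C:\temp\Person_Person.fmt" -S "(local)\SQL2016CS" -d AdventureWorks2014 -n -T

REM On the import side, -f uses the format file instead of assuming identical tables.
bcp "Person.Person" in "C:\temp\Person_Person.out" -f "C:\temp\Person_Person.fmt" -S "(local)\SQL2016CS" -d AdventureWorks2014 -T
```

Note that when you import with -f you drop the -n flag, since the format file is now supplying the format information.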
I should probably point out here that you aren't going to be able to open a file created using native format in Notepad or Excel and expect to be able to read it.
In theory, native mode is also faster, although I wasn't able to prove this conclusively.
Here is a simple example of the export using BCP in native format.
bcp "Person.Person" out "C:\temp\Person_Person.out" -S "(local)\SQL2016CS" -d AdventureWorks2014 -n -T
Pretty simple, right? Here are the parameters in order.
- “Person.Person” – The table we want to export.
- out – Direction we want to go. This could also be queryout if the first parameter is a query and not a table name.
- “C:\temp\Person_Person.out” – Output destination.
- -S “(local)\SQL2016CS” – Source SQL instance.
- -d AdventureWorks2014 – Source database (where the table is).
- -n – Native mode. You could also use -N for Unicode native mode.
- -T – Use trusted authentication (you could use -U and -P for the username/password of a SQL login).
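Since the two instances are on separate domains, trusted authentication generally won't work once the file reaches the other side, so the recipient will probably need the -U/-P form. Here is the same export written that way; the server name, login, and password are placeholders, not anything from a real environment:

```
REM Same export, but authenticating with a SQL login instead of -T.
bcp "Person.Person" out "C:\temp\Person_Person.out" -S "SomeServer\SQL2016CS" -d AdventureWorks2014 -n -U SomeLogin -P SomePassword
```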
I should point out that the flags are case sensitive, so -D and -d are two separate things.
Now I just need to give my vendor/customer three things: a script to create the table, the output file, and the following command, which could just as easily be put in a .bat file to run.
bcp "Person.Person" in "C:\temp\Person_Person.out" -S "(local)\SQL2016CS" -d AdventureWorks2014 -n -T
Notice how it's almost identical to the first command? I've just changed out to in to tell it to import rather than export. Of course, the file location, server name, etc. will need to be changed based on the environment, but that's pretty easy. I should also point out that this just does a bulk insert, so if you are doing a complete replace of the table, a truncate needs to be done first or you could have a problem with primary keys, etc.
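Putting that together, the whole recipient side can be one small .bat file. This is just a sketch; the instance name and file path are placeholders, and the truncate step assumes the table is safe to empty (no foreign keys pointing at it):

```
REM Replace the instance name and file path to match your environment.

REM Empty the table first so the bulk insert doesn't collide with existing rows.
sqlcmd -S "(local)\SQL2016CS" -d AdventureWorks2014 -Q "TRUNCATE TABLE Person.Person;"

REM Bulk insert the native-format file into the matching table.
bcp "Person.Person" in "C:\temp\Person_Person.out" -S "(local)\SQL2016CS" -d AdventureWorks2014 -n -T
```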
Either way, in the end, this is a very simple way to move data from one location to another.