October 12, 2012 at 7:02 am
Hello,
I've generated one file per table of my database. Each file contains an INSERT statement for each record of the table. (I did that for data-migration purposes because of a bad collation, blablabla)
I first tried to use SQLCMD to load the data into my tables but, apparently, if the file is too big (or there are too many inserts to do) it doesn't work.
I got that error, for big files/tables:
Shared Memory Provider: No process is on the other end of the pipe. Sqlcmd: Error: Microsoft SQL Server Native Client 10.0 : Communication link failure.
Well, I read that "bcp" was THE tool to perform what I want to do. Okay, I gave it a try...
But I'm having problems with its syntax and I get many errors. Some of these are:
SQLState = 22003, NativeError = 0
Error = [Microsoft][SQL Server Native Client 10.0]Numeric value out of range
and
SQLState = S1000, NativeError = 0
Error = [Microsoft][SQL Server Native Client 10.0]Unable to open BCP host data-file
What my input file looks like:
INSERT CCE_DATAFILESIZES(ID_DFSIZE,CHECK_DATE,DATABASE_NAME,TABLESPACE_NAME,DATAFILE_NAME,COSTCENTRE,MIN_MB,MAX_MB,AVG_MB,SHARED_DATABASE)
VALUES('1',convert(datetime,'Apr 27 2012 12:00AM'),'master','PRIMARY','G:\SYS_DB\HSICTR2D\MSSQL10_50.HSICTR2D\MSSQL\DATA\master.mdf','','4.000','0.000','0','n')
INSERT CCE_DATAFILESIZES(ID_DFSIZE,CHECK_DATE,DATABASE_NAME,TABLESPACE_NAME,DATAFILE_NAME,COSTCENTRE,MIN_MB,MAX_MB,AVG_MB,SHARED_DATABASE)
VALUES('2',convert(datetime,'Apr 27 2012 12:00AM'),'master','LOG','G:\SYS_DB\HSICTR2D\MSSQL10_50.HSICTR2D\MSSQL\DATA\mastlog.ldf','','1.000','0.000','0','n')
So, what is the best way to insert a lot of data into tables, and what is the syntax to do it?
Here's how I wrote my bcp command:
bcp mydb.dbo.mytable in G:\DATA_DB\Migration\mytable.sql -S server\instance -T -b10000 -c
Please help me, I'm going crazy with bcp :crazy:
Thanks in advance.
Regards.
Franck.
October 12, 2012 at 7:23 am
bulk insert expects raw data, not formatted INSERT...statements.
If your file looked like this instead:
1,Apr 27 2012 12:00AM
2,Apr 27 2012 12:00AM
bulk insert would insert into a two-column table with columns of type INT and DATETIME incredibly fast.
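As a minimal sketch of that (the table name and file path are hypothetical; the file is assumed to contain the two comma-separated sample rows above, one per line):

```sql
-- hypothetical two-column target matching the sample rows above
CREATE TABLE TwoColDemo (id INT, check_date DATETIME);

-- assumed file location; FIELDTERMINATOR matches the comma in the sample data
BULK INSERT TwoColDemo FROM 'c:\Data\two_col.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```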
I've imported much wider files, for example a ZIP+4 database where the text file was over a gig in size, in under 15 minutes;
files with 100K rows of data load in under a second as the norm.
here's a simple example, where I use bcp out to send a table to disk, and then bcp or BULK INSERT the same data into a different table:
because I'm expecting CrLf in note and description fields, I like to use very odd delimiters that I'm sure will not exist in the data.
--using a super special 4 char row delimiter to be sure it doesn't exist in the data
--flags explanation:
-- -c = character data
-- -t"[||]" = field terminator
-- -r"[~~]" = row terminator
-- -T = Trusted connection
--out
EXECUTE master.dbo.xp_cmdshell 'bcp "SELECT EMAILTEMPLATESID, EMAILBODY FROM SandBox.dbo.EMAILTEMPLATES ORDER BY EMAILTEMPLATESID" queryout c:\Data\bcpExample.txt -c -t"[||]" -r"[~~]" -T'
--in
EXECUTE master.dbo.xp_cmdshell 'bcp Production.dbo.EMAILTEMPLATES in c:\Data\bcpExample.txt -c -t"[||]" -r"[~~]" -T'
--in via bulk insert
BULK INSERT EMAILTEMPLATES FROM 'c:\Data\bcpExample.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = '[||]',
ROWTERMINATOR = '[~~]',
FIRSTROW = 1
)
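For a very large file, a hedged variant of the same BULK INSERT (using standard options; the batch size here is just an illustrative value) commits in batches and takes a table lock so a single failure doesn't roll back the whole load:

```sql
BULK INSERT EMAILTEMPLATES FROM 'c:\Data\bcpExample.txt'
WITH (
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = '[||]',
    ROWTERMINATOR = '[~~]',
    BATCHSIZE = 10000,  -- commit every 10,000 rows instead of one giant transaction
    TABLOCK             -- table-level lock, which also enables minimally logged inserts
)
```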
Lowell
October 12, 2012 at 7:34 am
Hahaha, I spent hours exporting my data and preparing inserts :hehe:
Well, thanks for your fast reply, I'll try your way and post my feedback 😉
Franck
October 15, 2012 at 7:33 am
Lowell,
Thank you very much for your help. I still have some issues but it somehow works 🙂
Have a nice day.
Franck.
October 15, 2012 at 7:37 am
let us know if we can help with the additional issues you are encountering; glad I could get you pointed in a helpful direction.
Lowell