January 16, 2006 at 6:56 pm
I've been tasked with writing a text file parser, and I'm having trouble getting the file into a table (as raw text rows) so I can parse it. We have a table with an auto-increment key and a Varchar(1000) field, and we want to BULK INSERT into it by treating the input file as a single-column file. Unfortunately, I haven't found a way around what appears to be a limitation of BULK INSERT: since the key column doesn't exist in the file being imported, BULK INSERT refuses to perform the insert. Any ideas or suggestions on how to do this?
One alternative would be to DTS the file into a table, but we want the process to loop and process as many files as are waiting to be imported. Is there a way to do this using DTS? (We've already created a stored proc to read the filenames from the directory where the files are waiting to be processed.)
Thank you for any advice,
Dan
January 16, 2006 at 7:28 pm
Since the target table has one more column than the file, you can use a format file to describe the file's layout. Search for 'format files' in BOL to find out more.
January 16, 2006 at 7:32 pm
To do this, you will need a BCP format file OR you'll have to lose the IDENTITY column. BCP format files can actually do all of the parsing into the final format for you. Look up {Format Files, Using Format Files} in Books Online.
Yes, I know you are using Bulk Insert but that uses the same format file as BCP.
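To make the format-file suggestion concrete, here is a minimal sketch. The file and table names (dbo.ImportStaging, RawText, the paths) are placeholders, not anything from the original posts. The non-XML format file maps the single field in the text file to the second column of the table, skipping the IDENTITY column entirely; the 9.0 on the first line is the format version for SQL Server 2005.

```
9.0
1
1   SQLCHAR   0   1000   "\r\n"   2   RawText   SQL_Latin1_General_CP1_ACP
```

With that saved as, say, single_column.fmt, the import would look something like:

```sql
BULK INSERT dbo.ImportStaging
FROM 'C:\imports\file1.txt'
WITH (FORMATFILE = 'C:\imports\single_column.fmt');
```

Each line of the file lands in RawText as its own row, and the IDENTITY column is populated automatically.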
--Jeff Moden
Change is inevitable... Change for the better is not.
January 17, 2006 at 6:32 am
Wow, thanks for the quick response. Off to BOL...
Regards,
Dan