November 9, 2007 at 8:13 am
I am looking for the most efficient way to import large amounts of data into SQL Server; the data can be provided in any format. At the moment I use a scheduled bulk import, but this involves generating plain text files (containing the data), which are then (presumably) parsed back into typed data by the SQL Server bulk import. I am sure a more efficient way must exist!
Overview
- Single table with approx. 40 columns.
- The amount of data in each row is not very large (each row is only a few KB)
- Thousands of rows generated daily (by a piece of custom software)
- The data can be output in any format (as we have the source code for the software that generates the data)
What is the most efficient way possible to get data from a piece of custom software into SQL server?
Thanks for any advice!
November 9, 2007 at 8:20 am
Do you have control over what the custom app does? For example, could you modify the custom app to write to a table in the server?
Otherwise, using BULK INSERT to import a text file (tab-delimited would probably be best), as you're doing, is probably the most "efficient" approach...
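As a rough sketch of that text-file path: the custom app writes tab-delimited rows, and SQL Server loads them with BULK INSERT. The column layout, table name, and file path below are illustrative assumptions, not details from this thread:

```python
import csv
import tempfile

# Hypothetical rows as the custom app might emit them;
# the columns (id, timestamp, source, value) are assumptions for illustration.
rows = [
    (1, "2007-11-09 08:00:00", "sensor-a", 12.5),
    (2, "2007-11-09 08:01:00", "sensor-b", 13.1),
]

# Write a tab-delimited file that BULK INSERT can load directly.
# newline="" lets the csv writer control line endings (\r\n here,
# which matches BULK INSERT's default row terminator).
path = tempfile.mktemp(suffix=".tsv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t", lineterminator="\r\n")
    writer.writerows(rows)

# On the SQL Server side, a matching load might look like this
# (table name and path are placeholders):
#
#   BULK INSERT dbo.DailyData
#   FROM 'C:\imports\daily.tsv'
#   WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\r\n', TABLOCK);
```

The TABLOCK hint is a common option for speeding up bulk loads into a table that nothing else is writing to at the time, since it lets the load take a single table lock instead of row locks.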
--Jeff Moden
Change is inevitable... Change for the better is not.
November 9, 2007 at 9:09 am
Thanks for your fast response!
Yes - that is a good point, and something I could look at. However, it would require some more significant re-development. I was really wondering whether there is a more efficient format the data could be serialized to, as I believe plain text is not generally regarded as a particularly efficient means of passing data. However, if there are no options for importing data in other formats, then I will probably stick with the bulk import for now!
Thanks for your help