SQLServerCentral is supported by Redgate

stuck on trying to do a large import
Posted Saturday, February 23, 2013 10:59 PM


Hi All

I am trying to do a large import of many thousands of rows from an Excel spreadsheet, using the code below.

The table is made up as follows:

col1, col2, col3, col4, col5

sun microsystems inc, test,test,test,test
sun microsystems, inc, test, test, test, test
adobe inc, test,test,test,test
"adobe, inc",test,test,,test

I am having problems, for example, with rows 2, 5 and 6.

All rows are inserting, but row 2, for instance, is splitting at the comma after "sun microsystems" and putting "inc" into the next column along. The same happens for row 6 with the "adobe" rows. Row 5 is still inserting, and it is similar to the other rows, but I think it is doing the same thing because of the double quotes.

Also, the last row is showing no value, so is there any way to put some wording into the column if it is empty, something like 'Empty Value'?

Here is the import I am using:

BULK INSERT dbo.The_Big_Kahuna
FROM 'c:\users\alynch\Desktop\The_Big_Kahuna.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

Post #1423393
Posted Sunday, February 24, 2013 3:13 AM



You'll need to standardize the input source before even trying the BULK INSERT.
SQL Server is a great tool but it cannot "guess" what the final result should look like from your personal perspective.
You need to provide a field terminator that is not part of the values you're trying to insert.
Maybe you could use a pipe separator or something like this.
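
For example, something along these lines (a sketch only — it assumes the file has been re-exported with pipe delimiters to a path like the one you posted; adjust the path, terminators, and header handling to your environment):

```sql
-- Sketch: load a pipe-delimited re-export of the same data.
-- Assumes the source was saved as The_Big_Kahuna.txt with '|' between fields.
BULK INSERT dbo.The_Big_Kahuna
FROM 'c:\users\alynch\Desktop\The_Big_Kahuna.txt'
WITH (
    FIELDTERMINATOR = '|',   -- a character that never appears inside the values
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2      -- skip the header row, if the file has one
);
```

Because the pipe never appears inside values like "sun microsystems, inc", the embedded commas no longer split the field.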

A pessimist is an optimist with experience.

How to get fast answers to your question
How to post performance related questions
Links for Tally Table , Cross Tabs and Dynamic Cross Tabs , Delimited Split Function
Post #1423402
Posted Monday, February 25, 2013 12:54 PM

From your BULK INSERT statement, you're not actually importing from an Excel file, but from a CSV file.

This is an important distinction because it's been my experience that you have a much better shot at getting your input file standardized when using CSV. Excel interprets lots of values such as leading zeros, dates, etc. that make it virtually impossible to get things standardized. If this is going to be a repeated process that runs on a server, Excel can also have VBA code associated with it and you probably don't want to take a chance by running it on your server, especially if it comes from a third party.

I've found that for repeated loads from consistent data files, using BULK INSERT with a format file is very reliable. It's a bit of work to set up the XML format file, but it is ultra-fast and very consistent. A point to consider is that both the file with the source data and the format file need to be on the server itself.
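
As a rough illustration of what such a format file looks like (the column names, lengths, and the pipe delimiter here are assumptions, since the actual table definition hasn't been posted), a five-column, pipe-delimited load might use:

```xml
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
 <RECORD>
  <!-- One FIELD per column in the data file; the last field ends at the line break. -->
  <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="100"/>
  <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="100"/>
  <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="100"/>
  <FIELD ID="4" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="100"/>
  <FIELD ID="5" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="100"/>
 </RECORD>
 <ROW>
  <!-- Maps each source field to a destination column; names assumed from the post. -->
  <COLUMN SOURCE="1" NAME="col1" xsi:type="SQLVARYCHAR"/>
  <COLUMN SOURCE="2" NAME="col2" xsi:type="SQLVARYCHAR"/>
  <COLUMN SOURCE="3" NAME="col3" xsi:type="SQLVARYCHAR"/>
  <COLUMN SOURCE="4" NAME="col4" xsi:type="SQLVARYCHAR"/>
  <COLUMN SOURCE="5" NAME="col5" xsi:type="SQLVARYCHAR"/>
 </ROW>
</BCPFORMAT>
```

It is then referenced from the load with `BULK INSERT dbo.The_Big_Kahuna FROM '<data file>' WITH (FORMATFILE = '<format file>');`, where both paths are as seen from the server.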


Tally Tables - Performance Personified
String Splitting with True Performance
Best practices on how to ask questions
Post #1423753