I have a bunch of logging data that my sysadmin has downloaded from our Elastic Load Balancers at Amazon, and I need to stick it into a completely bland table for a one-off sales pitch.
I've scripted all of the unzipping, copying, etc., but I've become so reliant on SSIS that I've lost my raw data-import skills.
The text files are space-separated, have standard CR/LF line endings, and are text-qualified with " characters.
I know how to list the files, so the idea is either a batch file that runs the imports (bad, if I get kicked off the workstation/server) or a cursor (also yucky) that loops through my list of files.
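Roughly what I had in mind for the cursor version (just a sketch; dbo.FileList and its FilePath column are placeholder names for wherever I land my directory listing):

```sql
-- Sketch: loop over a list of UNC paths and BULK INSERT each one.
-- Assumes a helper table dbo.FileList (FilePath nvarchar(4000)) populated
-- from the directory listing, and the target table importdata.dbo.import
-- already existing.
DECLARE @path nvarchar(4000), @sql nvarchar(max);

DECLARE file_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT FilePath FROM dbo.FileList;

OPEN file_cur;
FETCH NEXT FROM file_cur INTO @path;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- BULK INSERT won't take a variable for the file name,
    -- so build the statement dynamically.
    SET @sql = N'BULK INSERT importdata.dbo.import FROM '''
             + REPLACE(@path, N'''', N'''''')
             + N''' WITH (FIELDTERMINATOR = '' '', ROWTERMINATOR = ''\r\n'');';
    EXEC sys.sp_executesql @sql;

    FETCH NEXT FROM file_cur INTO @path;
END;

CLOSE file_cur;
DEALLOCATE file_cur;
```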
I'm hopelessly lost in the import process itself, though, and the format file documentation is driving me crazy.
I got as far as looking at OPENROWSET:
SELECT X.* FROM OPENROWSET(BULK N'\\xxxserverxxx\csv\215319143659_elasticloadbalancing_eu-west-1_app.Sign3-Liv-Disp01-Pub-ALB.de861ec4fa295ccb_20200113T0000Z_184.108.40.206_3qh6f4c2.log', SINGLE_CLOB) AS X
which gives me the whole file spat out as a single row, single column.
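For completeness, the direction I was trying to head with OPENROWSET is a format file instead of SINGLE_CLOB (sketch; elb.fmt is a placeholder for a format file I haven't managed to write yet):

```sql
-- Sketch: OPENROWSET(BULK ...) with a format file, which should give
-- one row per log line and one column per field.
-- N'\\xxxserverxxx\csv\elb.fmt' is a placeholder path.
SELECT X.*
FROM OPENROWSET(
    BULK N'\\xxxserverxxx\csv\215319143659_elasticloadbalancing_eu-west-1_app.Sign3-Liv-Disp01-Pub-ALB.de861ec4fa295ccb_20200113T0000Z_184.108.40.206_3qh6f4c2.log',
    FORMATFILE = N'\\xxxserverxxx\csv\elb.fmt'
) AS X;
```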
BULK INSERT importdata.dbo.import FROM '\\xxxserverxxx\csv\215319143659_elasticloadbalancing_eu-west-1_app.Sign3-Liv-Disp01-Pub-ALB.de861ec4fa295ccb_20200113T0000Z_220.127.116.11_3qh6f4c2.log'
WITH (FIELDTERMINATOR=' ')
which fails with:
The bulk load failed. The column is too long in the data file for row 1, column 27. Verify that the field terminator and row terminator are specified correctly.
I don't have a field 27 (field 26 is my last column), but I can't see how to set the row terminator, if that's even the problem.
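My current suspicion is the quoted fields: the ALB log quotes things like the request line and user agent, which contain spaces, so a bare space terminator over-splits the line and produces a phantom column 27. If that's right, something like this might work on SQL Server 2017 or later (sketch; FORMAT = 'CSV' and FIELDQUOTE were only added in 2017):

```sql
-- Sketch: parse the file in "CSV" mode with a space field terminator,
-- so the " text qualifier is honoured. Requires SQL Server 2017+.
BULK INSERT importdata.dbo.import
FROM '\\xxxserverxxx\csv\215319143659_elasticloadbalancing_eu-west-1_app.Sign3-Liv-Disp01-Pub-ALB.de861ec4fa295ccb_20200113T0000Z_220.127.116.11_3qh6f4c2.log'
WITH (
    FORMAT = 'CSV',          -- enables quoted-field handling
    FIELDQUOTE = '"',        -- the text qualifier in the logs
    FIELDTERMINATOR = ' ',
    ROWTERMINATOR = '\r\n'   -- explicit, in case the default isn't matching
);
```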
Can anyone help me with either a format file and a command, or another way of doing this…?
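In case it helps anyone spot what I'm doing wrong, this is the shape of non-XML format file I was attempting, shown for the first three fields only (I'd extend the same pattern down to all 26 columns; the column names are just my guesses at the ALB fields, and I'm aware a non-XML format file may not cope with the embedded quotes at all):

```
14.0
3
1  SQLCHAR  0  50   " "      1  type        ""
2  SQLCHAR  0  50   " "      2  timestamp   ""
3  SQLCHAR  0  100  "\r\n"   3  elb         ""
```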