
Create Table and Bulk Insert
Posted Tuesday, December 25, 2012 3:30 PM
SSC-Dedicated
urso47 (12/25/2012)
Hope you had a great X'mas Eve!

Thank you and Merry Christmas!


It worked perfectly. I don't fully understand it all yet, but I will learn by practicing. I didn't even know about CROSS APPLY or the Item = REPLACE technique. Thank you for teaching me step by step like a kid, because most of the code is still very hard for me to understand.

Press the {f1} key to bring up "Books Online". That and this website should become your best friends.


Now I would like to ask how I could run this SP against dozens of files within the same directory (C:\Temp in this case). Is there a way to process all the files in that directory regardless of the file name?

Yes... that's the next "lesson". What is the name of "that" directory?
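Just so you can see where this is headed, here's the rough shape of a "do it for every file in the folder" loop. This is only a preview under a couple of assumptions: xp_cmdshell is enabled on your box, and the staging table name below is just a placeholder.

--===== Preview only. Capture the file names in the folder, then loop
     -- through them and BULK INSERT each one. Table name is a placeholder.
CREATE TABLE #FileList (FileName SYSNAME NULL);

 INSERT INTO #FileList (FileName)
   EXEC xp_cmdshell 'dir /b C:\Temp\*.csv';

DECLARE @FileName SYSNAME,
        @SQL      VARCHAR(1000);

DECLARE cFiles CURSOR LOCAL FAST_FORWARD FOR
 SELECT FileName FROM #FileList WHERE FileName LIKE '%.csv';

   OPEN cFiles;
  FETCH NEXT FROM cFiles INTO @FileName;
  WHILE @@FETCH_STATUS = 0
  BEGIN
        SELECT @SQL = 'BULK INSERT dbo.SomeStagingTable FROM ''C:\Temp\' + @FileName + ''''
                    + ' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
         EXEC (@SQL);
        FETCH NEXT FROM cFiles INTO @FileName;
    END;
  CLOSE cFiles;
DEALLOCATE cFiles;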


I found the SP under Programmability, but not a table for "GetFileType01", so I think we have to do an INSERT INTO SomeTable. Am I right?

Correct. We first needed to show you how to parse the data. Next, we'll show you how to insert it into a table and how to do it for many files.
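As a preview of that part, the general pattern is just INSERT ... EXEC against a table whose columns match the procedure's output. The table, column, and parameter names below are only guesses for illustration:

--===== Capture the parsed result set of the procedure into a table.
     -- Column list and parameter name are placeholders only.
 INSERT INTO dbo.SomeTable (Col1, Col2, Col3)
   EXEC dbo.GetFileType01 @pFileName = 'C:\Temp\SomeFile.csv';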


And finally, for some other files, some fields will be bigger than 8K, especially one field that holds HTTP addresses longer than 500 or 600 characters. I am having trouble even using T-SQL or the SQL Server Import/Export tools; it fails with the error message:

I need the complete file layout in order to be able to help there. A sample file would also be very helpful.


When I use a T-SQL transaction to do it, I found that this field gets split and part of it spills into the next fields, even with the field defined as VARCHAR(MAX) or TEXT. I don't have the "header problem" with these files, but now I face this problem with a very long value to insert into this other table's field.

Again, I'd need to know the layout for such a file including what the delimiters are (comma, tab, or something else). I'd also need to know if the large column has any column delimiters embedded in the data itself and if the column is encapsulated in any type of "text qualifier".
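For example, these two lines need different handling even though both are "comma separated":

123,SomeDocument.pdf,456
123,"Report, Final, v2.pdf",456

In the second line, the commas inside the document name are data rather than delimiters, and the double quotes are acting as the text qualifier. That's why I need to see the actual layout before suggesting anything.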


Always thank you, Jeff!

Andre

My pleasure. If you want to save some time, always include things like a record layout, the CREATE TABLE statement for the target table, and, if you can do so without violating any private or "company proprietary" information, at least the first 10 lines from the files you'll be working with. Obviously, files that contain headers will need more lines included to cover the header plus about 10 lines of data.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1400104
Posted Tuesday, December 25, 2012 8:49 PM
SSC Rookie
Thanks Jeff,

I will use these directories and files:
C:\Temp\Eq_files\Eq_sample_1.csv and Eq_sample_2.csv
C:\Temp\sync_files\sync_sample_1.csv and sync_sample_2.csv
C:\Temp\pcut_files\pcut_sample_1.csv and pcut_sample_2.csv
C:\Temp\sm_files\V151110456_print.csv and V1511100466_print.csv

All the files are zipped in one folder (sample files, a table structure image, and the T-SQL code). Pcut_files and sm_files have headers. Eq_files have one problematic field with very long values.

For the pcut_files, the first header row is unnecessary. If possible, I would like to have an Identity field with an Identity Increment for all those tables.
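By an Identity field, I mean something along these lines (I'm not sure I have the syntax right, and the table name here is just an example):

CREATE TABLE t_pcut
(
RowID INT IDENTITY(1,1) NOT NULL, -- auto-incrementing row number
-- ...the rest of the columns from the sample file...
cSomeColumn VARCHAR(MAX)
);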

I tried CREATE TABLE t_eq_temp and then BULK INSERT into it in order to SELECT INTO the t_equitrac table, but it didn't work: I can't even remove the double quotes, and the cDocumentName field gets truncated. For eq_sample_1.csv, a BULK INSERT works with all fields as VARCHAR(MAX), but I still have the unnecessary double quotes, and the cDocumentName value gets split with part of it spilling into the next fields. The real files have more than 100k rows; I can send them by e-mail if you prefer. Is there a way to avoid duplicated rows in case I insert the same file over again?
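For the duplicates, what I was picturing (no idea if this is the right approach) was loading the file into the t_eq_temp staging table first and then only copying over the rows that aren't already in t_equitrac, something like this (the key columns are just a guess on my part):

INSERT INTO t_equitrac (cDocumentID, cTransactDate /* ...other columns... */)
SELECT s.cDocumentID, s.cTransactDate /* ...other columns... */
  FROM t_eq_temp s
 WHERE NOT EXISTS (SELECT 1
                     FROM t_equitrac t
                    WHERE t.cDocumentID   = s.cDocumentID
                      AND t.cTransactDate = s.cTransactDate);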

All files are in their original format now. To be honest, I didn't imagine we would get this far with this help, and I am very glad for that.

Many Thanks again!

Andre


  Post Attachments 
Temp.zip (1 view, 104.72 KB)
Post #1400129
Posted Thursday, December 27, 2012 2:05 AM
SSC Rookie
Up!!!
Post #1400499
Posted Thursday, December 27, 2012 11:29 AM
SSCommitted
There are report mining tools for this. I sued Monarch years ago, and you can see an old video at:

http://www.youtube.com/watch?v=MxTGHlsyJso

Another product is:

http://www.astera.com/solutions/technology-solutions/report-mining?gclid=CO_klOmZu7QCFeiPPAodMxoAxQ

which has a free download.

Basically, you pull up the text of the report on a screen, pick the fields you want to pull off, and load them into a file that can be loaded into a database. Nice, simple GUI interfaces. They correct the date formats, ignore report headers, make simple decisions, etc. Astera has a free version.


Books in Celko Series for Morgan-Kaufmann Publishing
Analytics and OLAP in SQL
Data and Databases: Concepts in Practice
Data, Measurements and Standards in SQL
SQL for Smarties
SQL Programming Style
SQL Puzzles and Answers
Thinking in Sets
Trees and Hierarchies in SQL
Post #1400715
Posted Thursday, December 27, 2012 3:44 PM
SSC Rookie
Hi Celko,

Thank you for your message. I just submitted a download form and a message said that an Astera Software representative will be in contact shortly.
Post #1400750
Posted Thursday, December 27, 2012 4:55 PM
SSC-Dedicated
CELKO (12/27/2012)
I sued Monarch years ago,...


Why did you need to sue them?


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1400756
Posted Thursday, December 27, 2012 5:02 PM
SSC-Dedicated
urso47 (12/25/2012)
Thanks Jeff,

I will use these directories and files:
C:\Temp\Eq_files\Eq_sample_1.csv and Eq_sample_2.csv
C:\Temp\sync_files\sync_sample_1.csv and sync_sample_2.csv
C:\Temp\pcut_files\pcut_sample_1.csv and pcut_sample_2.csv
C:\Temp\sm_files\V151110456_print.csv and V1511100466_print.csv

All the files are zipped in one folder (sample files, a table structure image, and the T-SQL code). Pcut_files and sm_files have headers. Eq_files have one problematic field with very long values.

For the pcut_files, the first header row is unnecessary. If possible, I would like to have an Identity field with an Identity Increment for all those tables.

I tried CREATE TABLE t_eq_temp and then BULK INSERT into it in order to SELECT INTO the t_equitrac table, but it didn't work: I can't even remove the double quotes, and the cDocumentName field gets truncated. For eq_sample_1.csv, a BULK INSERT works with all fields as VARCHAR(MAX), but I still have the unnecessary double quotes, and the cDocumentName value gets split with part of it spilling into the next fields. The real files have more than 100k rows; I can send them by e-mail if you prefer. Is there a way to avoid duplicated rows in case I insert the same file over again?

All files are in their original format now. To be honest, I didn't imagine we would get this far with this help, and I am very glad for that.

Many Thanks again!

Andre


I need the CREATE TABLE statements for each of these file types. JPGs won't do me any good; I need the code itself. You can save them as TXT or SQL files. Either is fine with me.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1400757
Posted Thursday, December 27, 2012 7:08 PM
SSC Rookie

I need the CREATE TABLE statements for each of these file types. JPGs won't do me any good; I need the code itself. You can save them as TXT or SQL files. Either is fine with me.


Hi Jeff,

Thanks for your response. I just uploaded the statements and the sample files again.


  Post Attachments 
Forum.zip (5 views, 13.39 KB)
Post #1400774
Posted Saturday, December 29, 2012 8:16 PM
SSC Rookie
Hi guys,

I have been struggling with this for the last two days, but I can't make it work. Could anyone help me with this statement?

-- TABLE EQUIT
USE TEMPDB
GO

CREATE TABLE Equit
(
cGroupID INT,
cGroupIDBillable INT,
cGroupName VARCHAR(MAX),
cGroupDescription VARCHAR(MAX),
cRowID INT,
cRowName VARCHAR(MAX),
cRowDescription VARCHAR(MAX),
cTransactDate SMALLDATETIME,
cTransactType VARCHAR(MAX),
cChargeAccountID VARCHAR(MAX),
cChargeAccountType VARCHAR(MAX),
cUserWhoPrinted VARCHAR(MAX),
cDocumentName VARCHAR(MAX),
cUnits INT,
cDocumentID VARCHAR(MAX),
cDeviceIP VARCHAR(MAX),
cDevModel VARCHAR(MAX),
cDevManuf VARCHAR(MAX),
cDuration VARCHAR(MAX),
cDestination VARCHAR(MAX),
cFullName VARCHAR(MAX),
cPrimaryPIN VARCHAR(MAX),
cWorkstation VARCHAR(MAX),
cAmount INT,
JobProperties VARCHAR(MAX),
cAltCost INT,
cDepartment VARCHAR(MAX),
cBillable VARCHAR(MAX),
cAlternatePin VARCHAR(MAX)
)
GO

-- FORMAT FILE FORMAT.EQUIT.FMT
10.0
30
1 SQLCHAR 0 0 "\"" 0 Line SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 0 "\"," 1 cGroupID SQL_Latin1_General_CP1_CI_AS
3 SQLCHAR 0 0 "\"," 2 cGroupIDBillable SQL_Latin1_General_CP1_CI_AS
4 SQLCHAR 2 0 ",\"" 3 cGroupName SQL_Latin1_General_CP1_CI_AS
5 SQLCHAR 2 0 ",\"" 4 cGroupDescription SQL_Latin1_General_CP1_CI_AS
6 SQLCHAR 2 0 ",\"" 5 cRowID SQL_Latin1_General_CP1_CI_AS
7 SQLCHAR 2 0 ",\"" 6 cRowName SQL_Latin1_General_CP1_CI_AS
8 SQLCHAR 2 0 ",\"" 7 cRowDescription SQL_Latin1_General_CP1_CI_AS
9 SQLCHAR 2 0 ",\"" 8 cTransactDate SQL_Latin1_General_CP1_CI_AS
10 SQLCHAR 2 0 ",\"" 9 cTransactType SQL_Latin1_General_CP1_CI_AS
11 SQLCHAR 2 0 ",\"" 10 cChargeAccountID SQL_Latin1_General_CP1_CI_AS
12 SQLCHAR 2 0 ",\"" 11 cChargeAccountType SQL_Latin1_General_CP1_CI_AS
13 SQLCHAR 2 0 ",\"" 12 cUserWhoPrinted SQL_Latin1_General_CP1_CI_AS
14 SQLCHAR 2 0 ",\"" 13 cDocumentName SQL_Latin1_General_CP1_CI_AS
15 SQLCHAR 2 0 "\"," 14 cUnits SQL_Latin1_General_CP1_CI_AS
16 SQLCHAR 2 0 ",\"" 15 cDocumentID SQL_Latin1_General_CP1_CI_AS
17 SQLCHAR 2 0 ",\"" 16 cDeviceIP SQL_Latin1_General_CP1_CI_AS
18 SQLCHAR 2 0 ",\"" 17 cDevModel SQL_Latin1_General_CP1_CI_AS
19 SQLCHAR 2 0 ",\"" 18 cDevManuf SQL_Latin1_General_CP1_CI_AS
20 SQLCHAR 2 0 ",\"" 19 cDuration SQL_Latin1_General_CP1_CI_AS
21 SQLCHAR 2 0 ",\"" 20 cDestination SQL_Latin1_General_CP1_CI_AS
22 SQLCHAR 2 0 ",\"" 21 cFullName SQL_Latin1_General_CP1_CI_AS
23 SQLCHAR 2 0 ",\"" 22 cPrimaryPIN SQL_Latin1_General_CP1_CI_AS
24 SQLCHAR 2 0 ",\"" 23 cWorkstation SQL_Latin1_General_CP1_CI_AS
25 SQLCHAR 2 0 "\"," 24 cAmount SQL_Latin1_General_CP1_CI_AS
26 SQLCHAR 2 0 ",\"" 25 JobProperties SQL_Latin1_General_CP1_CI_AS
27 SQLCHAR 2 0 "\"," 26 cAltCost SQL_Latin1_General_CP1_CI_AS
28 SQLCHAR 2 0 ",\"" 27 cDepartment SQL_Latin1_General_CP1_CI_AS
29 SQLCHAR 2 0 ",\"" 28 cBillable SQL_Latin1_General_CP1_CI_AS
30 SQLCHAR 2 0 "\"\r" 29 cAlternatePin SQL_Latin1_General_CP1_CI_AS
--=== BLANK LINE



MASTER..XP_CMDSHELL 'bcp EQUIT in c:\temp\equit.csv -f c:\temp\equitfmt.fmt'
BULK INSERT EQUIT FROM 'c:\temp\equit.csv' WITH (FORMATFILE = 'c:\temp\equitfmt.fmt')

Error:
(13 row(s) affected)
Msg 4864, Level 16, State 1, Line 2
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (cGroupID).
Msg 4832, Level 16, State 1, Line 2
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 2
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".


I also tried this one but I got the same error.


DECLARE @CAMINHO VARCHAR(256), @SQL VARCHAR (1000)
SET @CAMINHO = 'C:\TEMP\Equit.csv'
SET @SQL = 'BULK INSERT EQUIT
FROM ''' + @CAMINHO + '''
WITH (FIELDTERMINATOR = '','', CODEPAGE =''ACP'', ROWTERMINATOR='''')'
EXEC (@SQL)

The files used are attached.

Thanks in advance!


  Post Attachments 
equit_files.zip (1 view, 13.45 KB)
Post #1401223
Posted Saturday, December 29, 2012 8:21 PM
SSC Rookie

I need the CREATE TABLE statements for each of these file types. JPGs won't do me any good; I need the code itself. You can save them as TXT or SQL files. Either is fine with me.


Hi Jeff,

Thanks for your response. I just uploaded the statements and the sample files again.


  Post Attachments 
Forum.zip (1 view, 13.39 KB)
Post #1401224