SQLServerCentral is supported by Red Gate Software Ltd.

Working with a stored text file in T-SQL
Posted Sunday, March 23, 2014 9:57 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 11:31 AM
Points: 36,728, Visits: 31,180
I'm still a bit confused though. You say the user is uploading a file... what is that process? Is the user uploading a file to the Web Server and then your app is loading from that file or ???

I need to understand what the current process and file location is so I can try to come up with a better alternative.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1553823
Posted Sunday, March 23, 2014 2:09 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, June 19, 2014 10:14 AM
Points: 11, Visits: 44
Yes, Jeff. We are using ASP.NET 3.5 to present the users/analysts with the tools they require. The data provider uploads files on the 1st of the month. We use the native upload control for .NET. This control places the file in a predefined folder within the app's folder structure. Once the file successfully saves to this folder, we begin walking it to generate the values required for the sproc. When the walk reaches the end of the file, we close the file and delete it.

Did I cover the details enough? Besides what's mentioned above, there really isn't much to it. We don't use AJAX or any other eye candy, so the controls and logic are pretty straightforward.

Keith
Post #1553839
Posted Monday, March 24, 2014 5:19 AM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 11:31 AM
Points: 36,728, Visits: 31,180
If that's the case, then you could use SqlBulkCopy in C# to load the rows into the database in batches, much like BCP or BULK INSERT would. That will be MUCH faster than doing an INSERT per row.
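To make the batch-loading idea concrete, here is a minimal sketch of the server-side analogue Jeff mentions (BULK INSERT). The path, table name, and option values are illustrative assumptions, not the poster's actual setup:

```sql
-- Hypothetical sketch: load the uploaded file in batches on the server side.
-- dbo.Staging, the path, and the batch size are placeholders.
BULK INSERT dbo.Staging
FROM 'd:\uploads\monthly_file.txt'
WITH (
    FIELDTERMINATOR = ',',     -- column delimiter in the file
    ROWTERMINATOR   = '\n',    -- row delimiter
    FIRSTROW        = 2,       -- skip the header line
    BATCHSIZE       = 50000,   -- commit in batches, much like BCP
    TABLOCK                    -- table lock; enables minimally logged loads
);
```

SqlBulkCopy in the C# layer gives the same batched behavior from the app side, so either end of the pipeline can own the load.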

--Jeff Moden
Post #1553965
Posted Tuesday, March 25, 2014 12:05 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, June 19, 2014 10:14 AM
Points: 11, Visits: 44
Well, that looks promising. Let me do some reading and see where it leads. I think I may have looked at that once long ago, but shelved it because I needed to do row-by-row validation and, if I recall, it was a firehose with an all-or-nothing approach.

Let me do some Google CSI and see how things have changed, if at all.
Post #1554629
Posted Tuesday, March 25, 2014 2:59 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Today @ 11:31 AM
Points: 36,728, Visits: 31,180
keith.westberg (3/25/2014)
Well, that looks promising. Let me do some reading and see where it leads. I think I may have looked at that once long ago, but shelved it because I needed to do row-by-row validation and, if I recall, it was a firehose with an all-or-nothing approach.

Let me do some Google CSI and see how things have changed, if at all.


If you need to, you can still do "row-by-row" validations. Load 10K or 50K rows into an array, validate the data, and bulk it over to the server.

A better way (IMHO) is to bulk all of it over to a staging table on the server and validate it there. Then copy the validated rows to their final resting places and bad data to an errata table for repairs or failure reporting.
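A minimal sketch of that staging-table pattern, with illustrative table names and made-up validation rules (the real rules would come from the business requirements):

```sql
-- Hypothetical sketch: everything has already been bulked into dbo.Staging.
-- Route rows that pass validation to the final table, set-based.
INSERT INTO dbo.FinalTable (cola, colb, colc)
SELECT cola, colb, colc
FROM dbo.Staging
WHERE cola IS NOT NULL AND LEN(colb) <= 10;   -- example validation rules

-- Route the complement to an errata table for repair or failure reporting.
INSERT INTO dbo.StagingErrata (cola, colb, colc, load_date)
SELECT cola, colb, colc, GETDATE()
FROM dbo.Staging
WHERE cola IS NULL OR LEN(colb) > 10;         -- the negation of the rules above
```

Because both statements are set-based, the per-row validation cost disappears while every bad row is still accounted for.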



--Jeff Moden
Post #1554693
Posted Tuesday, March 25, 2014 4:05 PM
SSCommitted


Group: General Forum Members
Last Login: Today @ 8:59 AM
Points: 1,960, Visits: 2,894
Gotta' admit, I prefer working in SQL/TSQL, so even for (moderately) large text files I'll often just do an:

OPENROWSET(BULK '<path_and_file_name>', SINGLE_CLOB) --or NCLOB, if req'd

to load the whole file at once into a [n]varchar(max) variable or table column, then process it from there. Maybe that can be of some help in your case as well.

[IIRC, that method doesn't optimize logging, so you might want to use a temp table when possible, or always if the file is very large.]


SQL DBA,SQL Server MVP('07, '08, '09)
"In America, every man is innocent until proven broke!" Brant Parker
Post #1554712
Posted Tuesday, March 25, 2014 7:05 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, June 19, 2014 10:14 AM
Points: 11, Visits: 44
@SSCommitted: I would sure love to see an example of what you mentioned. That's exactly what I envisioned, but I cannot for the life of me find an example of working with an uploaded file directly and walking it via T-SQL. My Oracle friend says it's cake, but ever since he made that comment, he went off the grid... =/
Post #1554738
Posted Tuesday, March 25, 2014 7:08 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, June 19, 2014 10:14 AM
Points: 11, Visits: 44
@ScottPletcher: haha... the above response is for you. That's a dead giveaway that I'm new around here.
Post #1554740
Posted Wednesday, March 26, 2014 9:10 AM
SSCommitted


Group: General Forum Members
Last Login: Today @ 8:59 AM
Points: 1,960, Visits: 2,894
Roughly like below. You may have to adjust the "SINGLE_NCLOB" to "SINGLE_CLOB", depending on the format of the input file.


DECLARE @file_text nvarchar(max)  -- nvarchar(max) to match SINGLE_NCLOB; use varchar(max) with SINGLE_CLOB

SELECT @file_text = file_text
FROM OPENROWSET(BULK 'd:\full\path\to\file\filename.txt', SINGLE_NCLOB) AS file_bulk(file_text)
--(Or)
--INSERT INTO dbo.tablename ( file_name, file_text )
--SELECT 'File1' AS file_name, file_text FROM ...

SELECT @file_text




SQL DBA, SQL Server MVP ('07, '08, '09)
Post #1555026
Posted Wednesday, March 26, 2014 10:24 AM
Grasshopper


Group: General Forum Members
Last Login: Thursday, June 19, 2014 10:14 AM
Points: 11, Visits: 44
@ScottPletcher: Thank you, Scott. I apparently misunderstood you. I assumed you meant that you preferred to load the records by moving the file to the DB and parsing it to produce a stream of inserts/updates per line.

That was my original vision. Apparently it's not a common thing, because I cannot find a single post where, given the following text data, a routine would parse it and execute DB calls.

Payload: testing.txt stored in an nvarchar(max)
cola,colb,colc
a,b,b
s,d,f

The sproc would generate:
INSERT INTO TBL (cola, colb, colc) VALUES ('a', 'b', 'b');
UPDATE TBL SET cola = 'a', colb = 'b', colc = 'b';
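For what it's worth, walking a payload like that in T-SQL can be done without generating one statement per line: split the text into lines, then do a single set-based insert. A minimal sketch, assuming a three-column comma-delimited payload with a header row and a hypothetical target table TBL (SQL Server 2008-era, so no built-in STRING_SPLIT):

```sql
-- Hypothetical sketch: walk a CSV payload held in an nvarchar(max) variable
-- and load it with one set-based INSERT instead of per-row statements.
DECLARE @file_text nvarchar(max) = N'cola,colb,colc
a,b,b
s,d,f';

DECLARE @lines TABLE (line_no int IDENTITY(1,1), line nvarchar(4000));
DECLARE @pos int = 1, @next int;

SET @file_text = REPLACE(@file_text, CHAR(13), '');   -- normalize CRLF to LF

-- Break the blob into one row per line.
WHILE @pos <= LEN(@file_text)
BEGIN
    SET @next = CHARINDEX(CHAR(10), @file_text, @pos);
    IF @next = 0 SET @next = LEN(@file_text) + 1;     -- last line has no trailing LF
    INSERT INTO @lines (line) VALUES (SUBSTRING(@file_text, @pos, @next - @pos));
    SET @pos = @next + 1;
END;

-- One set-based insert: split each data line on its two commas, skip the header.
INSERT INTO TBL (cola, colb, colc)
SELECT LEFT(line, c1 - 1),
       SUBSTRING(line, c1 + 1, c2 - c1 - 1),
       SUBSTRING(line, c2 + 1, 4000)
FROM (SELECT line,
             CHARINDEX(',', line) AS c1,
             CHARINDEX(',', line, CHARINDEX(',', line) + 1) AS c2
      FROM @lines
      WHERE line_no > 1) AS s;
```

The line-splitting loop could itself be replaced by a tally-table splitter for large payloads; the point is that the final load is one INSERT, not one call per row.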



Post #1555068