
Working with stored text file in TSQL

Jeff Moden (SSC Guru):
I'm still a bit confused, though. You say the user is uploading a file... what is that process? Is the user uploading a file to the web server and then your app is loading from that file, or ??? I need to understand what the current process and file location is so I can try to come up with a better alternative.

--Jeff Moden
RBAR is pronounced "ree-bar" and is a Modenism for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code: Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." -- Red Adair
Helpful links: How to post code problems | How to post performance problems | Forum FAQs

keith.westberg (Grasshopper):
Yes, Jeff. We are using ASP.NET 3.5 to present the users/analysts with the tools they require. The data provider uploads files on the 1st of the month. We use the native upload control for .NET. This control places the file in a predefined folder within the app's folder structure. Once it successfully saves to this folder, we begin walking the file to generate the values required for the sproc. When this walk reaches the end of the file, we close the file and delete it.

Did I cover the details enough? Besides what's mentioned above, there really isn't much to it. We don't use AJAX or any other eye candy, so the controls and logic are pretty straightforward.

Keith

Jeff Moden (SSC Guru):
If that's the case, then you could use SqlBulkCopy in C# to load the rows into the database in batches, much like BCP or BULK INSERT would. That will be MUCH faster than doing an INSERT per row.
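For reference, the server-side analogue Jeff mentions looks roughly like the sketch below. This is a hedged example, not the poster's code: the table name, file path, and delimiters are hypothetical placeholders for Keith's monthly feed.

```sql
-- BULK INSERT loads the whole file in batches instead of one INSERT per row,
-- which is the same behavior SqlBulkCopy gives you from C#.
-- Table and file names here are hypothetical placeholders.
BULK INSERT dbo.MonthlyFeed
FROM 'D:\Uploads\feed.txt'
WITH (
    FIELDTERMINATOR = ',',    -- column delimiter in the text file
    ROWTERMINATOR   = '\n',   -- line delimiter
    FIRSTROW        = 2,      -- skip a header row, if present
    BATCHSIZE       = 50000,  -- commit in batches rather than all-or-nothing
    TABLOCK                   -- allows a minimally logged load where eligible
);
```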
keith.westberg (Grasshopper):
Well, that looks promising. Let me do some reading and see where it leads. I think I may have looked at that once long ago but shelved it because I needed to do row-by-row validation, and this, if I recall, was a firehose with an all-or-nothing approach. Let me do some Google CSI and see how things have changed, if at all.

Jeff Moden (SSC Guru):
keith.westberg (3/25/2014): "Well, that looks promising. Let me do some reading and see where it leads..."

If you need to, you can still do "row-by-row" validations. Load 10K or 50K rows into an array, validate the data, and bulk it over to the server.

A better way (IMHO) is to bulk all of it over to a staging table on the server and validate it there. Then copy the validated rows to their final resting places and bad data to an errata table for repairs or failure reporting.
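The staging-table pattern Jeff describes can be sketched as below. This is a minimal illustration, not the poster's code: all table, column, and rule names are hypothetical, and the one validation rule shown is just an example. (TRY_CONVERT requires SQL Server 2012 or later; on older versions, a CASE expression with ISNUMERIC is a rough substitute.)

```sql
-- 1. Bulk the raw rows into a staging table (via SqlBulkCopy or BULK INSERT).
--    Staging columns are loose nvarchar so bad data can land without load errors.
CREATE TABLE dbo.FeedStaging (
    cola nvarchar(50),
    colb nvarchar(50),
    colc nvarchar(50)
);

-- 2. Copy rows that pass validation to the final table...
INSERT INTO dbo.Feed (cola, colb, colc)
SELECT cola, colb, colc
FROM dbo.FeedStaging
WHERE TRY_CONVERT(int, colb) IS NOT NULL;   -- example rule: colb must be an integer

-- 3. ...and rows that fail to an errata table for repair or failure reporting.
INSERT INTO dbo.FeedErrata (cola, colb, colc, reason)
SELECT cola, colb, colc, 'colb is not an integer'
FROM dbo.FeedStaging
WHERE TRY_CONVERT(int, colb) IS NULL;
```

The advantage of validating server-side is that both steps are set-based: each rule is one query over the whole staging table rather than a per-row check in application code.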
ScottPletcher (SSCoach):
Gotta admit, I prefer working in SQL/T-SQL, so even for (moderately) large text files I'll often just do an:

OPENROWSET(BULK '', SINGLE_CLOB) -- or SINGLE_NCLOB, if req'd

to load the whole file at once into a [n]varchar(max) variable or table column, then process it from there. Maybe that can be of some help in your case as well.

[IIRC, that method doesn't optimize logging, so you might want to use a temp table when possible, or always if the file is very large.]

SQL DBA, SQL Server MVP ('07, '08, '09)
Prosecutor James Blackburn, in closing argument in the Fatal Vision murders trial: "If in the future, you should cry a tear, cry one for them [the murder victims]. If in the future, you should say a prayer, say one for them. And if in the future, you should light a candle, light one for them."

keith.westberg (Grasshopper):
@SSCommitted: I would sure love to see an example of what you mentioned. That's exactly what I envisioned, but I cannot for the life of me find an example of working with an uploaded file directly and walking it via T-SQL. My Oracle friend says it's cake, but ever since he made that comment, he went off the grid... =/

keith.westberg (Grasshopper):
@ScottPletcher: haha... the above response is for you. That's a dead giveaway that I'm new around here.

ScottPletcher (SSCoach):
Roughly like below.
You may have to adjust SINGLE_NCLOB to SINGLE_CLOB, depending on the format of the input file.

```sql
DECLARE @file_text nvarchar(max)  -- nvarchar to match SINGLE_NCLOB; use varchar(max) with SINGLE_CLOB

SELECT @file_text = file_text
FROM OPENROWSET(BULK 'd:\full\path\to\file\filename.txt', SINGLE_NCLOB) AS file_bulk(file_text)

--(Or)--
--INSERT INTO dbo.tablename ( file_name, file_text )
--SELECT 'File1' AS file_name, file_text FROM ...

SELECT @file_text
```

keith.westberg (Grasshopper):
@ScottPletcher: Thank you, Scott. I apparently misunderstood you. I assumed you meant that you preferred to generate the loading of records by moving the file to the db and parsing it to produce a flow of inserts/updates per line. That was my original vision. Apparently it's not a common thing, because I cannot find a single post where, given the following text data, a routine would parse and exec db calls.

Payload: testing.txt stored in an nvarchar(max)

cola,colb,colc
a,b,b
s,d,f

SPROC would generate:

exec insert into TBL (cola,colb,colc) values (a,b,b);
exec update TBL set cola = a, colb=b, colc=b;
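For what it's worth, "walking" the loaded text line by line in pure T-SQL can be sketched as below, using Keith's sample payload and his table name TBL; the loop structure and the PARSENAME splitting trick are assumptions, not anything posted in the thread. Note this is exactly the row-by-row processing Jeff warns against, so for large files the staging-table approach will be much faster.

```sql
-- Assumes the file is already in @file_text via OPENROWSET ... SINGLE_NCLOB.
-- Sample payload inlined here for illustration.
DECLARE @file_text nvarchar(max) = N'cola,colb,colc
a,b,b
s,d,f';

DECLARE @line nvarchar(4000), @pos int;

-- Skip the header line.
SET @pos = CHARINDEX(CHAR(10), @file_text);
SET @file_text = SUBSTRING(@file_text, @pos + 1, LEN(@file_text));

WHILE LEN(@file_text) > 0
BEGIN
    -- Peel off the next line.
    SET @pos = CHARINDEX(CHAR(10), @file_text);
    IF @pos = 0
        SELECT @line = @file_text, @file_text = N'';
    ELSE
        SELECT @line = LEFT(@file_text, @pos - 1),
               @file_text = SUBSTRING(@file_text, @pos + 1, LEN(@file_text));

    SET @line = REPLACE(@line, CHAR(13), N'');  -- strip CR if the file is CRLF

    -- PARSENAME splits up to four dot-separated parts, so swap commas to dots.
    -- (Only safe when the values themselves contain no dots or commas.)
    INSERT INTO TBL (cola, colb, colc)
    SELECT PARSENAME(REPLACE(@line, ',', '.'), 3),
           PARSENAME(REPLACE(@line, ',', '.'), 2),
           PARSENAME(REPLACE(@line, ',', '.'), 1);
END;
```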