SQLServerCentral is supported by Red Gate Software Ltd.

SQL CLR .Net stored procedure and threading
Author
Message
Posted Wednesday, June 11, 2008 12:39 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, June 18, 2008 10:12 AM
Points: 2, Visits: 8
Hi there,
We are trying to implement a solution using .NET CLR in SQL Server 2005. The current system sends 100,000 rows of data to the database in one mass operation, and the stored procedure sometimes times out while processing the data.
We have been contemplating sending the data in batches, calling the stored procedure per batch, and consolidating the results before sending them to the UI. But this also has a performance cost, because we would need a table lock on the table while processing all the records the user sends (100,000). That would only work in a single-user scenario, and we need to support concurrent usage.
I was just going through the SQL 2005 .NET CLR integration and wondering whether a CLR stored procedure could be written to process the data in chunks. The UI would send the dataset to the CLR stored procedure, and the CLR SP would run threads to send the data in batches of 1,000 and yield when there is an exception. That way we are not holding up the UI for the time it takes the stored procedure to process all the data, and the CLR could report some kind of progress back to the UI. Is this solution viable in SQL CLR .NET?
If so, are there any samples I can look at for thread programming in SQL CLR?
Thanks
John
Post #514958
Posted Wednesday, June 11, 2008 10:05 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 8:41 PM
Points: 7,122, Visits: 15,028
I'm not sure why you'd need to try to thread this. Assuming you're doing this with a Bulk Insert process, BCP, or a CLR SqlBulkCopy object, 100K rows should be inserted in the blink of an eye (unless the rows themselves are huge, which would slow down any process you set up).

Try taking a look at the SqlBulkCopy object. One way or another, it should be your friend. If you find that 100,000 rows at once is too slow, consider lowering the timeout and/or setting the batch size to a smaller increment.
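To make that concrete, here is a minimal sketch of using SqlBulkCopy with an explicit batch size and timeout. The connection string, table name, and DataTable variable are placeholders invented for illustration, not from the original post:

```csharp
// Hypothetical sketch: "connStr", "dbo.TargetTable", and "sourceData"
// are placeholder names, not from the thread.
using System.Data;
using System.Data.SqlClient;

static void BulkLoad(string connStr, DataTable sourceData)
{
    using (SqlConnection conn = new SqlConnection(connStr))
    {
        conn.Open();
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.TargetTable";
            bulk.BatchSize = 1000;      // commit every 1,000 rows
            bulk.BulkCopyTimeout = 30;  // seconds before the copy times out
            bulk.WriteToServer(sourceData);
        }
    }
}
```

With BatchSize set, each batch is a separate round trip, so a failure or timeout only affects the current batch rather than all 100K rows.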

http://msdn.microsoft.com/en-us/library/7ek5da1a(VS.80).aspx


----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
Post #515329
Posted Wednesday, June 11, 2008 10:38 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, June 18, 2008 10:12 AM
Points: 2, Visits: 8
Hi there,
Thanks for your thoughts. Let me explain the problem scenario.
We have a BOMWALKDOWN, which is a lengthy process: every child in the BOM is exploded, and an impact to one item ripples to a lot of children. We also do a BOMWALKUP, where the parents get affected. The result is quite a pain for the customers: they have to wait more than 5 minutes, or sometimes they never get the results at all because of timeout issues.
So what we proposed is that the user sends the data in chunks of 5% of the load each time, and the SP does a BOMWALKDOWN or BOMWALKUP for those materials. Once that completes, the client sends the next 5% of the data to the same SP. That would be OK, but the overhead is maintaining a counter for the transaction the whole time. So I was thinking that if we use CLR for the same task, the client need not bother maintaining a counter for the batch mode: we could spawn threads to call the SP, all the data could be consumed at once by the CLR SP, and it would in turn call the transaction SP (which does the real BOMWALKDOWN or BOMWALKUP).
By the way, we are not doing any BULK INSERT in these scenarios; it is just a SELECT that is sent back to the client (internally the SP uses table variables to store the impacts and joins against different tables).
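The chunk-and-counter scheme described above can live entirely on the client, without CLR threading. Below is a hedged sketch of that loop; the procedure name "dbo.BomWalk", the "@MaterialList" parameter, and the comma-delimited list format are all assumptions for illustration (SQL Server 2005 has no table-valued parameters, so a delimited string is one common workaround):

```csharp
// Hypothetical client-side chunking sketch; "dbo.BomWalk" and its
// parameter are invented names, not from the thread.
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static void RunInChunks(string connStr, IList<int> materialIds, int chunkSize)
{
    using (SqlConnection conn = new SqlConnection(connStr))
    {
        conn.Open();
        for (int i = 0; i < materialIds.Count; i += chunkSize)
        {
            int count = Math.Min(chunkSize, materialIds.Count - i);
            string[] chunk = new string[count];
            for (int j = 0; j < count; j++)
                chunk[j] = materialIds[i + j].ToString();

            using (SqlCommand cmd = new SqlCommand("dbo.BomWalk", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@MaterialList",
                                            string.Join(",", chunk));
                cmd.ExecuteNonQuery();
            }
            // The loop index itself is the "counter": report
            // (i + count) / materialIds.Count to the UI as progress.
        }
    }
}
```

The loop index doubles as the batch counter, so no extra bookkeeping is needed on either side.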
Thanks
John
Post #515357
Posted Wednesday, June 11, 2008 8:07 PM


SSCrazy Eights


Group: General Forum Members
Last Login: Thursday, June 5, 2014 10:54 AM
Points: 9,902, Visits: 9,480
This really sounds like it should be a Service Broker application. That would allow you to decouple the WALKDOWN / WALKUP processing from the sending of the 100,000 rows.

-- RBarryYoung, (302)375-0451 blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc.
"Performance is our middle name."
Post #515610
Posted Thursday, June 12, 2008 8:50 AM
SSC-Addicted


Group: General Forum Members
Last Login: Thursday, July 24, 2014 5:08 AM
Points: 424, Visits: 395
It would be interesting to know whether you can even spawn threads in SQL CLR; SQL Server already has its own multi-threaded load distribution in place for query processing.

It sounds like what you really need is to run asynchronous queries on the client. The client platform may or may not have this capability, but even if you break the task up into chunks, you still need to relieve the UI from waiting on the query execution. In .NET 2.0 and later, check out SqlCommand.BeginExecuteNonQuery.
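A rough sketch of that asynchronous pattern is below. The procedure call and parameter are invented placeholders; note that in .NET 2.0 the connection string must include "Asynchronous Processing=true" for the Begin/End methods to work:

```csharp
// Hedged sketch of SqlCommand.BeginExecuteNonQuery; the procedure name
// and parameter value are placeholders, not from the thread.
using System;
using System.Data.SqlClient;
using System.Threading;

static void RunAsync(string connStr)
{
    using (SqlConnection conn =
        new SqlConnection(connStr + ";Asynchronous Processing=true"))
    {
        conn.Open();
        SqlCommand cmd = new SqlCommand(
            "EXEC dbo.BomWalk @MaterialList = @list", conn);
        cmd.Parameters.AddWithValue("@list", "1,2,3");

        IAsyncResult ar = cmd.BeginExecuteNonQuery();
        while (!ar.IsCompleted)
        {
            // The UI thread is free here: pump messages, update a
            // progress indicator, etc., instead of blocking.
            Thread.Sleep(100);
        }
        int rows = cmd.EndExecuteNonQuery(ar);
    }
}
```

In a real UI you would use the callback overload of BeginExecuteNonQuery rather than polling, but the structure is the same: the long-running SP no longer blocks the user.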

That doesn't solve your problem, but it will make it more bearable to your users.
Post #515992