dosomethingbig & dosomethingsmall

  • Hi All,

    I have a simple problem that I guess most of us have but might tend to ignore. We have an ASP.NET application, and once a week we need to call a "schedule" stored procedure from it. This should have been an off-hours process, but we were not able to do that because of some business rules. It simply calls a stored procedure that takes approx. 1.5 hours to complete. When we call this procedure, SQL Server CPU utilization goes to 100% and even our simple queries become unresponsive.

    Can we somehow tell this particular stored procedure not to take more than 50% of CPU cycles (SQL Server resources), so that if, say, someone asks SQL Server to check login credentials, that can still be done while this SP is running?

    [50% CPU]

    create procedure dosomethingbig
    as
    begin
        -- heavy weekly work here
    end

    [20% CPU]

    create procedure dosomethingsmall
    as
    begin
        -- everyday lightweight work here
    end

    The fake configuration above is what I am looking for: something like a procedure execution priority, where dosomethingbig has priority 3 and dosomethingsmall has priority 1.

    Any suggestions?

    This is what all producers of poorly performing code ask - sorry! Tune your stored procedure; you can't assign CPU, thank goodness.

    [font="Comic Sans MS"]The GrumpyOldDBA[/font]
    www.grumpyolddba.co.uk
    http://sqlblogcasts.com/blogs/grumpyolddba/

    Have to agree with Colin.  Trying to optimise performance by boosting hardware, or by finding ways to control hardware allocation, is always going to show up flaws down the line.

    If I were in your position, I would run a Profiler trace against the stored procedure.  Identify the worst lines of code and hit those first.

    If you are struggling, please post the problem lines of code in this post and we'll do our best to help you tune the code.

    If it's taking 1.5 hours to complete, I bet it's got a row-by-row cursor on a big table, and whatever it is doing might be optimized into one or more set-based operations, which of course would be faster.
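    A minimal sketch of the kind of rewrite meant here; the Orders table, its columns, and the predicate are all hypothetical, just to show the shape of the change:

    ```sql
    -- Row-by-row cursor version: one round trip per row.
    DECLARE @id int;
    DECLARE c CURSOR LOCAL FAST_FORWARD FOR
        SELECT OrderID FROM Orders WHERE Status = 'Open';
    OPEN c;
    FETCH NEXT FROM c INTO @id;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        UPDATE Orders SET Status = 'Closed' WHERE OrderID = @id;
        FETCH NEXT FROM c INTO @id;
    END;
    CLOSE c;
    DEALLOCATE c;

    -- Equivalent set-based version: one statement, one pass.
    UPDATE Orders SET Status = 'Closed' WHERE Status = 'Open';
    ```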

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE statement and INSERT INTO statements to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

    You obviously haven't met some of the developers I have!  heh heh

    [font="Comic Sans MS"]The GrumpyOldDBA[/font]
    www.grumpyolddba.co.uk
    http://sqlblogcasts.com/blogs/grumpyolddba/

    Display an execution plan for the SP to look for code that needs optimizing (i.e. table scans...).
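    One low-cost way to get a plan without actually running the 1.5-hour job is to request the estimated plan text first (procedure name taken from the original post):

    ```sql
    -- With SHOWPLAN_TEXT ON, statements are compiled but not executed;
    -- SQL Server returns the estimated plan text instead.
    SET SHOWPLAN_TEXT ON;
    GO
    EXEC dosomethingbig;
    GO
    SET SHOWPLAN_TEXT OFF;
    GO
    ```

    Scan the output for Table Scan and Clustered Index Scan operators.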

    Handsomely answered, but there is no place to tune my stored procedure. The fact of the matter is there is just too much data to work with.  I am not concerned with the amount of time taken. The SQL engine is so poorly written that you hit one simple query doing read-write against any one table and your hardware is gone, with 100% processor utilization.

    I have no problem with my process running 3 or even 14 hours; all I want is for other statements like "select getdate()" to reply back.

    Best Regards

    Dipesh


    Every piece is fully optimized and every table indexed; the data that needs to be processed is just too big. Any other suggestions?

    Regards,

    Dipesh


    Nah! If a query maxes a box, it's usually at fault - unless the server/hardware is totally inadequate for the need.  What sort of box are you running this on?

    Do your queries have in-line functions, functions involving T-SQL, cursors, triggers, views or row-by-row operations? If so, any of those may be the problem.

    Check your perfmon counters for context switching, page life expectancy, and I/O completion times - for a start.

    Are the stats up to date? Do the query plans contain NO index scans, NO clustered index scans, NO spools? How about work tables? Have you checked I/O stats to see which table(s) are getting the most scans and I/O?
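    For the I/O-stats check, something along these lines (table name hypothetical) reports logical reads and scan counts per table on the Messages tab:

    ```sql
    SET STATISTICS IO ON;
    SELECT COUNT(*) FROM BigTable WHERE Status = 'Open';
    SET STATISTICS IO OFF;
    ```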

    You also say "too big" - have you tried smaller batches?  For example, I used to have to make regular deletions of some 70 million rows from a table; the quickest way to do this was 70 x 1 million deletes.
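    A sketch of that batching pattern, assuming SQL Server 2005 or later for DELETE TOP; the table and predicate are made up:

    ```sql
    -- Delete in 1-million-row chunks so each transaction stays small
    -- and other queries get a chance to run between batches.
    DECLARE @rows int;
    SET @rows = 1;
    WHILE @rows > 0
    BEGIN
        DELETE TOP (1000000) FROM dbo.BigTable
        WHERE CreatedDate < '20070101';
        SET @rows = @@ROWCOUNT;
    END;
    ```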

    [font="Comic Sans MS"]The GrumpyOldDBA[/font]
    www.grumpyolddba.co.uk
    http://sqlblogcasts.com/blogs/grumpyolddba/

    You might want to paste your procedure here; some of us can point out spots where it might be optimized. I think you glossed over the proc when you stated there was "no place to tune my stored procedure."


    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE statement and INSERT INTO statements to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!
