Optimization of stored procedures

  • Hi,

    I have a number of stored procedures used for reporting and whenever we run these reports it slows down the entire database.

    Will optimizing the stored procedures improve the performance of the database while running reports?

    Thanks

  • Simplest answer: it depends. Most likely yes, if done correctly. It's hard to tell with just the info provided.

  • Nikhi123 (11/20/2012)


    Hi,

    I have a number of stored procedures used for reporting and whenever we run these reports it slows down the entire database.

    Will optimizing the stored procedures improve the performance of the database while running reports?

    Thanks

    It may do and it may not...

    Have you considered having a dedicated database for reporting purposes?

    _____________________________________________
    "The only true wisdom is in knowing you know nothing"
    "O skol'ko nam otkrytiy chudnyh prevnosit microsofta duh!":-D
    (So many miracle inventions provided by MS to us...)

    How to post your question to get the best and quickest help

  • I would suggest using a third-party tool. It will monitor the performance of SQL Server and the future growth of your database size,

    which in turn can improve the performance of the server and save you precious time.

    You can consider tools from companies such as Lepide, Idera, ME, etc.; among them, Lepide SQL Storage Manager is cost-effective, while the others are more expensive for the same purpose.

    These products can generate reports in real time without affecting server performance, with email notifications as well.

    I think this is best suited to your query. Just try any of these tools with a trial version, which will most likely be free of cost.

  • Nikhi123 (11/20/2012)


    Will optimizing the stored procedures improve the performance of the database while running reports? Thanks

    You can also run the stored procedure directly in Management Studio with the passed parameters, to check whether the procedure itself is actually the culprit or something else is. SQL Profiler and the execution plan will be your tools to judge and analyze its performance.

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)
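    To make the suggestion above concrete, here is a minimal sketch of testing a procedure directly in Management Studio; the procedure and parameter names are hypothetical placeholders for your own report proc:

    ```sql
    -- Show I/O and timing statistics for the call (hypothetical proc/parameters).
    SET STATISTICS IO ON;    -- logical/physical reads per table
    SET STATISTICS TIME ON;  -- CPU and elapsed time

    EXEC dbo.usp_MonthlySalesReport
         @StartDate = '20121101',
         @EndDate   = '20121130';

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;
    ```

    Turning on the actual execution plan (Ctrl+M in Management Studio) before running the call will also show whether the plan is doing large scans where you expected seeks.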

  • If by "slows down the entire database" you mean that running these reports slows down other, non-reporting activity, then my advice would be to do what Eugene was saying: consider a dedicated database for reporting purposes.

    It never hurts to monitor for deadlocks. If you use a lot of stored procedures for parameters in your SSRS reports, you may want to see if you have any parameter sniffing going on; that can certainly grind your system to a halt. Here's a good article on the topic: http://sqlinthewild.co.za/index.php/2007/11/27/parameter-sniffing/
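    As a sketch of one common parameter-sniffing workaround (the article linked above covers the trade-offs), you can ask for a fresh plan per call; the procedure, table, and column names here are made up for illustration:

    ```sql
    -- Recompiling per execution makes the optimizer use the actual parameter
    -- value instead of whatever value was sniffed on the first call.
    CREATE PROCEDURE dbo.usp_SalesByRegion
        @RegionID int
    AS
    BEGIN
        SELECT OrderID, OrderDate, Amount
        FROM dbo.Orders
        WHERE RegionID = @RegionID
        OPTION (RECOMPILE);  -- alternative: OPTION (OPTIMIZE FOR (@RegionID UNKNOWN))
    END;
    ```

    OPTION (RECOMPILE) trades extra compile-time CPU for a plan tailored to each parameter value, which is often an acceptable cost for reporting queries that run relatively infrequently.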

    "I can't stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • Nikhi123 (11/20/2012)


    Hi,

    I have a number of stored procedures used for reporting and whenever we run these reports it slows down the entire database.

    Will optimizing the stored procedures improve the performance of the database while running reports?

    Thanks

    Reporting off of normal OLTP structures almost always affects OLTP performance. Reporting to me means scanning large amounts of data, hash (or maybe merge if you are lucky) joins and doing aggregations on same. This is completely contrary to what OLTP stuff does (i.e. order entry, quick, short, low-row-count type of queries with lots of index seeks, lookups, nested loop joins, etc). Good non-clustered indexing often doesn't help because the optimizer won't seek/lookup when more than roughly 1% of the data is estimated to be hit. And if you create a bunch of covering NC indexes, you can very quickly cause issues with your OLTP workload.

    Having said that, there ARE some things that can be fixed to help: not using table variables/UDFs, breaking very large/complex queries down into 2 or more intermediate temp tables, and read-uncommitted SELECTs (assuming you are aware of and accept the issues that come with that) - or snapshot isolation if not (which has its own issues that you need to manage through).

    If you are on Enterprise Edition of SQL Server you could use Resource Governor to help throttle the queries, and you can always use the MAXDOP query hint to help too, regardless of edition.
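    A hedged sketch of the last two mitigations, with made-up table, column, and database names:

    ```sql
    -- Cap parallelism for one expensive report query so it cannot monopolize
    -- every scheduler; the MAXDOP hint works on any edition, unlike Resource Governor.
    SELECT c.CustomerName, SUM(o.Amount) AS TotalSales
    FROM dbo.Orders AS o
    JOIN dbo.Customers AS c
      ON c.CustomerID = o.CustomerID
    GROUP BY c.CustomerName
    OPTION (MAXDOP 2);

    -- Snapshot-based reads are enabled per database; note the tempdb
    -- version-store overhead mentioned above.
    ALTER DATABASE YourDatabase SET READ_COMMITTED_SNAPSHOT ON;
    ```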

    I very strongly recommend you get a professional on board for a few days to a few weeks to a) give the system a performance review, b) help you improve your current reporting code/structures, and possibly c) build out a proper reporting environment.

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

  • CELKO (11/22/2012)


    Will optimizing the stored procedures improve the performance of the database while running reports?

    Can you be more vague? 🙂

    I can! My vague-king question is here:

    How?

    _____________________________________________
    "The only true wisdom is in knowing you know nothing"
    "O skol'ko nam otkrytiy chudnyh prevnosit microsofta duh!":-D
    (So many miracle inventions provided by MS to us...)

    How to post your question to get the best and quickest help

  • peterdru401 (11/21/2012)


    I would suggest using a third-party tool. It will monitor the performance of SQL Server and the future growth of your database size,

    which in turn can improve the performance of the server and save you precious time.

    You can consider tools from companies such as Lepide, Idera, ME, etc.; among them, Lepide SQL Storage Manager is cost-effective, while the others are more expensive for the same purpose.

    These products can generate reports in real time without affecting server performance, with email notifications as well.

    I think this is best suited to your query. Just try any of these tools with a trial version, which will most likely be free of cost.

    I have to seriously disagree with that. There are usually (I've seen 1 exception) no performance settings or tricks with partitioning disks (there used to be 1, but it's built into all new equipment now) that will come close to the improvements that you can make in code. I've not yet seen a server-crippling job that takes more than 45 minutes to run be reduced to 3 seconds by any such server settings or hardware tricks, and I've seen it be done with code. I've also seen people go out and "improve" their server environment by replacing a small 4-processor box that has only 16GB of RAM with a fire-breathing 16-processor box with 64GB of RAM and NOT get any performance improvement, because the code sucked so badly.

    Adding the right kind of indexes can certainly help, but only if the code is actually capable of using them effectively. Frequently it is not, and must suffer at least a minor redaction to make it so.

    Hardware upgrades and storage configurations are frequently not much more than expensive methods for getting a relatively small (if any) gain in performance. Real performance gains come from fixing bad code.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden (11/22/2012)

    I have to seriously disagree with that. There are usually (I've seen 1 exception) no performance settings or tricks with partitioning disks (there used to be 1, but it's built into all new equipment now) that will come close to the improvements that you can make in code. I've not yet seen a server-crippling job that takes more than 45 minutes to run be reduced to 3 seconds by any such server settings or hardware tricks, and I've seen it be done with code. I've also seen people go out and "improve" their server environment by replacing a small 4-processor box that has only 16GB of RAM with a fire-breathing 16-processor box with 64GB of RAM and NOT get any performance improvement, because the code sucked so badly.

    Adding the right kind of indexes can certainly help, but only if the code is actually capable of using them effectively. Frequently it is not, and must suffer at least a minor redaction to make it so.

    Hardware upgrades and storage configurations are frequently not much more than expensive methods for getting a relatively small (if any) gain in performance. Real performance gains come from fixing bad code.

    +1. I totally agree with Jeff. Most of the time it's the basic database design and architecture where we often fail. A bad choice of indexes doesn't help either; in fact they are a pain in the head (you know). On top of that, inefficient code kills the entire application.

    Looking for a third-party tool should be the last option. However, this is just my personal opinion.

    On your problem, have you tried checking where and which procedures are taking most of the time? Is any blocking occurring during their execution? If yes, which are the blocking processes and which are getting blocked? Just the basics to start with.
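    Those basics can be gathered straight from the DMVs; a sketch, assuming SQL Server 2008 or later and VIEW SERVER STATE permission:

    ```sql
    -- Which procedures have cost the most CPU since their plans were cached?
    SELECT TOP (10)
           OBJECT_NAME(ps.object_id, ps.database_id) AS proc_name,
           ps.execution_count,
           ps.total_worker_time,
           ps.total_elapsed_time,
           ps.total_logical_reads
    FROM sys.dm_exec_procedure_stats AS ps
    ORDER BY ps.total_worker_time DESC;

    -- Who is blocking whom right now?
    SELECT r.session_id, r.blocking_session_id, r.wait_type, r.wait_time, r.command
    FROM sys.dm_exec_requests AS r
    WHERE r.blocking_session_id <> 0;
    ```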
