Help with performance of a MegaQuery

  • Hi,

    I'm looking for a SQL Server DBA to help us with a query.

    We have a query with 8 UNIONs, subqueries in the SELECT list, and subqueries in the WHERE clause.

    This query takes at least 45 seconds and uses 50% of the CPU. It may run 1 to 20 times a day, and if two people run it at the same time, ...BOOM...

    Can someone help me? I need professional help.

    Please email me your name if you are able to help, and how much it will cost.

    I will send the database and the query.

    Thanks

    Pascal Lebel

    plebel@nwaretech.com

  • These forums are designed to provide a certain level of free help.  You may want to post your query and see what kind of responses you get.  Information about your hardware, storage, and SQL Server configuration would also be helpful.  Chances are, if you are joining 8 tables, the SQL code can be optimized to help with your performance, but you could also have system limitations or resource contention that you need to address.  To completely and comprehensively diagnose performance issues, it helps to have a qualified DBA or similar professional on site.  You may gain some great insight into your problems through forums, but you would definitely benefit more from bringing in a SQL Server professional on a contract basis (I am not a contractor, nor do I know any).

    John Rowan

    ======================================================
    ======================================================
    Forum Etiquette: How to post data/code on a forum to get the best help - by Jeff Moden

  • Assuming the query itself is fine, have you tried adding indexes (you could use the Index Tuning Wizard to analyse the query), rebuilding indexes nightly, enabling auto-update statistics, etc.? If you've done all of the above, look at breaking the query up into several smaller queries.
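
    As a rough sketch of the suggestions above (the `dbo.Orders` table and `CustomerID` column are hypothetical placeholders; substitute your own schema):

    ```sql
    -- Hypothetical names: adjust table, column, and index names to your schema.
    -- Create a supporting index for a column used in joins or WHERE clauses:
    CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

    -- Rebuild the table's indexes (e.g. from a nightly job):
    DBCC DBREINDEX ('dbo.Orders');

    -- Refresh statistics so the optimizer has current distribution data:
    UPDATE STATISTICS dbo.Orders;
    ```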



    Phil Nicholas

  • Moving portions of the mega-query into user-defined functions and calling them at the right time can reduce execution time considerably.

    SRA
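
    A minimal sketch of that idea, assuming a hypothetical `dbo.Orders` table: one branch of the mega-query is wrapped in an inline table-valued function so it can be reused and invoked only when needed.

    ```sql
    -- Hypothetical example: wrap one UNION branch of the mega-query
    -- in an inline table-valued function.
    CREATE FUNCTION dbo.fn_ActiveOrders (@CustomerID int)
    RETURNS TABLE
    AS
    RETURN
    (
        SELECT OrderID, OrderDate, Total
        FROM dbo.Orders
        WHERE CustomerID = @CustomerID
          AND Status = 'A'
    );
    GO

    -- Called from the main query only when that branch is needed:
    SELECT o.OrderID, o.Total
    FROM dbo.fn_ActiveOrders(42) AS o;
    ```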

  • It's really going to depend on what the query is doing and what type of data it is going against. If the data is mostly static, consider setting up an OLAP cube.

    Also, depending on the type of data, the update frequency, and what other software accesses the data, look at the amount of normalization in your database. If the query has a WHERE clause hitting a char-type field (or any variation thereof: varchar, nvarchar, etc.), consider moving the text data to a lookup table, having the main table store just an integer ID, and binding your WHERE clause to that. Keep in mind that SQL Server is not really great at string comparisons, so this should help.

    On the other hand, if your query is doing a lot of calculations (from the CPU usage this seems likely), consider de-normalizing your data and storing the pre-calculated numbers in the table. While this does increase the likelihood of data errors, it can speed up queries that have to do complex calculations. Just keep in mind that you need some sort of checking mechanism, e.g. a nightly job that crawls through the tables and checks the consistency of the calculated field.
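
    A sketch of the lookup-table idea, using a hypothetical `Status` column on a hypothetical `dbo.Orders` table:

    ```sql
    -- Hypothetical schema: move a repeated varchar value into a lookup table.
    CREATE TABLE dbo.StatusLookup (
        StatusID   int IDENTITY(1,1) PRIMARY KEY,
        StatusName varchar(50) NOT NULL UNIQUE
    );

    -- The main table stores only the integer ID:
    -- ALTER TABLE dbo.Orders ADD StatusID int
    --     REFERENCES dbo.StatusLookup (StatusID);

    -- The WHERE clause then compares integers instead of strings:
    SELECT o.OrderID
    FROM dbo.Orders AS o
    WHERE o.StatusID = 3;   -- instead of o.Status = 'Shipped'
    ```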

    Further, look at your execution plan: how often are you doing table scans? If you are doing a lot of them, get some indexes built. If you have indexes in place, make sure they aren't corrupt. During a lull in usage, run DBCC CHECKDB('myDB') on your database; you may have some corrupt indexes, which tends to create issues. If any are reported, write down what they are on, any options, etc. Back up your database. Back up your database (no, I'm not stuttering). Drop the index and re-create it. Wash, rinse, repeat for all corrupt objects. Then back up your database again. Keep in mind that DBCC CHECKDB can take a while and will eat resources while it's running; how long and how much depend heavily on the size and complexity of your database.
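
    The check-then-rebuild cycle above might look like this (database, path, and index names are hypothetical):

    ```sql
    -- Hypothetical names throughout; run during a quiet period.
    DBCC CHECKDB ('myDB');

    -- If a corrupt index is reported, back up first...
    BACKUP DATABASE myDB TO DISK = 'D:\Backups\myDB.bak';

    -- ...then drop and re-create the affected index:
    DROP INDEX Orders.IX_Orders_CustomerID;
    CREATE INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

    -- Back up again once the objects are clean:
    BACKUP DATABASE myDB TO DISK = 'D:\Backups\myDB_clean.bak';
    ```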

    An example of the query would be really helpful for pointing you in the right direction.

    one parting thought:

    Don't use a SELECT * ...

    This costs extra time, as the system has to figure out what that * means, and it returns columns you may not need. Always name every column you want returned.
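
    For example (hypothetical table and columns):

    ```sql
    -- Instead of:
    SELECT * FROM dbo.Orders;

    -- Name only the columns you actually need:
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders;
    ```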

  • Hello.

    What is the frequency of data change (inserts, deletes, and updates) in the tables involved in this query? If it is not very frequent, you could create a table to receive this data and set up a job to clean and refill it each hour (or more often). Finally, point your client application at this table.
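
    The hourly job step could be sketched like this (the `dbo.MegaQueryResults` table and the column names are hypothetical):

    ```sql
    -- Hypothetical job step: refresh a reporting table each hour so the
    -- expensive query runs against precomputed data instead.
    TRUNCATE TABLE dbo.MegaQueryResults;

    INSERT INTO dbo.MegaQueryResults (OrderID, CustomerID, Total)
    SELECT o.OrderID, o.CustomerID, o.Total
    FROM dbo.Orders AS o;   -- replace this SELECT with the original mega-query
    ```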

    Hope this helps.




    Fernando Ponte
    factdata.com.br
