Slow SQL Script

  • Hi there,
    I want to select some information from a few tables, but the query is very slow and causes timeouts.
    Please refer to the code below - is there any room to improve the speed? I tried creating indexes on the respective tables with the correct sort order, but it is still very slow.

    Can anyone help me restructure the script to make it faster?

    Note:
    s_dopicklot has 300,000 records
    s_locqty has 250,000 records
    s_dohdr has about 50,000 records
    s_dodetl has about 150,000 records


    SELECT l.part_code, l.lot_no, l.wh_code, l.part_loc, l.shelf_loc,
    isnull(l.qty,0)-ISNULL((select SUM(qty)
                            from s_dopicklot, s_dohdr
                            where s_dohdr.do_no=s_dopicklot.do_no
                            and part_code=l.part_code
                            and lot_no=l.lot_no
                            and wh_code=l.wh_code
                            and loc_code=l.part_loc
                            and shelf_loc=l.shelf_loc
                            and DOM=l.dom
                            and status='OPEN' ),0) as qty, l.dom, l.unit_cost
    from s_locqty l
    where l.qty>0
    and l.part_code in (select b.part_code from s_dohdr a, s_dodetl b
                                 where a.do_no = b.do_no and b.lot_no ='' and a.status='OPEN' and ISNULL(qty_pick,0)< qty_send 
                                 and b.qty_send>0 and (b.type='I' OR b.type='F'))
    and l.part_loc<>'PROD'
    and l.part_loc<>'PROD 1'
    and l.part_loc<>'PROD 2'
    and l.part_loc<>'WIP'

    Together, we can make wonders!
  • Here's a slightly more conventional form of the same query, just to get us started. Please run it, confirm that it generates correct results, then post the Actual Execution Plan as a .sqlplan attachment - thanks.


    SELECT
     l.part_code,
     l.lot_no,
     l.wh_code,
     l.part_loc,
     l.shelf_loc,
     isnull(l.qty,0)-ISNULL(x.PickQty,0) as qty,
     l.dom,
     l.unit_cost 
    FROM s_locqty l
    OUTER APPLY (
     SELECT PickQty = SUM(qty)
     FROM s_dopicklot
     INNER JOIN s_dohdr
      ON s_dohdr.do_no = s_dopicklot.do_no
     WHERE part_code = l.part_code 
      AND lot_no = l.lot_no
      and wh_code = l.wh_code
      and loc_code = l.part_loc
      and shelf_loc = l.shelf_loc
      and DOM = l.dom
      AND [status] = 'OPEN'
    ) x
    WHERE l.qty > 0
     AND l.part_loc NOT IN ('PROD','PROD 1','PROD 2','WIP')
     AND EXISTS (
      SELECT 1
      FROM s_dohdr a
      INNER JOIN s_dodetl b                               
       ON a.do_no = b.do_no 
      WHERE b.lot_no = ''
       AND a.[status] = 'OPEN'
       and ISNULL(qty_pick,0) < b.qty_send                             
       and b.qty_send > 0
       and b.[type] IN ('I','F')
       AND b.part_code = l.part_code
     )

    “Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

    For fast, accurate and documented assistance in answering your questions, please read this article.
    Understanding and using APPLY, (I) and (II) Paul White
    Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden

  • Like Chris says, show us the execution plan so we can give good advice.

    One immediate issue: ISNULL(qty_pick,0) is going to lead to scans of that table. Wrapping the column in a function means indexes on it can't be used effectively. Replace it with an OR condition or something equivalent.
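For example, the predicate in the original query's EXISTS subquery could be rewritten as a sketch like this (assuming qty_pick lives on s_dodetl, as the plan suggests; the rewrite is equivalent here only because the query already filters on qty_send > 0, so a NULL qty_pick - treated as 0 by the original - always passes the test):

```sql
-- Non-SARGable: the ISNULL() wrapped around the column blocks index seeks
-- AND ISNULL(qty_pick, 0) < b.qty_send

-- SARGable rewrite: handle the NULL case with an explicit OR instead
AND (b.qty_pick < b.qty_send OR b.qty_pick IS NULL)
```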

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Hi ChrisM and Grant Fritchey,
    Thank you both for the fast replies and for the restructured script.
    The database is on my office server and I am away for a few days. I will test the script once I get back to the office, and I promise I will share the execution plan here.
    Thanks again to both of you

  • Hi ChrisM,

    My colleague helped me try out your restructured script; however, it takes longer to complete than my original one. My original script took 10s and yours took 36s on the same database.
    My original script is slow when the data server is under heavy usage, which causes timeouts. For this test, only one user was logged in to the server, and both scripts were executed from local SSMS.
    As per your request, I attach both "Estimated Execution Plans" for your kind reference. I am still looking for help to improve the speed.
    Thanks

  • AhTu_SQL2k+ - Wednesday, November 28, 2018 6:43 AM

    Hi ChrisM,

    My colleague helped me try out your restructured script; however, it takes longer to complete than my original one. My original script took 10s and yours took 36s on the same database.
    My original script is slow when the data server is under heavy usage, which causes timeouts. For this test, only one user was logged in to the server, and both scripts were executed from local SSMS.
    As per your request, I attach both "Estimated Execution Plans" for your kind reference. I am still looking for help to improve the speed.
    Thanks

    Thanks - can you post the "Actual" rather than "Estimated" plans, please? They may contain significant information.


  • Data type mismatches, where join columns between two tables are of two different types, are causing you some grief:

    CONVERT_IMPLICIT(nchar(30),
     [TotalERP_winwin].[dbo].[s_dodetl].[part_code] as .[part_code]
     ,0) =
     [TotalERP_winwin].[dbo].[s_locqty].[part_code] as [l].[part_code]

    CONVERT_IMPLICIT(nchar(15),
     [TotalERP_winwin].[dbo].[s_dohdr].[do_no]
     ,0) =
     [TotalERP_winwin].[dbo].[s_dopicklot].[do_no]
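One way to confirm where the widths differ is to compare the declared types of the join columns side by side (a sketch using the standard INFORMATION_SCHEMA views; table and column names are taken from the plan fragments above):

```sql
-- List the declared type and length of each join column on both sides
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE (COLUMN_NAME = 'part_code' AND TABLE_NAME IN ('s_dodetl', 's_locqty'))
   OR (COLUMN_NAME = 'do_no'     AND TABLE_NAME IN ('s_dohdr', 's_dopicklot'))
ORDER BY COLUMN_NAME, TABLE_NAME;
```

Any mismatch in DATA_TYPE or length between the two sides of a join is a candidate for an ALTER TABLE ... ALTER COLUMN fix.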

  • AhTu_SQL2k+ - Wednesday, November 28, 2018 6:43 AM

    Hi ChrisM,

    My colleague helped me try out your restructured script; however, it takes longer to complete than my original one. My original script took 10s and yours took 36s on the same database.
    My original script is slow when the data server is under heavy usage, which causes timeouts. For this test, only one user was logged in to the server, and both scripts were executed from local SSMS.
    As per your request, I attach both "Estimated Execution Plans" for your kind reference. I am still looking for help to improve the speed.
    Thanks

    The estimated number of rows is low, so it shouldn't take long to return records. Can you check whether you have indexes on the columns referenced in the WHERE clause? If you do, you could rebuild the indexes - there may be fragmentation on them. Another option: can you restore the database on a different host, run the same query there, and see how it performs?

  • Also:

    Add [qty] to index [s_dopicklot].[NonClusteredIndex-20181127-173726]

    Add [qty_pick] and [qty_send] to index [s_dodetl].[_dta_index_s_dodetl_18_591549391__K1_K2_K23_K13_K6]

    This could remove the two key lookups from the plan.
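Adding a column to an existing index means recreating it; a sketch for the first suggestion (the key columns below are assumptions, since the index definitions weren't posted - keep whatever keys the index already has):

```sql
-- Recreate the s_dopicklot index with [qty] in the INCLUDE list,
-- replacing the old definition in place. Key columns here are assumed.
CREATE NONCLUSTERED INDEX [NonClusteredIndex-20181127-173726]
ON dbo.s_dopicklot (do_no, part_code)  -- assumed: use the actual key columns
INCLUDE (qty)
WITH (DROP_EXISTING = ON);
```

With the summed column carried in the index leaf pages, the plan can answer the OUTER APPLY from the index alone instead of doing a key lookup per row.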


  • ChrisM@Work - Wednesday, November 28, 2018 7:03 AM

    Data type mismatches, where join columns between two tables are of two different types, are causing you some grief:

    CONVERT_IMPLICIT(nchar(30),
     [TotalERP_winwin].[dbo].[s_dodetl].[part_code] as .[part_code]
     ,0) =
     [TotalERP_winwin].[dbo].[s_locqty].[part_code] as [l].[part_code]

    CONVERT_IMPLICIT(nchar(15),
     [TotalERP_winwin].[dbo].[s_dohdr].[do_no]
     ,0) =
     [TotalERP_winwin].[dbo].[s_dopicklot].[do_no]

    Hi Chris, I will make the necessary changes according to your posts.
    FYI, my colleague tested on the same database I mentioned in my first post. The table row counts are:
    s_dopicklot has 300,000 records
    s_locqty has 250,000 records
    s_dohdr has about 50,000 records
    s_dodetl has about 150,000 records
    As for the suggestion to try a different host: on our production server the execution time is shorter than on our client's data server. Their data server is an HP DL380 with 32GB of RAM - much more than our production server - yet it always times out. Both databases are the same size, as we back up theirs and restore it to our server.
    I will amend the script accordingly and post the Actual Execution Plan here (sorry for posting the wrong plan).
    Thanks again

  • rinu philip - Wednesday, November 28, 2018 7:07 AM

    AhTu_SQL2k+ - Wednesday, November 28, 2018 6:43 AM

    Hi ChrisM,

    My colleague helped me try out your restructured script; however, it takes longer to complete than my original one. My original script took 10s and yours took 36s on the same database.
    My original script is slow when the data server is under heavy usage, which causes timeouts. For this test, only one user was logged in to the server, and both scripts were executed from local SSMS.
    As per your request, I attach both "Estimated Execution Plans" for your kind reference. I am still looking for help to improve the speed.
    Thanks

    The estimated number of rows is low, so it shouldn't take long to return records. Can you check whether you have indexes on the columns referenced in the WHERE clause? If you do, you could rebuild the indexes - there may be fragmentation on them. Another option: can you restore the database on a different host, run the same query there, and see how it performs?

    Hi Rinu, thanks for your input. Yes, we have added indexes to the respective tables. We also used the Database Engine Tuning Advisor to create and apply its recommendations. We have tried both our server and our client's server. Our server has only 8GB of RAM compared to the client's 32GB. The query is slow on both, and both servers return timeouts when the database is shared by more than 5 users.
    Thank you

  • AhTu_SQL2k+ - Wednesday, November 28, 2018 8:29 AM

    ChrisM@Work - Wednesday, November 28, 2018 7:03 AM

    Data type mismatches, where join columns between two tables are of two different types, are causing you some grief:

    CONVERT_IMPLICIT(nchar(30),
     [TotalERP_winwin].[dbo].[s_dodetl].[part_code] as .[part_code]
     ,0) =
     [TotalERP_winwin].[dbo].[s_locqty].[part_code] as [l].[part_code]

    CONVERT_IMPLICIT(nchar(15),
     [TotalERP_winwin].[dbo].[s_dohdr].[do_no]
     ,0) =
     [TotalERP_winwin].[dbo].[s_dopicklot].[do_no]

    Hi Chris, I will make the necessary changes according to your posts.
    FYI, my colleague tested on the same database I mentioned in my first post. The table row counts are:
    s_dopicklot has 300,000 records
    s_locqty has 250,000 records
    s_dohdr has about 50,000 records
    s_dodetl has about 150,000 records
    As for the suggestion to try a different host: on our production server the execution time is shorter than on our client's data server. Their data server is an HP DL380 with 32GB of RAM - much more than our production server - yet it always times out. Both databases are the same size, as we back up theirs and restore it to our server.
    I will amend the script accordingly and post the Actual Execution Plan here (sorry for posting the wrong plan).
    Thanks again

    The number of rows involved suggests that this query (or rather, the DDL) can be tweaked for a significantly shorter execution time.
    From my perspective: eliminate the data type differences, add the columns to the INCLUDE section of the indexes (ask if you're unsure about this) and finally - actual plans.
    I don't think you will need to faff about setting up an alternative host.


  • Hi,
    The reason you have two different execution plans for what is basically the same query is that the server interprets the queries as different, even though they perform the same task.

    On the other hand, both plans have a cardinality problem: in some cases the query estimator believes it will need to execute an operator 1,000 times to get just one result row.

    Even for small tables, this can cause performance degradation.

    Finally, my suggestion is: if you have already checked the index fragmentation percentages and you are sure you have the correct indexes on your tables, it is time to update the statistics for these tables.
    After that, you can compare the actual vs. estimated rows and verify that you no longer have the cardinality problem.

    Once you have solved this, your query will run as fast as it can without modifying your code.

    By the way, aggregate functions like SUM in the SELECT can work against query performance. Perhaps the next step is to tweak your code further.

    Hope this helps.
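The statistics refresh suggested above could look like this for the four tables in the query (WITH FULLSCAN is an assumption - a sampled update is often sufficient on tables this size):

```sql
-- Rebuild statistics so the optimiser's row estimates match reality
UPDATE STATISTICS dbo.s_locqty    WITH FULLSCAN;
UPDATE STATISTICS dbo.s_dopicklot WITH FULLSCAN;
UPDATE STATISTICS dbo.s_dohdr     WITH FULLSCAN;
UPDATE STATISTICS dbo.s_dodetl    WITH FULLSCAN;
```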

  • Hi ChrisM,
    The timeout issue has been resolved. The steps below are the changes highlighted in your previous posts. The Actual Execution Plan is attached as well.
    1) altered s_dopicklot.do_no from char(15) to char(12)
    2) altered s_locqty.part_code from char(60) to char(30)
    3) new index:
     CREATE NONCLUSTERED INDEX [idx_20181130-1]
    ON [dbo].[s_dodetl] ([lot_no],[type],[qty_send])
    INCLUDE ([do_no],[part_code],[qty_pick])
    GO
    The execution time is now greatly reduced, from 36s to 1s.

    Thanks again, ChrisM.
    Regards,
    Wilson

  • AhTu_SQL2k+ - Thursday, November 29, 2018 10:25 PM

    Hi ChrisM,
    The timeout issue has been resolved. The steps below are the changes highlighted in your previous posts. The Actual Execution Plan is attached as well.
    1) altered s_dopicklot.do_no from char(15) to char(12)
    2) altered s_locqty.part_code from char(60) to char(30)
    3) new index:
     CREATE NONCLUSTERED INDEX [idx_20181130-1]
    ON [dbo].[s_dodetl] ([lot_no],[type],[qty_send])
    INCLUDE ([do_no],[part_code],[qty_pick])
    GO
    The execution time is now greatly reduced, from 36s to 1s.

    Thanks again, ChrisM.
    Regards,
    Wilson

    Top work 🙂
     Add column [qty] to index [s_dopicklot].[NonClusteredIndex-20181127-173726] for a further performance lift - it should get rid of the (expensive) key lookup

