Query Performance Issue

  • Hello,

    I have a query below which used to run in less than a few seconds, but now it runs seemingly forever (the longest I watched it run was about 50 minutes).

    Can anyone look at this and tell me if they see something wrong? I honestly don't see anything wrong in it, and I have only made a simple change which shouldn't affect the performance in such a huge way.

    Please see below:

    SELECT a.EdpNo,
           a.ItemNo,
           b.fintProductId,
           c.fintSizeId,
           d.fintWidthId,
           e.fintColorId,
           a.ItemStatus,
           f.CountryOrigin,
           '0'
    FROM [1-Database-1].[dbo].[vwRegularItems] a
    LEFT JOIN [2-Database-2].[dbo].[tblProduct] b ON a.StyleCd = b.fstrShortSku
    --LEFT JOIN [2-Database-2].[dbo].[ltblSize] c ON a.Level1DescConv = c.fstrSizeValue
    LEFT JOIN [2-Database-2].[dbo].[ltblSize] c ON a.Level1Desc = c.fstrSizeValue
    LEFT JOIN [2-Database-2].[dbo].[ltblWidth] d ON a.Level2Desc = d.fstrWidthValue
    LEFT JOIN [2-Database-2].[dbo].[ltblcolor] e ON a.Level3Desc = e.fstrColorName
    LEFT JOIN [1-Database-1].[dbo].[vwCountryOfOrigin] f ON a.EdpNo = f.Edpno
    WHERE a.EdpNo IN
    (
        SELECT ri.EdpNo
        FROM [1-Database-1].[dbo].[vwRegularItems] ri
        LEFT JOIN [2-Database-2].[dbo].[tblproductsizewidthcolor] pswc
            ON ri.EdpNo = pswc.fintEcometryId
        WHERE pswc.fstrLongSku IS NULL
            AND pswc.fintEcometryId IS NULL
            AND Level1Type = 'SZ'
            AND Level2Type = 'WD'
            AND Level3Type = 'CO'
    )

    The change I made was adding another LEFT JOIN and commenting one out. Any input would be appreciated.

    Thanks,

    S


  • Please see the links in my signature below



    Clear Sky SQL
    My Blog

  • Can you post the execution plans?

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Please post table definitions, index definitions and execution plan, as per http://www.sqlservercentral.com/articles/SQLServerCentral/66909/

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
    The execution plan is attached. Thanks for the replies; any help is appreciated. The query ran for 40 minutes before it gave me the execution plan and 622 results. The table it reads (tblRegularItems) has 600K+ rows in total, and on top of it I've created a view called vwRegularItems that shows only the "active" records via a flag I added. The vwRegularItems view returns about 126K rows from tblRegularItems.

    DDL for view below:

    CREATE VIEW [dbo].[vwRegularItems]
    AS
    SELECT Item_Id,
           MinorCatCd,
           SubMinorCatCd,
           StyleCd,
           ItemNo,
           EDPNo,
           ItemStatus,
           StyleDesc,
           OfferPrice,
           ItemDesc,
           Price,
           Level1Type,
           Level1Desc,
           Level1DescCONV,
           Level2Type,
           Level2Desc,
           Level3Type,
           Level3Desc,
           InvQty,
           Date,
           IsArchived,
           ArchiveDate,
           DateCreated
    FROM [BMBStaging].[dbo].[tblRegularItems]
    WHERE isArchived = 0

    DDL for index on tblRegularItems below:

    ALTER TABLE [dbo].[tblRegularItems] ADD CONSTRAINT [IX_tblRegularItems_EdpNo] UNIQUE NONCLUSTERED
    (
        [EDPNo] ASC,
        [ArchiveDate] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

    Thanks again!

    -s


    You're getting a scan on the tblProductSizeWidthColor table (from the name alone, I think you've violated some design principles), which is reading, if I counted the digits correctly, 332 million rows that get filtered down to 600 for the result set. I suspect that is one of the largest problems you're dealing with.

    You're also getting a scan on tblRegularItems.

    But the biggest problem, I think, is statistics. The estimated number of rows for tblRegularItems is 1, but the actual is 127,000+. That's a huge disparity. There's also an index seek against tblRegularItems that is showing an estimated 19 rows, but an actual 2.5 million. I'd suggest updating your stats on all the tables, and I'd suggest using FULL SCAN to do it.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Grant,

    Thank you for the information. Can you possibly point me in the right direction for updating stats on the tables, as well as to more information on "full scans"? I will look this up on Google myself as well.

    Also, can I update the statistics on these tables in production during regular business hours? How much effect would updating the statistics have on the database, load-wise?

    Thanks,

    S


  • Slick84 (10/14/2009)


    Grant,

    Thank you for the information. Can you possibly point me in the right direction for updating stats on the tables, as well as to more information on "full scans"? I will look this up on Google myself as well.

    Also, can I update the statistics on these tables in production during regular business hours? How much effect would updating the statistics have on the database, load-wise?

    Thanks,

    S

    It could cause issues running it during the day. Normally you can update stats with sp_updatestats, but I think you might need a complete scan done on these based on the wide disparity between the values. To use UPDATE STATISTICS, you just have to run the following for each table:

    UPDATE STATISTICS schema.table WITH FULLSCAN
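
    In your case that would be something along the lines of the following (run each statement in the database that actually holds the table; I'm assuming everything is in the dbo schema, as your DDL suggests):

    UPDATE STATISTICS dbo.tblRegularItems WITH FULLSCAN;
    UPDATE STATISTICS dbo.tblProduct WITH FULLSCAN;
    UPDATE STATISTICS dbo.tblProductSizeWidthColor WITH FULLSCAN;
    UPDATE STATISTICS dbo.ltblSize WITH FULLSCAN;
    UPDATE STATISTICS dbo.ltblWidth WITH FULLSCAN;
    UPDATE STATISTICS dbo.ltblcolor WITH FULLSCAN;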

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    So, another question (and please let me know if I'm asking too much, or just point me in the right direction), but...

    Would you be able to explain what the wide disparity between the two numbers signifies?


  • Slick84 (10/14/2009)


    Would you be able to explain what the wide disparity between the two numbers signifies?

    The optimiser estimates one row and hence generates a plan that's optimal for a very small number of rows. 125000 is not a small number of rows and hence the plan is very, very sub-optimal.

    You have table scans everywhere! And as the inner sources for nested loop joins!!!!! No wonder this is slow. In fact, I'm surprised it only takes 40 min.
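
    As an aside, if you want to see how stale the statistics actually are before updating them, something like this (run in the database that owns the table) will show the last update date for each statistics object:

    SELECT s.name,
           STATS_DATE(s.object_id, s.stats_id) AS last_updated
    FROM sys.stats AS s
    WHERE s.object_id = OBJECT_ID('dbo.tblRegularItems');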

    Quick suggestions (as in, I looked over it quickly, not as in they're quick for you to implement)

    Index on tblRegularItems (IsArchived, Level1Type, Level2Type, Level3Type, EDPNo)

    Index on tblProductSizeWidthColour (fstrLongSKU, fintEcometryID) (btw, what does this table store?)

    Index on tblProduct (fstrShortSKU)

    Index on ltblSize (fstrSizeValue)

    Index on ltblWidth (fstrWidthValue)

    Index on ltblColour (fstrColourName)

    Once you've added all those, run the query again and post the revised exec plan.
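
    Rough CREATE statements for those, if it helps (the index names are just placeholders - use whatever fits your naming standard, and double-check the column names against your actual tables):

    CREATE NONCLUSTERED INDEX IX_tblRegularItems_Archived_Levels
        ON dbo.tblRegularItems (IsArchived, Level1Type, Level2Type, Level3Type, EDPNo);

    CREATE NONCLUSTERED INDEX IX_tblProductSizeWidthColor_LongSku
        ON dbo.tblProductSizeWidthColor (fstrLongSku, fintEcometryId);

    CREATE NONCLUSTERED INDEX IX_tblProduct_ShortSku
        ON dbo.tblProduct (fstrShortSku);

    CREATE NONCLUSTERED INDEX IX_ltblSize_SizeValue
        ON dbo.ltblSize (fstrSizeValue);

    CREATE NONCLUSTERED INDEX IX_ltblWidth_WidthValue
        ON dbo.ltblWidth (fstrWidthValue);

    CREATE NONCLUSTERED INDEX IX_ltblColor_ColorName
        ON dbo.ltblcolor (fstrColorName);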

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Yeah, what Gail said.

    Seriously though, she's right. There was missing index information in the execution plan, it was moving scads of data around to retrieve 600 rows, and you're getting parallel execution on that query too. That may or may not be an issue. You'll have to play with it to see.

    And no, you're not asking too many questions.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    Wow, thanks for the detailed information! Let me implement this, run the procedure again, and upload the exec plan. I'm creating non-unique, non-clustered indexes on those, right?

    By the way, as a side question: if I "cancel" my insert statement and close the query window, the transaction automatically rolls back, correct?

    The tblProductSizeWidthColor table basically contains the color, size and width attributes for the products that we sell on the website. The size, width and color IDs in this table reference the lookup tables ltblcolor, ltblsize and ltblwidth.
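
    In rough, simplified form it looks something like this (from memory, so the exact column names and types may be a bit off):

    CREATE TABLE dbo.tblProductSizeWidthColor
    (
        fintProductId  INT NOT NULL,     -- the product (tblProduct)
        fintSizeId     INT NULL,         -- lookup into ltblsize
        fintWidthId    INT NULL,         -- lookup into ltblwidth
        fintColorId    INT NULL,         -- lookup into ltblcolor
        fstrLongSku    VARCHAR(50) NULL, -- full SKU for the size/width/color combination
        fintEcometryId INT NULL          -- maps back to EdpNo in tblRegularItems
    );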

    Hope that makes sense.

    Thanks,

    S


  • WOWWWWWWWWWWWWW! Thanks so much Gilamonster. It's unbelievable. I got my results back in 5 seconds....amazing...execution plan attached. :-D:-D:-D:-D


  • Better, but not fantastic.

    There's still a massive inaccuracy in the row estimation coming from tblRegularItems, and I don't understand why. Try an update statistics with full scan on that table, even though it shouldn't be necessary.

    Did you add the recommended index on tblProduct? There's still an index scan there.

    If you did, widen it (fstrShortSKU) INCLUDE (fintProductID)

    Widen the index that you added to tblRegularItems, add EDPNo as an INCLUDE column.
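
    In other words, something like this (index names are just placeholders, and the key columns on tblRegularItems should match whatever you actually created; either drop the narrower indexes first or rebuild them with DROP_EXISTING = ON):

    CREATE NONCLUSTERED INDEX IX_tblProduct_ShortSku
        ON dbo.tblProduct (fstrShortSKU)
        INCLUDE (fintProductID);

    CREATE NONCLUSTERED INDEX IX_tblRegularItems_Archived_Levels
        ON dbo.tblRegularItems (IsArchived, Level1Type, Level2Type, Level3Type)
        INCLUDE (EDPNo);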

    Can you show us the definition of the vwRegularItems view?

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Hi Gail,

    Here is the DDL for the view vwRegularItems, which I previously posted as well. I haven't changed anything on it from before.

    CREATE VIEW [dbo].[vwRegularItems]
    AS
    SELECT Item_Id,
           MinorCatCd,
           SubMinorCatCd,
           StyleCd,
           ItemNo,
           EDPNo,
           ItemStatus,
           StyleDesc,
           OfferPrice,
           ItemDesc,
           Price,
           Level1Type,
           Level1Desc,
           Level1DescCONV,
           Level2Type,
           Level2Desc,
           Level3Type,
           Level3Desc,
           InvQty,
           Date,
           IsArchived,
           ArchiveDate,
           DateCreated
    FROM [BMBStaging].[dbo].[tblRegularItems]
    WHERE isArchived = 0

    As for your recommendation: I did not add the tblProduct one because, when I checked the table, a non-clustered index was already there. Below is the script for the index on tblProduct.

    CREATE NONCLUSTERED INDEX [IX_tblProduct_1] ON [dbo].[tblProduct]
    (
        [fintGenderTypeId] ASC,
        [fstrShortSKU] ASC,
        [fintSiteContentId_Published] ASC,
        [fintStyleId] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]

    Also below is my script for the index I created on tblRegularItems.

    CREATE NONCLUSTERED INDEX [IX_tblRegularItems_Levels] ON [dbo].[tblRegularItems]
    (
        [Level1Type] ASC,
        [Level2Type] ASC,
        [Level3Type] ASC,
        [IsArchived] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

    I'll add the INCLUDED columns in there and show you the execution plan. Thanks for all your help.
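
    I'm thinking something along these lines for tblRegularItems, rebuilding the existing index in place with DROP_EXISTING so I don't have to drop it first:

    CREATE NONCLUSTERED INDEX [IX_tblRegularItems_Levels] ON [dbo].[tblRegularItems]
    (
        [Level1Type] ASC,
        [Level2Type] ASC,
        [Level3Type] ASC,
        [IsArchived] ASC
    )
    INCLUDE ([EDPNo])
    WITH (DROP_EXISTING = ON) ON [PRIMARY]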

    Regards,

    S

