Performance Issue with a Query...

  • Hi everyone,

    I have written a SQL query that has a performance problem.

    The table is partitioned on the EVENTDATE column and has about 100 million (10 crore) records in it.

    I am posting the block of the query that slows things down. In short, what I do in that block is:

    I have a temporary table where I collect an intermediate result so that I can use it for further data fetches.

    I process 15000 records from that temporary table at a time, as this performs better than processing the whole temp table in one shot.

    When the temporary table has 100000 records, the run time is about 6 minutes, and I have to reduce that.

    declare @ROWPROCESSED int
    select @ROWPROCESSED = 0

    while (@ROWPROCESSED <= (select MAX(ROWNO) from #temp))
    begin
        -- Pull the next batch of up to 15000 #temp rows and collect the matching
        -- ORDERSTOCKFACTS rows into #temp2.
        -- (An earlier version joined #temp directly on MILLORDNO/MACHCHAINNO/ORDERLINENO
        -- instead of using EXISTS.)
        insert into #temp2 (QUANSTOCKNKG, QUANCONDNKG, EVENTDATE)
        select O.QUANSTOCKNKG, O.QUANCONDNKG, O.EVENTDATE
        from ORDERSTOCKFACTS O
        where exists (select 1
                      from #temp T
                      where O.MILLORDNO = T.MILLORDNO
                        and O.MACHCHAINNO = T.MACHCHAINNO
                        and O.ORDERLINENO = T.ORDERLINENO
                        and T.ROWNO > @ROWPROCESSED
                        and T.ROWNO < @ROWPROCESSED + 15001)
          and O.EVENTDATE >= convert(smalldatetime, '01-12-2011', 105)
          and O.EVENTDATE <= convert(smalldatetime, '01-5-2015', 105)
          and O.QUALITYCODE = 'P'
          and O.INVSTATUS = 'UN'
          and O.READINESS = 'F'

        -- Move the batch window forward
        select @ROWPROCESSED = @ROWPROCESSED + 15000
    end

    The SELECT against ORDERSTOCKFACTS is the part that actually takes the time. Can anybody suggest what I can do to get rid of this performance issue?

    I have attached the table definition and indexes; please also see the attachment for the execution plan.

    Please suggest anything you feel could improve the performance.

    Many thanks.

  • For testing purposes, comment out the INSERT part of the batch. Run this test batch a couple of times to obtain a performance baseline.
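
    A simple way to time that baseline run, as a sketch, is to turn on the timing and I/O counters around the batch (these are standard T-SQL options; only their placement around your batch is assumed):

    SET STATISTICS TIME ON;
    SET STATISTICS IO ON;

    -- run the batch here, with the INSERT INTO #temp2 line commented out,
    -- so only the SELECT against ORDERSTOCKFACTS is measured

    SET STATISTICS TIME OFF;
    SET STATISTICS IO OFF;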

    The difference between actual and estimated rows for the index seek (node 9) on table ORDERSTOCKFACTS (index ORDERSTOCKFACTS_I6) suggests that statistics may be out of date. Run

    UPDATE STATISTICS ORDERSTOCKFACTS WITH FULLSCAN

    then run the test batch a couple of times.

    Next, the plan shows a cost of 92% is attributable to a key lookup. Index ORDERSTOCKFACTS_I6 doesn't contain columns QUANSTOCKNKG and QUANCONDNKG, so SQL Server has to use the cluster key in index ORDERSTOCKFACTS_I6 to look these up in the table. Adding these two columns to index ORDERSTOCKFACTS_I6 as INCLUDE columns would eliminate the key lookup at the expense of slightly slower index seeks. With only two columns it's likely to be well worth it. If you're not sure how to add these two columns to the index, post the CREATE INDEX script for ORDERSTOCKFACTS_I6.
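
    As a rough sketch of what that would look like (the real key columns are in your attached index definition, so the key column list below is only an assumption based on the EXISTS predicate):

    CREATE NONCLUSTERED INDEX ORDERSTOCKFACTS_I6
        ON ORDERSTOCKFACTS (MILLORDNO, MACHCHAINNO, ORDERLINENO)   -- keep your existing key columns here
        INCLUDE (QUANSTOCKNKG, QUANCONDNKG)
        WITH (DROP_EXISTING = ON);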

    “Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

    For fast, accurate and documented assistance in answering your questions, please read this article.
    Understanding and using APPLY, (I) and (II) Paul White
    Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden

  • Looking it over, the only hope for tuning is, as was already pointed out, to eliminate the key lookup. Otherwise, we're just processing a lot of data and that's going to take more time. Are you returning 100K rows to the client? Is that an export of data for other purposes or is the assumption that people are consuming 100K rows? If the latter, that never happens. You could push back on the requirements if so.

    One other point, this is an estimated plan, so it's hard to know if we're seeing accurate statistics without being able to compare to an actual plan. I'd suggest capturing the actual plan and doing that comparison to see if your statistics maintenance is adequate.
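
    If you'd rather capture the actual plan from the script than from SSMS (where you can simply toggle "Include Actual Execution Plan", Ctrl+M, before running), a sketch:

    SET STATISTICS XML ON;   -- returns the actual execution plan as XML alongside the results

    -- run the batch here

    SET STATISTICS XML OFF;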

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    Thanks for the reply!

    Earlier I was collecting data in a temporary table so that I could use it as input to the query on the main table. Now I have removed that part, created another nonclustered index on the main table on the SALESCOMPANY column (as suggested by a senior person), and I am querying the main table directly.
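
    Roughly, the new index is along these lines (a sketch only; I am assuming the main table is ORDERSTOCKFACTS, and the index name here is illustrative):

    CREATE NONCLUSTERED INDEX ORDERSTOCKFACTS_SALESCOMPANY   -- illustrative name
        ON ORDERSTOCKFACTS (SALESCOMPANY);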

    The run time is now cut almost in half.

    I hope that is acceptable 🙂

  • mote.ajit2 (1/27/2014)


    Thanks for the reply!

    Earlier I was collecting data in a temporary table so that I could use it as input to the query on the main table. Now I have removed that part, created another nonclustered index on the main table on the SALESCOMPANY column (as suggested by a senior person), and I am querying the main table directly.

    The run time is now cut almost in half.

    I hope that is acceptable 🙂

    That's certainly an improvement. Personally I think there's scope for far more improvement. If you wish to explore this possibility, then post the actual execution plan (as a .sqlplan file) of the new query.

    “Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

    For fast, accurate and documented assistance in answering your questions, please read this article.
    Understanding and using APPLY, (I) and (II) Paul White
    Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden
