Query optimization

  • Hello,

    I have two tables

    #InnerTable, which has around 7,000 rows

    DCl.CompanyToken, which has around 20,000,000 rows

    When I fire the query below, it takes around 30 minutes to execute and returns 787 rows.

    Please help me optimize it.

    SELECT
        a.SearchTokenNumber,
        a.SearchRowID,
        CT.CompanyID,
        CT.CompanyRowID,
        a.SearchField,
        a.MaxPossibleScore,
        a.AssignedWeight,
        MAX(a.Weight) AS Weight
    FROM #InnerTable a
    INNER JOIN DCl.CompanyToken CT
        ON a.CompanyTokenNumber = CT.CompanyTokenNumber
    WHERE CT.Token = a.Token
    GROUP BY
        a.SearchTokenNumber,
        a.SearchRowID,
        CT.CompanyID,
        CT.CompanyRowID,
        a.SearchField,
        a.MaxPossibleScore,
        a.AssignedWeight

    Note: the following index is applied on #InnerTable:

    CREATE INDEX IX_LocalIdentifierID ON #InnerTable (CompanyTokenNumber)

    /****** Object: Index [ix_CL_DCL_CompanyToken] Script Date: 12/12/2012 13:04:19 ******/

    CREATE CLUSTERED INDEX [ix_CL_DCL_CompanyToken] ON [DCL].[CompanyToken]

    (

    [CompanyTokenNumber] ASC,

    [Token] ASC,

    [CompanyID] ASC

    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

    This index is applied on DCl.CompanyToken.
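    One thing worth trying (a sketch only, using the column names as posted): the join matches on both CompanyTokenNumber and Token, the first two keys of the clustered index on DCL.CompanyToken, but the temp-table index covers only CompanyTokenNumber. Indexing both columns on #InnerTable, and including the other referenced columns, gives the optimizer a covering index for the whole query:

    ```sql
    -- Sketch: extend the posted temp-table index to cover both join columns,
    -- so each probe into DCL.CompanyToken can seek on
    -- (CompanyTokenNumber, Token), the leading keys of its clustered index.
    CREATE INDEX IX_InnerTable_Join
        ON #InnerTable (CompanyTokenNumber, Token)
        INCLUDE (SearchTokenNumber, SearchRowID, SearchField,
                 MaxPossibleScore, AssignedWeight, Weight);
    ```

    Whether this changes the plan depends on the actual row counts and the existing plan shape, so check the execution plan before and after.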

  • Can you post the actual query plan?

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Please post table definitions, index definitions and execution plan, as per http://www.sqlservercentral.com/articles/SQLServerCentral/66909/

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Please find attachment

    Thanks,

    Aadhar

    How many rows does it need to access in the large table: most of them, or only a few?

    Are the group-by columns independent? Can some be omitted from the GROUP BY and a MAX() used in the result set instead? That is, if you omit any column from the GROUP BY clause, will you always get fewer groups?


    Cursors never.
    DTS - only when needed and never to control.

    The large table has 20,000,000 rows and all the group-by columns are required.

    Omitting a single column from the GROUP BY raises an error.

    The large table has that many rows, but are they all involved in the join to the temp table, or does the join filter the rows accessed, and if so by how much? If you only require a few thousand rows from the large table, it could help to split the query into two.

    Removing a column from the GROUP BY will give an error unless you aggregate that column in the SELECT clause, e.g. with MAX().

    So if you remove a.AssignedWeight from the GROUP BY clause and change the entry in the SELECT clause to MAX(a.AssignedWeight), do you get fewer rows returned, or is a.AssignedWeight dependent on the other values?
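    In other words, something like this sketch against the posted query (whether it returns the same rows depends entirely on your data; if AssignedWeight and MaxPossibleScore are determined by the remaining grouping columns, the result is unchanged but the GROUP BY key is smaller):

    ```sql
    -- Sketch: fold the possibly dependent columns into MAX() and shrink the
    -- GROUP BY key, which can reduce the sort/hash aggregate cost.
    SELECT  a.SearchTokenNumber,
            a.SearchRowID,
            CT.CompanyID,
            CT.CompanyRowID,
            a.SearchField,
            MAX(a.MaxPossibleScore) AS MaxPossibleScore,
            MAX(a.AssignedWeight)   AS AssignedWeight,
            MAX(a.Weight)           AS Weight
    FROM #InnerTable a
    INNER JOIN DCl.CompanyToken CT
            ON  a.CompanyTokenNumber = CT.CompanyTokenNumber
            AND CT.Token = a.Token
    GROUP BY a.SearchTokenNumber, a.SearchRowID,
             CT.CompanyID, CT.CompanyRowID, a.SearchField;
    ```

    Compare the row count of this against the original 787 rows to answer the dependency question.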


    Cursors never.
    DTS - only when needed and never to control.

    You are suggesting the same thing: whether I split the query and group the sub-data directly, or store partial data in a temp table and join it further, it takes the same time.

    I tried, but couldn't get it to work.

  • Do you want to answer the questions I asked?


    Cursors never.
    DTS - only when needed and never to control.

    I tried, but it produces fewer rows.

    Have you updated statistics? You have some serious underestimations on the large table:

    Clustered Index Seek

    [DS_GreensDK].[DCL].[CompanyToken].[ix_CL_DCL_CompanyToken]

    Estimated Number Of Rows : 309

    Actual Number Of Rows : 13,807,291
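    If stale statistics are the cause, refreshing them is cheap to try (a sketch; a FULLSCAN on a 20,000,000-row table can take a while, so a sampled update may be the pragmatic first step):

    ```sql
    -- Refresh statistics on the large table so the optimizer's row estimates
    -- (309 estimated vs. ~13.8M actual above) line up with reality.
    UPDATE STATISTICS DCL.CompanyToken WITH FULLSCAN;

    -- Cheaper alternative: sample a fraction of the table.
    UPDATE STATISTICS DCL.CompanyToken WITH SAMPLE 10 PERCENT;
    ```

    With estimates that far off, the optimizer may be choosing a plan (e.g. a nested loops join) that would never be picked if it knew millions of rows were coming back.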

    The SQL Guy @ blogspot

    @SeanPearceSQL

    About Me

    No, I haven't updated stats.

    Maybe I'll have to check that.
