Improving Performance

  • I have this piece of code in my query, where a while loop is involved.

    It has room for performance improvement:

    declare @i int=1

    declare @sitename varchar(1000)

    WHILE (@i<=(select COUNT(*) from #Content))

    BEGIN

    SET @sitename=(select sitename from #Content where id=@i)

    SELECT @count = count(*)

    FROM #Content

    SET @deep = @deep + 1

    INSERT INTO #Content

    SELECT @sitename,

    Cnr_2.containerName

    ,Cnr_2.containerCode

    ,Cnr.containerName

    ,Cnr.containerCode

    ,''

    ,''

    ,''

    ,0x0

    ,@deep

    FROM Containers AS Cnr

    INNER JOIN ContainersContents AS CnrCn WITH (NOLOCK) ON CnrCn.contentSqlId = Cnr.containerSqlId

    AND CnrCn.contentIncId = Cnr.containerIncId

    AND CnrCn.isDeleted = 0x0

    AND CnrCn.contentMetaTableSqlId = 2

    AND CnrCn.contentMetaTableIncId = 413 --container

    INNER JOIN Containers AS Cnr_2 WITH (NOLOCK) ON Cnr_2.containerSqlId = CnrCn.containerSqlId

    AND Cnr_2.containerIncId = CnrCn.containerIncId

    AND Cnr_2.isDeleted = 0x0

    INNER JOIN #Content C ON C.contentCode = Cnr_2.containerCode

    WHERE Cnr.isDeleted = 0x0

    AND ((SELECT count(*) FROM #Content AS Con WHERE Con.contentCode = Cnr.containerCode) = 0) --container is not deleted and not already in

    and C.id=@i

    SET @i=@i+1

    END

    I figured it should be changed like this to improve the performance:

    declare @i int=1

    declare @sitename varchar(1000)

    declare @cntr int

    set @cntr=(select COUNT(*) from #Content)

    WHILE (@i<=@cntr)

    BEGIN

    SET @sitename=(select sitename from #Content where id=@i)

    SELECT @count = count(*)

    FROM #Content

    SET @deep = @deep + 1

    ..............................

    ..............................

    But the results are coming out wrong.

    Is there any other way of rewriting this particular query?

  • Apart from the loop, this query will most likely be expensive due to the subquery in the WHERE clause:

    INSERT INTO #Content

    SELECT @sitename,

    Cnr_2.containerName

    ,Cnr_2.containerCode

    ,Cnr.containerName

    ,Cnr.containerCode

    ,''

    ,''

    ,''

    ,0x0

    ,@deep

    FROM Containers AS Cnr

    INNER JOIN ContainersContents AS CnrCn WITH (NOLOCK) ON CnrCn.contentSqlId = Cnr.containerSqlId

    AND CnrCn.contentIncId = Cnr.containerIncId

    AND CnrCn.isDeleted = 0x0

    AND CnrCn.contentMetaTableSqlId = 2

    AND CnrCn.contentMetaTableIncId = 413 --container

    INNER JOIN Containers AS Cnr_2 WITH (NOLOCK) ON Cnr_2.containerSqlId = CnrCn.containerSqlId

    AND Cnr_2.containerIncId = CnrCn.containerIncId

    AND Cnr_2.isDeleted = 0x0

    INNER JOIN #Content C ON C.contentCode = Cnr_2.containerCode

    WHERE Cnr.isDeleted = 0x0

    AND ((SELECT count(*) FROM #Content AS Con WHERE Con.contentCode = Cnr.containerCode) = 0) --container is not deleted and not already in

    and C.id=@i

    You may get better performance using NOT EXISTS in the query:

    INSERT INTO #Content

    SELECT @sitename,

    Cnr_2.containerName

    ,Cnr_2.containerCode

    ,Cnr.containerName

    ,Cnr.containerCode

    ,''

    ,''

    ,''

    ,0x0

    ,@deep

    FROM Containers AS Cnr

    INNER JOIN ContainersContents AS CnrCn WITH (NOLOCK) ON CnrCn.contentSqlId = Cnr.containerSqlId

    AND CnrCn.contentIncId = Cnr.containerIncId

    AND CnrCn.isDeleted = 0x0

    AND CnrCn.contentMetaTableSqlId = 2

    AND CnrCn.contentMetaTableIncId = 413 --container

    INNER JOIN Containers AS Cnr_2 WITH (NOLOCK) ON Cnr_2.containerSqlId = CnrCn.containerSqlId

    AND Cnr_2.containerIncId = CnrCn.containerIncId

    AND Cnr_2.isDeleted = 0x0

    INNER JOIN #Content C ON C.contentCode = Cnr_2.containerCode

    WHERE Cnr.isDeleted = 0x0

    AND NOT EXISTS (SELECT * FROM #Content AS Con WHERE Con.contentCode = Cnr.containerCode) --container is not deleted and not already in

    and C.id=@i

    I would also question why you are performing RBAR (row-by-agonizing-row) processing rather than a single set-based operation to do the insert, as this will be extremely costly and have a big impact on the log if the database is in full recovery mode.
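    The loop above walks the container hierarchy one level per iteration. Purely as a sketch - the column names are taken from the posted code, the join directions assume Cnr_2 is the parent and Cnr the child as in the original, and nothing here has been tested against the real schema - the whole traversal could be a single recursive CTE. Note it lacks the NOT EXISTS de-duplication of the original, so a cycle guard such as MAXRECURSION matters:

```sql
-- Sketch: replace the WHILE loop with one recursive CTE.
-- Assumes the anchor rows are already loaded into #Content,
-- as they are before the loop in the original code.
;WITH Hierarchy AS
(
    -- Anchor: containers already in #Content (top of the hierarchy)
    SELECT C.SiteName, C.contentCode, 0 AS deep
    FROM #Content AS C

    UNION ALL

    -- Recursive step: each container's children, one level deeper
    SELECT H.SiteName, Cnr.containerCode, H.deep + 1
    FROM Hierarchy AS H
    INNER JOIN Containers AS Cnr_2          -- parent container
        ON Cnr_2.containerCode = H.contentCode
       AND Cnr_2.isDeleted = 0x0
    INNER JOIN ContainersContents AS CnrCn  -- parent-to-child link
        ON CnrCn.containerSqlId = Cnr_2.containerSqlId
       AND CnrCn.containerIncId = Cnr_2.containerIncId
       AND CnrCn.isDeleted = 0x0
       AND CnrCn.contentMetaTableSqlId = 2
       AND CnrCn.contentMetaTableIncId = 413 --container
    INNER JOIN Containers AS Cnr            -- child container
        ON Cnr.containerSqlId = CnrCn.contentSqlId
       AND Cnr.containerIncId = CnrCn.contentIncId
       AND Cnr.isDeleted = 0x0
)
SELECT SiteName, contentCode, deep
FROM Hierarchy
OPTION (MAXRECURSION 100); -- guard against cycles in the data
```

    If the data can contain cycles, or the same container can appear under two parents, the result will differ from the NOT EXISTS version, so this would need testing against the real data first.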

    _________________________________________________________________________
    SSC Guide to Posting and Best Practices

  • As per my understanding, the site name comes from the #Content table only, so we can avoid the loop by writing a single SELECT statement. To get the @deep value, try ROW_NUMBER() with PARTITION BY C.id.

    Though I am not completely clear on the actual scenario, I am assuming you could try the query below.

    SELECT

    sitename,

    Cnr_2.containerName

    ,Cnr_2.containerCode

    ,Cnr.containerName

    ,Cnr.containerCode

    ,''

    ,''

    ,''

    ,0x0

    ,row_number() over(partition by C.id order by (select null)) -- @deep

    FROM Containers AS Cnr

    INNER JOIN ContainersContents AS CnrCn WITH (NOLOCK) ON CnrCn.contentSqlId = Cnr.containerSqlId

    AND CnrCn.contentIncId = Cnr.containerIncId

    AND CnrCn.isDeleted = 0x0

    AND CnrCn.contentMetaTableSqlId = 2

    AND CnrCn.contentMetaTableIncId = 413 --container

    INNER JOIN Containers AS Cnr_2 WITH (NOLOCK) ON Cnr_2.containerSqlId = CnrCn.containerSqlId

    AND Cnr_2.containerIncId = CnrCn.containerIncId

    AND Cnr_2.isDeleted = 0x0

    INNER JOIN #Content C ON C.contentCode = Cnr_2.containerCode

    WHERE Cnr.isDeleted = 0x0

    AND ((SELECT count(*) FROM #Content AS Con WHERE Con.contentCode = Cnr.containerCode) = 0)

  • I am posting my modified query.

    I have been trying with this query for some time. Maybe someone can point me in the right direction.

    CREATE TABLE #Content(id int identity(1,1),

    SiteName nvarchar(1000),

    containerName nvarchar(1000),

    containerCode nvarchar(1000),

    contentName nvarchar(1000),

    contentCode nvarchar(1000),

    StudyCode nvarchar(100),

    sampleType nvarchar(MAX),

    sampleDescription nvarchar(MAX),

    isSpecimen bit

    ----deep integer

    )

    Declare @count as integer; --count how many items are in the temp table

    set @count=0

    INSERT INTO #Content

    select Ctn.containerName, '','', Ctn.containerName, Ctn.containerCode, '', '', '',0x0

    from Containers as Ctn

    INNER JOIN Partners P WITH(NOLOCK) ON

    Ctn.partnerSqlId = P.partnerSqlId AND

    Ctn.partnerIncId = P.partnerIncId AND

    P.isDeleted = 0 AND Ctn.isDeleted=0x0

    Where charindex(Ctn.containerCode, ISNULL(@container, Ctn.containerCode)) <> 0

    AND charindex(P.partnerCode, ISNULL(@Partners, P.partnerCode)) <> 0

    declare @i int=1

    declare @sitename varchar(1000)

    WHILE (@i<=(select COUNT(*) from #Content))

    BEGIN

    SELECT @sitename = sitename from #Content where id=@i

    SELECT @count = count(*) FROM #Content

    INSERT INTO #Content

    SELECT @sitename,

    Cnr_2.containerName

    ,Cnr_2.containerCode

    ,Cnr.containerName

    ,Cnr.containerCode

    ,''

    ,''

    ,''

    ,0x0

    FROM Containers AS Cnr

    INNER JOIN ContainersContents AS CnrCn WITH (NOLOCK) ON CnrCn.contentSqlId = Cnr.containerSqlId

    AND CnrCn.contentIncId = Cnr.containerIncId

    AND CnrCn.isDeleted = 0x0

    AND CnrCn.contentMetaTableSqlId = 2

    AND CnrCn.contentMetaTableIncId = 413 --container

    INNER JOIN Containers AS Cnr_2 WITH (NOLOCK) ON Cnr_2.containerSqlId = CnrCn.containerSqlId

    AND Cnr_2.containerIncId = CnrCn.containerIncId

    AND Cnr_2.isDeleted = 0x0

    INNER JOIN #Content C ON C.contentCode = Cnr_2.containerCode

    WHERE Cnr.isDeleted = 0x0

    AND NOT EXISTS (SELECT * FROM #Content AS Con WHERE Con.contentCode = Cnr.containerCode) --container is not deleted and not already in

    and C.id=@i

    SET @i=@i+1

    END

    ----select * from #Content

    create table #temp_final

    (Partnercode nvarchar(200), SiteName varchar(1000), containerName varchar(1000), containerCode varchar(1000), samplePartnerCode varchar(100),

    studycode varchar(100), sampleTypeName varchar(max), sampleDescription varchar(max))

    declare @p int = 1

    while (@p<=(select COUNT(*) from #Content))

    begin

    insert into #temp_final

    select P.partnerCode, C.SiteName ,Cnr_2.containerName,Cnr_2.containerCode,S.samplePartnerCode,

    ST.studycode,SMT.sampleTypeName,S.sampleDescription

    from SamplesPartners as S

    INNER JOIN ContainersContents AS CnrCn WITH(NOLOCK)

    ON CnrCn.contentSqlId=S.samplePartnerSqlId AND CnrCn.contentIncId=S.samplePartnerIncId AND CnrCn.isDeleted=0x0 AND S.isDeleted = 0

    and CnrCn.contentMetaTableSqlId=2 and CnrCn.contentMetaTableIncId=293 --samples partners

    INNER JOIN Containers AS Cnr_2 WITH(NOLOCK)

    ON Cnr_2.containerSqlId=CnrCn.containerSqlId AND Cnr_2.containerIncId=CnrCn.containerIncId AND Cnr_2.isDeleted=0x0

    INNER JOIN Partners P WITH(NOLOCK)

    ON Cnr_2.partnerSqlId = P.partnerSqlId AND Cnr_2.partnerIncId = P.partnerIncId AND

    P.isDeleted = 0 AND Cnr_2.isDeleted=0x0

    INNER Join #Content C

    On C.contentCode=Cnr_2.containerCode

    LEFT JOIN SampleTypes SMT ON

    S.sampleTypeIncId = SMT.sampleTypeIncId AND S.sampleTypeSqlId = SMT.sampleTypeSqlId AND SMT.isDeleted = 0

    LEFT JOIN ActivitiesSamplesPartners ASP ON S.samplePartnerIncId = ASP.samplePartnerIncId

    AND S.samplePartnerSqlId = ASP.samplePartnerSqlId

    AND S.isDeleted = 0 AND ASP.isDeleted = 0

    LEFT JOIN Activities A ON A.activityIncId = ASP.activityIncId

    AND A.activitySqlId = ASP.activitySqlId

    AND A.isDeleted = 0

    LEFT JOIN Studies ST ON A.studyIncId = ST.studyIncId

    AND A.studySqlId = ST.studySqlId

    AND ST.isDeleted = 0

    where C.id=@p

    AND charindex(P.partnerCode, ISNULL(@Partners, P.partnerCode)) <> 0

    set @p= @p + 1

    end

    create table #temp2(id int identity(1,1),containerCode varchar(1000),studycode varchar(100))

    insert into #temp2

    select distinct containerCode,studycode

    from #temp_final

    where studycode IS NOT NULL

    declare @j int=1

    declare @containercode varchar(100)

    declare @studycode varchar(100)

    create table #temp_final2(SiteName varchar(1000),containerName varchar(1000),containerCode varchar(1000),studycode varchar(100),

    samplePartnerCode varchar(100), sampletypename varchar(max),sampleDescription varchar(max))

    while (@j<=(select COUNT(*) from #temp2))

    begin

    set @containercode=(select containerCode from #temp2 where id =@j)

    set @studycode=(select studycode from #temp2 where id=@j)

    insert into #temp_final2

    select distinct SiteName, containerName,containerCode,studycode,samplePartnerCode,

    (select stuff((select distinct ','+ convert(varchar(max),sampleTypeName) from #temp_final

    where containerCode=@containercode

    and studycode=@studycode

    for xml path('')) , 1,1,'')) as sampleTypeName

    ,(select stuff((select distinct ','+ convert(varchar(max),sampleDescription) from #temp_final

    where containerCode=@containercode

    and studycode=@studycode

    for xml path('')) , 1,1,'')) as sampleDescription

    from #temp_final

    where containerCode=@containercode

    and studycode=@studycode

    set @j=@j+1

    end

    ----FINAL SELECT Query

    select distinct *

    from #temp_final2
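    Incidentally, the second loop over #temp2 appears to exist only to build the comma-separated sampleTypeName and sampleDescription lists per containerCode/studycode pair. A hedged sketch, assuming the #temp_final layout created above, that does the same in one set-based statement:

```sql
-- Sketch: build the delimited lists for every containerCode/studycode
-- pair in a single pass, instead of looping over #temp2.
SELECT DISTINCT
    F.SiteName, F.containerName, F.containerCode, F.studycode, F.samplePartnerCode,
    STUFF((SELECT DISTINCT ',' + CONVERT(VARCHAR(MAX), T.sampleTypeName)
           FROM #temp_final AS T
           WHERE T.containerCode = F.containerCode
             AND T.studycode = F.studycode
           FOR XML PATH('')), 1, 1, '') AS sampleTypeName,
    STUFF((SELECT DISTINCT ',' + CONVERT(VARCHAR(MAX), T.sampleDescription)
           FROM #temp_final AS T
           WHERE T.containerCode = F.containerCode
             AND T.studycode = F.studycode
           FOR XML PATH('')), 1, 1, '') AS sampleDescription
FROM #temp_final AS F
WHERE F.studycode IS NOT NULL;
```

    The correlated FOR XML PATH subqueries replace both the #temp2 driver table and the per-row variable lookups.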

  • Can you describe in words exactly what the code is supposed to do? It looks like you have a hierarchy in there - have you explored some of the more conventional methods for resolving hierarchies?

    Edit: is this really necessary? Can you supply some sample data of what might be encountered either side of such a join?

    Where charindex(Ctn.containerCode, ISNULL(@container, Ctn.containerCode)) <> 0

    AND charindex(P.partnerCode, ISNULL(@Partners, P.partnerCode)) <> 0

    “Write the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.” - Gail Shaw

    For fast, accurate and documented assistance in answering your questions, please read this article.
    Understanding and using APPLY, (I) and (II) Paul White
    Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden

  • I am using this particular query for an SSRS report.

    It has two multivalue parameters, Partners and Containers.

    When the user selects 45 partners and 42 corresponding containers, the performance issue occurs.

    This query takes almost 17 minutes to execute.

    I have used CHARINDEX for filtering based on the multiple partners and containers selected in the report.

    If you can help in altering the existing code which I posted to improve the execution time, it would be most helpful.

  • Junglee_George (2/11/2016)


    I have used CHARINDEX for filtering based on the multiple partners and containers selected in the report.

    We can deal with that by splitting the partner and container parameters into tables.

    What about the two questions I posted yesterday?

    Is there a hierarchy somewhere in your data?

    What exactly is the code supposed to do?
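    The "split the parameters into tables" idea could look like the sketch below. STRING_SPLIT is only available from SQL Server 2016 onwards; on older versions a tally-table splitter (such as the one in the linked Jeff Moden article) would be used instead. @Partners here is the report parameter from the thread:

```sql
-- Sketch: turn the delimited @Partners parameter into a table once,
-- then join to it instead of calling CHARINDEX in the WHERE clause.
DECLARE @Partners NVARCHAR(MAX) = N'EUADVE,EUUSEB,EUBGLE';

CREATE TABLE #PartnerList (partnerCode NVARCHAR(100) NOT NULL);

INSERT INTO #PartnerList (partnerCode)
SELECT LTRIM(RTRIM(value))
FROM STRING_SPLIT(@Partners, ','); -- SQL Server 2016+

-- The join below is sargable, unlike CHARINDEX(P.partnerCode, @Partners):
-- SELECT ... FROM Partners AS P
-- INNER JOIN #PartnerList AS PL ON PL.partnerCode = P.partnerCode;
```

    The same treatment would apply to the @container parameter.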


  • The hierarchy is there in the container data.

    The result of #content shows the actual container level hierarchy in the container list starting with the top most level container.

    If we can optimize the population of #Content, it will be fine; it currently outputs almost 11,000 records.

    Also, avoiding the while loop may further improve performance.

    The data type of @Partners and @Container is NVARCHAR(MAX)

    Partners value can be like 'EUADVE', 'EUUSEB', 'EUBGLE'

    Containers value can be like 'LOC-DENI-000012', 'LOC-DEST-000002', 'LOC-BRSP-000025'

  • Junglee_George (2/11/2016)


    The hierarchy is there in the container data.

    Thanks, this is a start. Now, I'd guess that anything other than superficial help is going to cost someone a good couple of hours. Can I suggest you knock up some sample data for them to work with, and provide a more elaborate explanation of what it is you are trying to do here, and how that container hierarchy works? Cheers.


  • Junglee_George (2/11/2016)


    The data type of @Partners and @Container is NVARCHAR(MAX)

    Partners value can be like 'EUADVE', 'EUUSEB', 'EUBGLE'

    Containers value can be like 'LOC-DENI-000012', 'LOC-DEST-000002', 'LOC-BRSP-000025'

    Never ever use NVARCHAR(MAX) for column data types.

    Never!

    I once managed to improve overall performance of a database by about 70 times simply by replacing all NVARCHAR(MAX) with NVARCHAR(500) and NVARCHAR(100) in all tables where it was used.

    7 times.

    I did not even read a single line of code.

    _____________
    Code for TallyGenerator

  • Sergiy (2/11/2016)



    Never ever use NVARCHAR(MAX) for column data types.

    Never!

    I once managed to improve overall performance of a database by about 70 times simply by replacing all NVARCHAR(MAX) with NVARCHAR(500) and NVARCHAR(100) in all tables where it was used.

    7 times.

    I did not even read a single line of code.

    A pretty dangerous thing to do, and incredibly irresponsible advice to put out on forums.

    Replacing the max datatypes with shorter versions is fine if you know for sure that you will never need a larger length. But that is in most cases impossible to know without reading the code and talking to users and managers.

    Never, ever, ever shorten a data type without analyzing the code, verifying your assumptions, and setting up a good test.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Hugo Kornelis (2/11/2016)


    A pretty dangerous thing to do, and an incredibly irresponsible advice to put out on forums.

    Believe me, unfortunately - it's not.

    Those who know what their data types mean and why they are there won't listen to it, as they don't need it anyway.

    Those who don't know their data - they'll have to learn it in order to decide which length to use for VARCHAR data types.

    Replacing the max datatypes with shorter versions is fine if you know for sure that you will never need a larger length.

    Placing the max datatypes is fine only if you know for sure what you will need a larger length for.

    But that is in most cases impossible to know without reading the code and talking to users and managers.

    Yes, in most cases it's impossible for developers to explain why they used MAX data types.

    They use it just because they don't want to think about table design.

    Forcing them to decide on an actual length limit for string values will make them think about it.

    Which is good.

    So, it's pretty good advice after all.


  • Sergiy (2/11/2016)


    So, it's pretty good advice after all.

    An interesting conclusion. You still did not address a concern I also had with your approach where you improved system performance 70 times - the one statement from Hugo you decided not to quote, refute, or address:

    Never, ever, ever shorten a data type without analyzing the code, verifying your assumptions, and setting up a good test.

    You said yourself you did not read one line of code. I am curious how you landed on NVARCHAR(500) for some columns and NVARCHAR(100) for others and determined all of it was safe. Did you profile the data? What was your method?

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato

  • Orlando Colamatteo (2/11/2016)


    You said yourself you did not read one line of code. I am curious how you landed on NVARCHAR(500) for some columns and NVARCHAR(100) for others and determined all of it was safe. Did you profile the data? What was your method?

    The database had been in production for 4 years.

    The longest string in the fields was just below 100 characters.

    There was no trend of increasing length over time.

    So, I concluded 500 would be enough.

    5 years on - no complaints so far.
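    The profiling step described here can be expressed as a simple query; a sketch, where SomeTable and someColumn are placeholder names for the real ones:

```sql
-- Sketch: profile actual stored lengths before shrinking a column.
-- 'SomeTable' and 'someColumn' are placeholders.
SELECT
    MAX(LEN(someColumn)) AS longest_value,
    AVG(LEN(someColumn)) AS avg_length,
    COUNT(*)             AS row_count
FROM SomeTable;
-- Only after confirming the distribution (and its growth over time)
-- would something like
--   ALTER TABLE SomeTable ALTER COLUMN someColumn NVARCHAR(500);
-- be reasonably safe.
```
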


  • Sergiy (2/11/2016)


    The database had been in production for 4 years. The longest string in the fields was just below 100 characters, and there was no trend of increasing length over time. So, I concluded 500 would be enough. 5 years on - no complaints so far.

    I think your customer should feel lucky, and so should you, that this move has not adversely affected their data or business. It may even be that it has adversely affected their data but the damage has yet to be detected. I consider your move an "educated guess" (maybe well-educated, but still a guess), and I am with Hugo here: you did not do a complete impact analysis, therefore you took a risk. While this may seem safe in some cases, the approach is not worthy of passing along as advice on an open forum.

