How to write the loop below with best-practice performance?

  • I work on SQL Server 2012.

    I have a temp table that gets its data from Excel, and based on the data in that table I insert into another table inside a loop.

    The temp table always has a large amount of data: at least 5,000, 10,000, or 15,000 rows, and sometimes more.

    I need every iteration to insert the next 5,000 rows from the temp table.

    So I need the best solution for this in terms of speed and memory.

    And if anything in the logic is not correct, please tell me.

    My query is below:

    create table #Temp
    (
        DocumentPartID      int identity(1,1),
        CompanyName         VARCHAR(4000),
        AffectedProduct     NVARCHAR(4000),
        ReplacementPart     VARCHAR(4000),
        ReplacementCompany  VARCHAR(4000),
        Category            VARCHAR(4000),
        DocumentID          int null,
        CompanyID           VARCHAR(4000) null,
        PartID              int null,
        ReplacementPartID   int null,
        CategoryID          int null,
        [Status]            VARCHAR(4000) null
    )



    insert into #Temp
    (
        CompanyName,
        AffectedProduct,
        ReplacementPart,
        ReplacementCompany,
        Category
    )
    values
        ('Nokia',  'RF1550', 'RF1550', 'HTS',                  'HTS'),
        ('IPHONE', 'TF1545', 'TF1545', 'Corning Incorporated', 'HTS2')



    DECLARE @MaxValue int = (select Max(DocumentPartID) from #Temp)
    DECLARE @Currentindex int = 0
    DECLARE @Rows [dbo].[Type_ValidationInPut];

    while @Currentindex < @MaxValue
    begin

        DELETE @Rows

        INSERT INTO @Rows (RowNumber, GivenPartNumber, GivenManufacturer)
        select TOP 5000 DocumentPartID, isnull(AffectedProduct, ''), isnull(CompanyName, '')
        FROM   #Temp
        where  (CategoryID = 517884 or CategoryID = 1110481)
          and  DocumentPartID > @Currentindex
          and  [Status] is null

        INSERT INTO @Rows (RowNumber, GivenPartNumber, GivenManufacturer)
        select TOP 5000 DocumentPartID, isnull(substring(ReplacementPart, 0, 70), ''), isnull(ReplacementCompany, '')
        FROM   #Temp
        where  DocumentPartID > @Currentindex
          and  [Status] is null
          and  ReplacementPart is not null

        DECLARE @NewID nVARCHAR(4000) = newID()

        insert into [ls30].[validation].[dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
        SELECT @NewID, 0, GivenPartNumber, GivenManufacturer from @Rows

        set @Currentindex = @Currentindex + 5000

        DELETE @Rows
    end
  • I need every iteration to insert the next 5,000 rows from the temp table

    Why not something like this?

    SELECT TOP (5000) <field list>
    FROM   SourceTable st
    WHERE  NOT EXISTS (SELECT 1
                       FROM   DestinationTable dt
                       WHERE  dt.ID = st.ID);

  • According to your solution it will run one time only.

    I need a repeated check,

    so it will not be valid for my case.

  • The query will eliminate the sets that have already been added. That's what the NOT EXISTS predicate is for.

    I guess if you don't like it, you could always write your own?
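
    If it has to keep repeating until everything is processed, one way to sketch that is to wrap the insert in a loop driven by @@ROWCOUNT (table and column names taken from the first post; it assumes PartsData stores the source DocumentPartID, for example in RowNumber, so the NOT EXISTS can see what has already been copied; the original code writes 0 there, so that part would need to change):

    -- Sketch only: repeat the TOP (5000) insert until NOT EXISTS finds nothing left to copy.
    -- Shown for the category branch only; the ReplacementPart branch would follow the same pattern.
    DECLARE @BatchID nvarchar(4000);
    DECLARE @Copied  int = 1;

    while @Copied > 0
    begin
        set @BatchID = newid()   -- one batch id per pass, as in the original loop

        insert into [ls30].[validation].[dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
        select TOP (5000)
               @BatchID, t.DocumentPartID, isnull(t.AffectedProduct, ''), isnull(t.CompanyName, '')
        from   #Temp t
        where  t.[Status] is null
          and  (t.CategoryID = 517884 or t.CategoryID = 1110481)
          and  not exists (select 1
                           from   [ls30].[validation].[dbo].PartsData pd
                           where  pd.RowNumber = t.DocumentPartID)
        order by t.DocumentPartID

        set @Copied = @@ROWCOUNT   -- nothing inserted means nothing left, and the loop ends
    end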

  • I need to replace the while loop above with something else that can do the looping but with good performance.

    Meaning I don't want to loop using a cursor or a while loop;

    I need anything else that performs well.

  • Sometimes a loop is the best way to do something, and processing data iteratively can be one of those times.  Have you tested the performance of a loop?  Do you know it's poor, or is it just 'loop bad, must not use'?


    On two occasions I have been asked, "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" ... I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
    —Charles Babbage, Passages from the Life of a Philosopher

    How to post a question to get the most help http://www.sqlservercentral.com/articles/Best+Practices/61537

  • Hold the phone here... inserting 5000, 10000, 15000 rows isn't a large, time consuming task for SQL Server.  Why do you think you need to "chunk" the data in this fashion to begin with?
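
    For the sizes mentioned, the whole loop could collapse into a single set-based insert along these lines (a sketch built from the code in the first post; using one BatchID for the entire load is an assumption, since the original generates a new one per 5,000-row pass):

    -- Sketch only: load both branches of the loop in one statement instead of 5000-row passes.
    DECLARE @NewID nvarchar(4000) = newid()   -- one batch id for the whole load (assumption)

    insert into [ls30].[validation].[dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
    select @NewID, 0, isnull(AffectedProduct, ''), isnull(CompanyName, '')
    from   #Temp
    where  (CategoryID = 517884 or CategoryID = 1110481)
      and  [Status] is null
    union all
    -- substring from position 1: substring(ReplacementPart, 0, 70) in the original returns only 69 characters
    select @NewID, 0, isnull(substring(ReplacementPart, 1, 70), ''), isnull(ReplacementCompany, '')
    from   #Temp
    where  [Status] is null
      and  ReplacementPart is not null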

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden wrote:

    Hold the phone here... inserting 5000, 10000, 15000 rows isn't a large, time consuming task for SQL Server.  Why do you think you need to "chunk" the data in this fashion to begin with?

    I've been increasingly "chunking" workloads for concurrency reasons. Maybe OP is too. Just a thought.

     

  • Actually, I think the existing code could end up duplicating rows from the #temp table into the PartsData table.  If a spreadsheet has both "ReplacementPart" rows and "CategoryID = 517884 or 1110481" rows, then each iteration through the loop can grab up to 10,000 rows from the #temp table, but the counter is only incremented by 5,000.
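
    If the chunking is kept for other reasons, one way to sidestep that (a sketch, reusing the names from the first post and not tested against real data) is to page by a fixed DocumentPartID range instead of two independent TOP 5000s, so the +5000 counter always agrees with the rows that were picked up:

    -- Sketch only: each pass covers one non-overlapping, 5000-wide slice of DocumentPartID.
    -- DocumentPartID is identity(1,1) on #Temp, so the ids are dense: a slice never returns
    -- more than 5000 rows per filter, and no row is skipped between passes.
    DECLARE @MaxValue int = (select Max(DocumentPartID) from #Temp)
    DECLARE @Currentindex int = 0
    DECLARE @Rows [dbo].[Type_ValidationInPut];
    DECLARE @NewID nvarchar(4000);

    while @Currentindex < @MaxValue
    begin
        DELETE @Rows

        INSERT INTO @Rows (RowNumber, GivenPartNumber, GivenManufacturer)
        select DocumentPartID, isnull(AffectedProduct, ''), isnull(CompanyName, '')
        from   #Temp
        where  DocumentPartID >  @Currentindex
          and  DocumentPartID <= @Currentindex + 5000
          and  (CategoryID = 517884 or CategoryID = 1110481)
          and  [Status] is null

        INSERT INTO @Rows (RowNumber, GivenPartNumber, GivenManufacturer)
        -- substring from position 1; the original's substring(..., 0, 70) returns only 69 characters
        select DocumentPartID, isnull(substring(ReplacementPart, 1, 70), ''), isnull(ReplacementCompany, '')
        from   #Temp
        where  DocumentPartID >  @Currentindex
          and  DocumentPartID <= @Currentindex + 5000
          and  [Status] is null
          and  ReplacementPart is not null

        set @NewID = newid()

        insert into [ls30].[validation].[dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
        SELECT @NewID, 0, GivenPartNumber, GivenManufacturer from @Rows

        set @Currentindex = @Currentindex + 5000
    end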

  • x wrote:

    Jeff Moden wrote:

    Hold the phone here... inserting 5000, 10000, 15000 rows isn't a large, time consuming task for SQL Server.  Why do you think you need to "chunk" the data in this fashion to begin with?

    I've been increasingly "chunking" workloads for concurrency reasons. Maybe OP is too. Just a thought.

    I agree but with only 15,000 rows?  What size "chunks" do you normally use, Patrick?

    And sorry for the wicked late response.  I just saw this and I'm curious.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
