How to increase batch size to 100,000

  • I have a table with 1 million rows. I want to generate a deployment script (an INSERT script) that adds a GO after every 10,000 records.

    Is there an option for this in SQL Server 2014 when generating scripts?
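
    For illustration, here is a minimal sketch of the kind of output I'm after, using a hypothetical target table dbo.MyTable; the real script would repeat this pattern across all 1 million rows:

    -- Hypothetical excerpt of the desired deployment script:
    -- plain INSERT statements, with a GO batch separator after every 10,000 rows.
    INSERT INTO dbo.MyTable (Id, SomeValue) VALUES (1, 'first row');
    INSERT INTO dbo.MyTable (Id, SomeValue) VALUES (2, 'second row');
    -- ... 9,998 more INSERT statements ...
    GO
    INSERT INTO dbo.MyTable (Id, SomeValue) VALUES (10001, 'start of the next batch');
    -- ... and so on until all rows are scripted ...
    GO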

  • If you're generating individual INSERT/VALUES statements, I wouldn't do that, because it will take a month of Sundays to complete 1 million rows.  In fact, most methods will, not to mention the fact that such a thing will negate any possibility of minimal logging.

    We need more information...

    1. Are you, indeed, generating individual INSERT/VALUES statements using the GUI method of scripting data?
    2. What does the CREATE TABLE statement look like for the target table?
    3. Does the target table already contain any rows or is it empty at the start of the deployment?
    4. Where will you be running this deployment script from?  The command line or within SSMS?
    5. I'm assuming that the data for the deployment script will live in a file.  Is that correct?  If not, can it?

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • What did you end up doing for this?  Inquiring minds want to know. 😀

    --Jeff Moden

  • Couldn't you use TOP with OFFSET to get a fixed-size chunk of records and insert that way? (So modify it to do the insert in chunks?)

    USE AdventureWorks2017;
    GO
    -- Page through the result set in fixed-size chunks using OFFSET/FETCH.
    DECLARE @NumRows INT = 0;   -- start at 0 so the first 10 rows aren't skipped
    WHILE @NumRows < 40
    BEGIN
        SELECT
            [Name],
            ListPrice
        FROM
            Production.Product
        ORDER BY
            ListPrice,
            [Name]
        OFFSET @NumRows ROWS
        FETCH NEXT 10 ROWS ONLY;

        SET @NumRows = @NumRows + 10;
    END;
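
    If the goal is to move the rows directly rather than script them out, the same OFFSET/FETCH pattern could drive the inserts themselves. A rough sketch, assuming hypothetical dbo.SourceTable and dbo.TargetTable with an Id key:

    DECLARE @ChunkSize INT = 10000;
    DECLARE @Offset    INT = 0;
    DECLARE @Rows      INT = 1;

    WHILE @Rows > 0
    BEGIN
        INSERT INTO dbo.TargetTable (Id, SomeValue)
        SELECT Id, SomeValue
        FROM dbo.SourceTable
        ORDER BY Id                 -- OFFSET/FETCH requires an ORDER BY
        OFFSET @Offset ROWS
        FETCH NEXT @ChunkSize ROWS ONLY;

        SET @Rows   = @@ROWCOUNT;   -- stop once a chunk comes back empty
        SET @Offset = @Offset + @ChunkSize;
    END;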
  • pietlinden wrote:

    Couldn't you use TOP with OFFSET to get a fixed-size chunk of records and insert that way? (So modify it to do the insert in chunks?)

    I think the OP wanted the inserts in a million-line script.

  • How about doing something like BCP OUT to write the records to some manageable number of text files, and then use BULK INSERT to insert each file of, say, 100K records?

    https://docs.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15
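
    Roughly, the export side might look like this (server, database, table, and path are placeholders, and splitting into ~100K-row files could be done with a key-range WHERE clause per file):

    REM Export the rows in character mode, sorted by the key, using a trusted connection.
    bcp "SELECT Id, SomeValue FROM dbo.SourceTable ORDER BY Id" queryout C:\deploy\SourceTable.dat -c -t, -S MyServer -d MyDatabase -T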

  • That was going to be my suggestion if the OP ever came back on this thread, except I wouldn't limit the batches.  First, BCP automatically limits itself to 1,000 rows per batch on the output (and the output should be sorted by an ORDER BY).  Then, use BULK INSERT (which is a little bit faster than BCP) to do the import.  It's nasty fast just about any way, but if you can get minimal logging to happen with TABLOCK and an import order, you can get twice as fast as nasty fast across 2 million rows and not have to chunk up anything.
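
    Just as a sketch of the import side (path, table, and column names are placeholders, and minimal logging also depends on the recovery model and the state of the target table):

    -- Load a sorted character-mode file with a table lock;
    -- the ORDER hint tells SQL Server the file is already sorted by the clustered key.
    BULK INSERT dbo.TargetTable
    FROM 'C:\deploy\SourceTable.dat'
    WITH (
        DATAFILETYPE    = 'char',
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        TABLOCK,
        ORDER (Id ASC)
    );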

    --Jeff Moden
