• sql-noob (10/14/2013)


    Jack Corbett (8/26/2013)


    Sean Lange (8/26/2013)


    I would recommend breaking this into batches of 10k rows or so. A single insert with 13 million rows is going to kill your logs. This is one of those times that the dreaded cursor or while loop is actually going to increase your performance.
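    A minimal sketch of that batching idea, assuming a hypothetical source table dbo.SourceTable and target dbo.TargetTable, both keyed on an indexed integer ID column:

    DECLARE @BatchSize INT = 10000;
    DECLARE @LastID INT = 0;

    WHILE 1 = 1
    BEGIN
        INSERT INTO dbo.TargetTable (ID, Payload)
        SELECT TOP (@BatchSize) s.ID, s.Payload
        FROM dbo.SourceTable AS s
        WHERE s.ID > @LastID
        ORDER BY s.ID;

        IF @@ROWCOUNT = 0 BREAK; -- nothing left to copy

        -- assumes ID is the clustered key, so MAX() is a cheap seek
        SELECT @LastID = MAX(ID) FROM dbo.TargetTable;
    END

    Each iteration commits as its own small transaction, so the log space can be reused (in SIMPLE recovery) or backed up (in FULL recovery) between batches instead of accumulating 13 million rows of log in one shot.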

    +1 because even actions on memory-optimized tables are logged unless you declare the table non-durable. http://technet.microsoft.com/en-us/library/dn133174(v=sql.120).aspx
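    For reference, "non-durable" here means creating the table with DURABILITY = SCHEMA_ONLY (SQL Server 2014+), so only the schema survives a restart and row data is never written to the transaction log. A sketch with hypothetical names:

    CREATE TABLE dbo.StagingFast
    (
        ID      INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 16777216),
        Payload VARCHAR(100) NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY); -- data is lost on restart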

    +2. In addition to the above suggestion, I would also suggest changing the database's recovery model to SIMPLE if you can; if not, take transaction log backups between the batches to free up space in the .ldf file. Some indexes can also be disabled during the load to avoid lots of index updates (they can be rebuilt after the load).
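    A hedged sketch of that housekeeping; the database, table, and index names are placeholders:

    ALTER DATABASE MyDb SET RECOVERY SIMPLE; -- minimal log retention during the load

    ALTER INDEX IX_TargetTable_Payload ON dbo.TargetTable DISABLE; -- skip per-row index maintenance

    -- ... run the batched load here ...

    ALTER INDEX IX_TargetTable_Payload ON dbo.TargetTable REBUILD; -- rebuild once, after the load

    ALTER DATABASE MyDb SET RECOVERY FULL; -- switch back, then take a full or differential
                                           -- backup to restart the log backup chain
    -- If you cannot leave FULL recovery, back up the log between batches instead:
    -- BACKUP LOG MyDb TO DISK = N'C:\Backups\MyDb_log.trn';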

    Overall, plan your activity accordingly 🙂

    Are you sure you can disable the index on an in-memory table?

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP