Giant Logfiles (ldf) during loading data into a Memory Optimized table


Fecker Elmar
Hello everyone,

I am trying to load data into a memory-optimized table (INSERT INTO ... SELECT ... FROM ...). The source table is about 1 GB in size and has 13 million rows. During the load the LDF file grows to 350 GB (until the disk runs out of space). The server has about 110 GB of memory reserved for SQL Server. The tempdb doesn't grow. The bucket count in the CREATE statement is 262144. The hash key has 4 fields (2 fields have the datatype int, 1 has smallint, 1 has varchar(200)). The disk for the data files (incl. the Hekaton files) still has free space.

Can anyone advise how I can reduce the size of the ldf file during the data load?
Sean Lange
I would recommend breaking this into batches of 10k rows or so. A single insert with 13 million rows is going to kill your logs. This is one of those times that the dreaded cursor or while loop is actually going to increase your performance.
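A minimal T-SQL sketch of that batching idea (the table and column names are hypothetical, since the original post does not name them):

```sql
-- Hypothetical names: dbo.SourceTable feeds dbo.MemOptTarget, keyed by an INT Id.
-- Each batch commits in its own transaction, so the log space can be reused
-- (SIMPLE recovery) or backed up (FULL recovery) between batches.
DECLARE @LastId INT = 0,
        @BatchSize INT = 10000,
        @Rows INT = 1;

WHILE @Rows > 0
BEGIN
    INSERT INTO dbo.MemOptTarget (Id, Col1, Col2)
    SELECT TOP (@BatchSize) s.Id, s.Col1, s.Col2
    FROM dbo.SourceTable AS s
    WHERE s.Id > @LastId
    ORDER BY s.Id;

    SET @Rows = @@ROWCOUNT;

    -- Advance the window to the highest key copied so far.
    SELECT @LastId = ISNULL(MAX(Id), @LastId) FROM dbo.MemOptTarget;
END;
```

Seeking by a key range (`Id > @LastId` with `ORDER BY`) keeps each batch cheap; a `WHERE NOT EXISTS` check instead would re-scan the target on every iteration.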

_______________________________________________________________

Need help? Help us help you.

Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

Need to split a string? Try Jeff Moden's splitter.

Cross Tabs and Pivots, Part 1 – Converting Rows to Columns
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs
Understanding and Using APPLY (Part 1)
Understanding and Using APPLY (Part 2)
Jack Corbett
Sean Lange (8/26/2013)
I would recommend breaking this into batches of 10k rows or so. A single insert with 13 million rows is going to kill your logs. This is one of those times that the dreaded cursor or while loop is actually going to increase your performance.


+1 because even actions on memory-optimized tables are logged unless you specify non-durable. http://technet.microsoft.com/en-us/library/dn133174(v=sql.120).aspx
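For reference, that durability choice is made at CREATE TABLE time. A sketch reusing the bucket count from the original post (the table and column names are illustrative):

```sql
-- DURABILITY = SCHEMA_ONLY persists only the table definition: inserts are
-- not written to the transaction log, but all rows are lost on restart.
-- The default, SCHEMA_AND_DATA, logs every insert for durability.
CREATE TABLE dbo.StagingMemOpt
(
    Id INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 262144),
    Payload VARCHAR(200) NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
```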



Jack Corbett

Applications Developer

Don't let the good be the enemy of the best. -- Paul Fleming
At best you can say that one job may be more secure than another, but total job security is an illusion. -- Rod at work

Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
How to Post Performance Problems
Crosstabs and Pivots or How to turn rows into columns Part 1
Crosstabs and Pivots or How to turn rows into columns Part 2
sql-noob
Jack Corbett (8/26/2013)
Sean Lange (8/26/2013)
I would recommend breaking this into batches of 10k rows or so. A single insert with 13 million rows is going to kill your logs. This is one of those times that the dreaded cursor or while loop is actually going to increase your performance.


+1 because even actions on memory-optimized tables are logged unless you specify non-durable. http://technet.microsoft.com/en-us/library/dn133174(v=sql.120).aspx


+2 Along with the above suggestion, I would also suggest changing the recovery model of the database to SIMPLE if you can; if not, take transaction log backups between the batches to free up space in the .ldf file. Some indexes can also be disabled during the load to avoid lots of updates (indexes can be rebuilt after the load).
Overall, plan your activity accordingly :-) [I hope I am not wrong with the suggestions I gave.]
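A sketch of both options (the database name and backup path are hypothetical):

```sql
-- Option 1: switch to SIMPLE recovery so committed log records are
-- truncated automatically at each checkpoint (breaks the log backup chain).
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE;

-- Option 2: stay in FULL recovery and back up the log between batches,
-- so the inactive portion of the .ldf can be reused instead of growing.
BACKUP LOG MyDatabase
    TO DISK = N'D:\Backups\MyDatabase_load.trn'
    WITH NOINIT;  -- append each batch's backup to the same file
```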
Koen Verbeeck
sql-noob (10/14/2013)
Jack Corbett (8/26/2013)
Sean Lange (8/26/2013)
I would recommend breaking this into batches of 10k rows or so. A single insert with 13 million rows is going to kill your logs. This is one of those times that the dreaded cursor or while loop is actually going to increase your performance.


+1 because even actions on memory-optimized tables are logged unless you specify non-durable. http://technet.microsoft.com/en-us/library/dn133174(v=sql.120).aspx


+2 Along with the above suggestion, I would also suggest changing the recovery model of the database to SIMPLE if you can; if not, take transaction log backups between the batches to free up space in the .ldf file. Some indexes can also be disabled during the load to avoid lots of updates (indexes can be rebuilt after the load).
Overall, plan your activity accordingly :-) [I hope I am not wrong with the suggestions I gave.]


Are you sure you can disable the index on an in-memory table?


How to post forum questions.
Need an answer? No, you need a question.
What’s the deal with Excel & SSIS?
My blog at SQLKover.

MCSE Business Intelligence - Microsoft Data Platform MVP
sql-noob
My bad, the suggestion was for general (large) ETL operations.
I forgot that it's Hekaton.
Fecker Elmar
Hello,

sorry, I didn't mention that I have already set the recovery model of the database to SIMPLE. Even with that configuration the log grows this way. I tried the suggestion of dividing the data into smaller batches. I managed that by creating an SSIS package: in the properties of the data flow task's OLE DB destination it is possible to divide the load into batches of 10K rows. With that, the copy of the data into the in-memory table works without any abnormal growth of the log.

The exciting question will be how SQL Server handles this in an environment where it isn't possible to set the recovery model to SIMPLE (mirrored / log-shipped databases). ;-) I hope the RTM will bring an improvement in that behavior.
Fecker Elmar
The problem is fixed in CTP 2, according to this announcement.

see:
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/c69d826d-0994-44bf-8c17-96d5ff805dad/sql14-hekaton-loading-causes-log-space-to-blow-up?forum=sql14inmemtech
Jack Corbett
Fecker Elmar (10/23/2013)
The problem is fixed in CTP 2, according to this announcement.

see:
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/c69d826d-0994-44bf-8c17-96d5ff805dad/sql14-hekaton-loading-causes-log-space-to-blow-up?forum=sql14inmemtech


You are still going to have log growth if you load a large amount of data in a single transaction, unless you use a non-durable table (schema-only persistence), as all the inserts will still be logged for durability. I think the best practices for large ETL processes will still hold true for In-Memory OLTP as well.



Steve Jones
In production, you would load in batches, and run log backups during the load to keep the size manageable.

Follow me on Twitter: @way0utwest
Forum Etiquette: How to post data/code on a forum to get the best help
My Blog: www.voiceofthedba.com
