Memory-optimized tables for speeding up inserts and updates

  • Do memory-optimized tables speed up inserts and updates? And what are the pros and cons of implementing them, specifically on SQL Server 2017 Enterprise? Basically, I am working on a requirement where data is pulled from table A and table B, joined together, and dumped into a table C that is created on the fly. Some columns in table C are left NULL during that initial load and are populated later by joining to other tables. The developer would like to use memory-optimized tables for table C and has asked a few questions about it, which I have listed below (a rough sketch of the pattern follows the list of questions).

    • Does it actually speed up inserts, and by how much? If disk-based gives us, say, 230 ms and memory-optimized gives us 200 ms for inserting a million rows, it doesn't make much sense to implement memory-optimized tables for a 30 ms difference.
    • Are there any gotchas in the implementation? For example, we have half a terabyte of RAM on the server; will the amount of data impact the performance of the existing system? How do we calculate how much memory table C would need?
    • Implementing it shouldn't affect other users or queries on the database. Also, note that the database is a replication subscriber; table C is not replicated, but tables A and B are. Are there any potential issues with that?
    • Last but not least, is there another way to speed up the inserts without resorting to memory optimization, given the amount of work involved?

    Any help answering these questions would be highly appreciated.
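
    For illustration, here is a minimal sketch of the pattern described above, using hypothetical names (dbo.TableA, dbo.TableB, dbo.TableC, dbo.TableX, and all of their columns are placeholders, not the real schema). This is the plain disk-based baseline the developer would be comparing against:

    ```sql
    -- Hypothetical staging table "TableC", created on the fly; all names are placeholders.
    CREATE TABLE dbo.TableC
    (
        Id       INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        ColFromA INT          NOT NULL,
        ColFromB VARCHAR(100) NOT NULL,
        LateCol1 INT          NULL,    -- left NULL on the initial load
        LateCol2 DATETIME2    NULL     -- populated by a later update
    );

    -- Initial load: join A and B and dump the result into C.
    INSERT INTO dbo.TableC WITH (TABLOCK)   -- TABLOCK can allow minimal logging on an empty table
           (ColFromA, ColFromB)
    SELECT a.SomeCol, b.OtherCol
    FROM   dbo.TableA AS a
    JOIN   dbo.TableB AS b
           ON b.AId = a.Id;

    -- Later pass: join to another table and fill in the columns that were left NULL.
    UPDATE c
    SET    c.LateCol1 = x.Val1,
           c.LateCol2 = x.Val2
    FROM   dbo.TableC AS c
    JOIN   dbo.TableX AS x
           ON x.AId = c.ColFromA;
    ```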

  • IMHO, memory optimized tables have a lot of baggage for something like this.  How often will you need to do this?  If it's once every 10 minutes, maybe.  If it's once a day, probably not.

    Have you done any testing using the disk method?  I'm thinking that's the first thing you should do.  If it's slow, we're going to need all the information that you can provide to troubleshoot a performance issue.  See the 2nd link in my signature line below for how to do that.  Strongly recommend you read it before you do the test so you know what's coming.
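
    For what it's worth, one simple way to time that disk-based test (the table and column names are placeholders carried over from the sketch in the first post, not the real schema):

    ```sql
    -- Time the disk-based load and capture I/O statistics for the comparison.
    SET STATISTICS TIME, IO ON;

    DECLARE @Start DATETIME2(7) = SYSDATETIME();

    INSERT INTO dbo.TableC (ColFromA, ColFromB)
    SELECT a.SomeCol, b.OtherCol
    FROM   dbo.TableA AS a
    JOIN   dbo.TableB AS b
           ON b.AId = a.Id;

    SELECT @@ROWCOUNT                                   AS RowsInserted,  -- rows from the INSERT above
           DATEDIFF(MILLISECOND, @Start, SYSDATETIME()) AS ElapsedMs;

    SET STATISTICS TIME, IO OFF;
    ```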

     

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Hey Jeff,

    I am only looking for a general pros/cons list that I can forward to the developer.

  • Sorry... you're not going to like this.  My general guideline for things as important as memory-optimized data is to Google for the documentation and read it, because I find that a whole lot of people read someone's "general guidelines" on such a subject and never read anything else about it.  Then they either never implement it on something where it would be a fantastic idea, or they implement it on something they never should have.

    My thoughts on Memory Optimized stuff... do it right on the right things and get amazing performance... but if you do it right on the wrong things, you'll end up wishing you'd done a deep study and some serious testing first.

    To summarize, "It Depends" and you need to study it.

    --Jeff Moden



  • I was expecting that as an answer. Never mind, and thanks for taking the pains to even bother responding to the post. God bless you.

  • I didn't mean that as a slam.  It's just not possible to post general guidelines because everyone's stuff is different.  I just watched the following YouTube video on the subject... everyone talks about a 30X performance improvement, but in this 'tube I'm seeing someone who seems to know what he's doing, and the best he could do was whittle a 5-second insert down to a little over 2 seconds... for 1,600 bloody rows.  I do a million rows in less than that on my laptop without In-Memory, and I only have 8 TempDB files instead of 24, etc.

    Here's the link...

    https://www.youtube.com/watch?v=-DVVcxvhyhM

    Here's another link that shows some of the tools that come with SSMS to help estimate whether a move to In-Memory is going to help or not... this 'tube is by one of the folks that helped write the In Memory stuff...  but even with that, I'm not yet wicked impressed.

    https://www.youtube.com/watch?v=l5l5eophmK4&t=0
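
    (For a sense of scale, here is a rough sketch of the kind of million-row baseline test being described, using an inline cascading-CTE row generator in place of the fnTally function linked in the signature; the target table and its values are arbitrary.)

    ```sql
    -- Build and load 1,000,000 test rows into a plain disk-based temp table, then report the elapsed time.
    IF OBJECT_ID('tempdb..#Target') IS NOT NULL DROP TABLE #Target;
    CREATE TABLE #Target (N INT NOT NULL, SomeDate DATETIME2 NOT NULL, SomeValue DECIMAL(10,2) NOT NULL);

    DECLARE @Start DATETIME2(7) = SYSDATETIME();

    WITH E1(N)    AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) AS v(N)), -- 10 rows
         E3(N)    AS (SELECT 1 FROM E1 a CROSS JOIN E1 b CROSS JOIN E1 c),                     -- 1,000 rows
         E6(N)    AS (SELECT 1 FROM E3 a CROSS JOIN E3 b),                                     -- 1,000,000 rows
         Tally(N) AS (SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) FROM E6)
    INSERT INTO #Target WITH (TABLOCK) (N, SomeDate, SomeValue)
    SELECT N,
           DATEADD(DAY, CAST(N % 3653 AS INT), '2010-01-01'),   -- arbitrary dates spread over ~10 years
           CAST(N % 10000 AS DECIMAL(10,2)) / 100               -- arbitrary values
    FROM   Tally;

    SELECT DATEDIFF(MILLISECOND, @Start, SYSDATETIME()) AS ElapsedMs;
    ```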

     

    --Jeff Moden



  • p.s. Even in SQL Server 2016, there's still a limit on the In-Memory stuff of around 250GB.  In the databases of today, that means you have to be a wee bit careful choosing what you're going to include, with the idea that it's going to continue to grow.  I have no idea what happens when the data exceeds that.  That's why I say you really have to snuggle up with the documentation, and that general advice isn't necessarily going to portray the reality.
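
    (As a starting point for keeping an eye on that, here is a minimal sketch of a memory-usage check; it uses only the documented sys.dm_db_xtp_table_memory_stats DMV and reports nothing until the current database actually contains memory-optimized tables.)

    ```sql
    -- How much memory the memory-optimized tables in the current database are consuming.
    SELECT OBJECT_SCHEMA_NAME(tms.object_id) + '.' + OBJECT_NAME(tms.object_id) AS TableName,
           tms.memory_allocated_for_table_kb   / 1024.0 AS TableAllocatedMB,
           tms.memory_used_by_table_kb         / 1024.0 AS TableUsedMB,
           tms.memory_allocated_for_indexes_kb / 1024.0 AS IndexAllocatedMB
    FROM   sys.dm_db_xtp_table_memory_stats AS tms
    WHERE  tms.object_id > 0    -- exclude internal system allocations (negative object_ids)
    ORDER  BY TableUsedMB DESC;
    ```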

     

    --Jeff Moden



  • Jeff Moden wrote:

    It's just not possible to post general guidelines because everyone's stuff is different... the best he could do was whittle a 5-second insert down to a little over 2 seconds... for 1,600 bloody rows.  I do a million rows in less than that on my laptop without In-Memory...

    I tried using memory-optimized tables on a small database I wrote to solve a combinatorial optimization problem (SQL Server is not the best tool for this, but it was more of an experiment for a bit of fun). The memory-optimized tables turned out to be slightly slower than standard ones. The database was on an SSD and SQL Server already caches data in memory anyway, and memory-optimized tables are quite limited in what indexes, foreign keys, etc. you can have. So I'm not convinced by them either.
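
    (For anyone comparing, here is a minimal sketch of what a memory-optimized version of a staging table looks like and the choices it forces; all names are placeholders, and the database must already have a MEMORY_OPTIMIZED_DATA filegroup, as in the commented setup.)

    ```sql
    -- One-time database setup (illustrative names and paths):
    -- ALTER DATABASE MyDb ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;
    -- ALTER DATABASE MyDb ADD FILE (NAME = 'imoltp_container', FILENAME = 'C:\Data\imoltp_container')
    --     TO FILEGROUP imoltp_fg;

    CREATE TABLE dbo.TableC_InMem
    (
        Id       INT IDENTITY(1,1) NOT NULL
                 PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 2000000), -- size to roughly 1-2x expected rows
        ColFromA INT          NOT NULL,
        ColFromB VARCHAR(100) NOT NULL,
        LateCol1 INT          NULL,
        LateCol2 DATETIME2    NULL,
        INDEX ix_ColFromA NONCLUSTERED (ColFromA)   -- range index; no clustered indexes exist here
    )
    WITH (MEMORY_OPTIMIZED = ON,
          DURABILITY = SCHEMA_ONLY);   -- SCHEMA_ONLY avoids logging but loses all rows on restart;
                                       -- use SCHEMA_AND_DATA if the staged rows must survive a restart
    ```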
