Replication - High memory usage when running Snapshot

  • If there are any replication gurus out there - I need your help!!!

    First the background ... a client of mine has transactional replication set up against a 15GB database. The bulk of the 15GB is contained within one BLOB column on a table. The distributor is SQL Server 2008 SP1 and sits on a box with 3.5GB of available RAM - spotted the problem yet?

    Anyway, when the snapshot agent runs, available memory on the box drops like a lead balloon. I have a strong suspicion that the issue is related to this bug - http://support.microsoft.com/kb/2606301 and that it's because the snapshot process uses memory mapped data to do the data transfer. As such I've applied SP3 CU1, but memory still drops and eventually runs out, rendering the server inaccessible.
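
    In case it helps anyone checking the same thing, I confirmed the build after patching with a quick query like the one below (I won't quote the expected build number for SP3 CU1 from memory - it's in the KB article):

    SELECT SERVERPROPERTY('ProductVersion') AS Build,
           SERVERPROPERTY('ProductLevel')   AS ServicePack,
           SERVERPROPERTY('Edition')        AS Edition;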

    I've run a number of tests so I know it's this BLOB data causing the problem. This includes excluding the column from the publication, and also filtering the publication so the amount of BLOB data coming across does not exceed the amount of physical memory available on the distributor. In both these cases the snapshot gets generated successfully.
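
    For reference, the exclusion and filtering tests were roughly along the lines below - the publication, article and column names here are made up, not the client's real ones:

    -- vertical filter: drop the BLOB column from the published article
    EXEC sp_articlecolumn
        @publication = N'MyPublication',
        @article     = N'MyBigTable',
        @column      = N'BlobColumn',
        @operation   = N'drop',
        @force_invalidate_snapshot = 1,
        @force_reinit_subscription = 1;

    -- horizontal filter: only publish a slice of the rows
    EXEC sp_articlefilter
        @publication   = N'MyPublication',
        @article       = N'MyBigTable',
        @filter_name   = N'FLTR_MyBigTable',
        @filter_clause = N'ModifiedDate >= ''20111201''',
        @force_invalidate_snapshot = 1,
        @force_reinit_subscription = 1;

    EXEC sp_articleview
        @publication   = N'MyPublication',
        @article       = N'MyBigTable',
        @view_name     = N'VIEW_MyBigTable',
        @filter_clause = N'ModifiedDate >= ''20111201''',
        @force_invalidate_snapshot = 1,
        @force_reinit_subscription = 1;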

    Any thoughts?

    Thanks (and Merry Christmas!)

    Phil

  • And if anyone's up for a bit of light reading, I've captured the output by running snapshot.exe from the command prompt and logging at -OutputVerboseLevel 2 (you can see the command arguments I put in at the top of the file).
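
    For anyone who wants to reproduce the run, the command was along these lines (server, database and publication names changed, and the agent path is the usual SQL 2008 location - adjust for your install). It's all on one line in reality, just wrapped here for readability:

    "C:\Program Files\Microsoft SQL Server\100\COM\snapshot.exe"
        -Publisher MyPublisherServer -PublisherDB MyPublishedDB
        -Publication MyPublication -Distributor MyDistributorServer
        -DistributorSecurityMode 1
        -OutputVerboseLevel 2 -Output C:\Temp\snapshot_agent.log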

    I'm afraid I did chicken out and killed snapshot.exe in Task Manager after my instance of SQL Server had paged out all its memory, to prevent the server from dying on me completely! Therefore I'm not sure how much use the attached log file is: the last recorded entry there is at 13:17, and I had to kill the exe at 13:22, so there's a good 5 minutes of activity that isn't logged.

    Happy reading!

    Phil

  • If anyone has any ideas on this one it would be very much appreciated! I've got the backup plan of being able to initialise from backup, but ideally I'd like to get the snapshot working.
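
    For completeness, the initialise-from-backup fallback is roughly the below (names and paths are placeholders): enable backup initialisation on the publication, restore the publisher backup at the subscriber, then add the subscription pointing at that backup.

    -- on the publisher: allow the publication to be initialised from a backup
    EXEC sp_changepublication
        @publication = N'MyPublication',
        @property    = N'allow_initialize_from_backup',
        @value       = N'true',
        @force_invalidate_snapshot = 1;

    -- after restoring the backup on the subscriber:
    EXEC sp_addsubscription
        @publication       = N'MyPublication',
        @subscriber        = N'MyRemoteSubscriber',
        @destination_db    = N'MySubscriberDB',
        @subscription_type = N'push',
        @sync_type         = N'initialize with backup',
        @backupdevicetype  = N'disk',
        @backupdevicename  = N'\\FileShare\Backups\MyPublishedDB.bak';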

    Thanks

  • Stop trying to force a square peg into a round hole. Just initialize from backup and move on to the next item in your to do list.

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

  • How many subscribers are there on the box?

    How many packages?

    Is the distributor on the same box as the database?

    How much memory is allocated to SQL Server?

    Steve Jimmo
    Sr DBA
    “If we ever forget that we are One Nation Under God, then we will be a Nation gone under." - Ronald Reagan

  • Thanks for the replies. In response, firstly my hammering of my square peg...

    Kevin, the subscriber is on a remote site managed by a third party, and therefore co-ordinating the various stages in initialising the subscriber and making sure things are done in a timely manner becomes that much more complicated.

    And Steve, the server itself is only acting as the distributor. We're talking about 1 publication (with a ~15GB table) on a box that only has 4GB RAM. This SQL instance is using up 2GB of that RAM.

    Now I know there are obvious answers to this, i.e. add more RAM, and that's fine, and we're setting up another box that will do just that. But one of my reasons for asking this question is that I don't want to have to continually up the RAM in a box to handle this 15GB table. If I have a box that's got 18GB of RAM I can do the initial setup using the snapshot and go home. But when that table grows to beyond 17, 18, 19GB ... what then? I know I'm back to the initialise from backup, but surely there's a way of managing such large tables through the snapshot so it doesn't use up all available RAM?

    An answer to that question would be much appreciated, even if it's just to satisfy my curiosity.

    Thank you for indulging me!

  • SQLPhil,

    Your first issue, as I see it, is the very low memory on the box. The biggest issue is that with replication, each of the agents running uses memory both from SQL and from Windows - each agent runs its own Windows session. In larger installations there are other issues where you would see this a lot more. In yours, I suggest increasing the memory on your distributor to 12GB. That will give you some room to grow.

    On my distributors I do not see a lot of memory increase from a snapshot. I do, though, see what translates into system paging. This is caused by the fact that the snapshot is writing the physical files on the distributor (most likely - I do not know your full configuration).

    You will see an increase in memory usage on the OS side as the snapshot agent starts. The snapshot is generated by the system using bcp to output the data to the files, and you will see an increase in network traffic between the publisher and the subscriber when this happens. This should not increase the SQL memory usage dramatically. But again, you are using a very limited amount of memory in both the OS and SQL.

    Ideally, though some people disagree with me, I try to give the OS 3GB of memory for itself so it's happy. I then work out what else is running on the box that will need memory and give that what it needs; the rest goes to SQL. The counter-argument is that SQL is a memory hog and should be given everything, but the problem with that is two-fold: the OS will balk in the form of paging, which is a problem even if the box is dedicated to SQL, and in the case of yours being a remote distributor, it wants more memory for desktop heap, which is OS memory. (Desktop heap is, at a high level, the memory used by each application that makes a connection and runs on the box - in this case each agent.)

    An example from a larger installation is mine, where the central distributor has 16 cores and 32GB of memory. Memory is set to 15GB for SQL and the rest is left for the OS. But in my case I have 28 packages with well over 600 subscribers; my distribution database is only 50GB and the publishing databases range from 300GB to 1TB. (There are internal adjustments which must be made to the OS for a system this large, but you don't need to worry about that at this point.)
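
    If it helps, the OS/SQL split I'm describing is just set through max server memory. Something along these lines on, say, a 12GB distributor (the 9GB figure is only an example - size it around whatever else has to run on the box):

    -- cap the instance so the OS and the replication agents keep roughly 3GB for themselves
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 9216;
    RECONFIGURE;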

    So again, the biggest issue you need to address is the limited amount of memory you have currently.

    Hope that I was able to be of some assistance.

    Steve Jimmo
    Sr DBA
    “If we ever forget that we are One Nation Under God, then we will be a Nation gone under." - Ronald Reagan

  • Try to increase Virtual Memory

  • Simha

    Try to increase Virtual Memory

    The problem with this is that paging would increase as items are swapped between RAM and virtual memory, wouldn't it?

    Steve Jimmo
    Sr DBA
    “If we ever forget that we are One Nation Under God, then we will be a Nation gone under." - Ronald Reagan

  • Thanks everyone for getting in touch - I really do appreciate it.

    Simha, unfortunately virtual memory isn't the issue. As the memory usage ramps up while the snapshot is taking place you can see the SQL instance page out, but the process keeps on going. The problem is that when the snapshot executable runs on the distributor it doesn't appear to be a process that uses much memory itself (if you get the chance, run a snapshot and look at memory usage in Task Manager - very little changes). But if you look at the Available Memory counter in perfmon you can see available memory slowly drain away. I'm putting this down to the memory mapped data the snapshot process uses when it bcp's the contents of the database to file - and this does not get paged.
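
    If anyone wants to watch the same counter without sitting in the perfmon GUI, something like this from the command line will log it while the snapshot runs (the output path is just an example):

    rem sample available memory and paging every 5 seconds to a CSV
    typeperf "\Memory\Available MBytes" "\Memory\Pages/sec" -si 5 -o C:\Temp\snapshot_mem.csv -f CSV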

    Steve, awesome response - thank you so much for taking the time to write all that out and draw upon your own experience, which is very helpful. As you say, it all comes down to the amount of RAM on the box, and I (desperately) need more! At least this side is being addressed and I will hopefully soon have a beefier box to run the distributor from, which will help.

    I guess what I'm looking for is some sort of configuration option which specifies the threshold at which memory mapped data is flushed to disk (set at a level that will ensure I don't encounter any paging or server crashes!). That's what I was hoping SQL 2008 SP3 CU1 was going to give me - but no joy it would seem.

    Not to worry, I have options going forward which is the important thing, and I've learned a lot more about replication as part of this (rather painful) process. Thanks again all for your help!

  • SQLPhil,

    Hang in there. It will eventually get better.

    Steve Jimmo
    Sr DBA
    “If we ever forget that we are One Nation Under God, then we will be a Nation gone under." - Ronald Reagan

  • Hi SQLPhil,

    I am facing a similar issue. I am configuring Merge Replication for a Database of Size 3GB.

    When I run the Snapshot Agent my available memory drops away. I've been unable to diagnose the issue for the past 2 days.

    DB size is 3.3GB

    Available memory is 13GB

    Please help me get to the bottom of it.

    Thanks,

    Vamsi

  • vamsi2.k (5/20/2013)


    Hi SQLPhil,

    I am facing a similar issue. I am configuring Merge Replication for a Database of Size 3GB.

    When I run the Snapshot Agent my available memory drops away. I've been unable to diagnose the issue for the past 2 days.

    DB size is 3.3GB

    Available memory is 13GB

    Please help me get to the bottom of it.

    Thanks,

    Vamsi

    Please start a new thread. This one is over 1 year old.

    Thanks!

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service
