
High Memory Usage – Windows Server 2008 R2 File Server

I recently reinstalled my file server on Windows Server 2008 R2 on VMware ESXi 4.1.1 and noticed something very strange and very alarming. A couple of hours after booting up, the server would use all of its physical memory, as shown in the screenshots below.

By design, some services in 2008 R2 like Lsass.exe and DFS Replication will cache memory for future use. Exchange has the same behavior but on a much bigger scale, as it can eat all available memory. This cache is not a problem because the cached memory can be released and given back to the OS or applications if need be, and it doesn’t create any performance issues. Note that this behavior is new to 2008 R2. In previous versions of Windows Server, you had to install the Dynamic Cache Service to throttle the cache growth, but it is no longer needed in the current version of Windows due to kernel improvements. I then found a very useful tool from Sysinternals called RamMap. This tool creates a map of the memory allocated by the operating system.

This is where I found the cause of the “memory leak”. As per TechNet: “Metafile is part of the system cache and consists of NTFS metadata. NTFS metadata includes the MFT as well as the other various NTFS metadata files (see How NTFS Works for more details, and of course Windows Internals is a great reference). In the MFT each file attribute record takes 1k and each file has at least one attribute record. Add to this the other NTFS metadata files and you can see why the Metafile category can grow quite large on servers with lots of files.” After adding 1 GB of RAM to my server, the Metafile stabilized at 1.5 GB of allocated memory.
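To put that TechNet figure in perspective, here is a quick back-of-the-envelope estimate of how large the Metafile can get for a given file count. The 1 KB-per-record size comes straight from the quote above; the 6-million-file count is just an example, and real Metafile usage will be higher because of the other NTFS metadata files:

```python
def estimate_metafile_bytes(file_count, record_size=1024):
    """Rough lower bound on the NTFS Metafile cache: at least one
    1 KB MFT file record per file. Other NTFS metadata files add
    to this, so treat the result as a floor, not a ceiling."""
    return file_count * record_size

# Example: a file server holding 6 million files
print(f"~{estimate_metafile_bytes(6_000_000) / 2**30:.1f} GiB of MFT records alone")
# prints: ~5.7 GiB of MFT records alone
```

That floor lines up with the numbers reported in the comments below, where servers holding millions of files see multi-gigabyte Metafile allocations.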

  1. Sammy
    April 13, 2011 at 11:34 am

    We’re experiencing the same issue and are working with Microsoft on it. We’re running 8 GB of memory, but it’s consuming 98% of that (Metafile). The only way to clear it is to recycle the server. So far they’re telling us that this is the way 2008 R2 x64 is designed.

    There are supposed to be fixes, but at this point 2008 R2 still has the issue.


  2. Speeder
    May 9, 2011 at 5:36 pm

    Do you have any updates on this? We’re experiencing the same issue (6 million files would mean we need 8 GB of RAM for this to be stable in our environment). Microsoft Premier suggested it was a memory ballooning bug (which it isn’t) and now wants us to run all kinds of log collection tools with multiple reboots, etc. Something we can’t do on a production server.

    • Samuel Robillard
      May 10, 2011 at 9:12 am

      As per MS, this behavior is by design. In 2003 and 2008 you had the Windows Dynamic Cache Service to manage the size of the cache, but it doesn’t work on 2008 R2. The algorithms are supposed to be updated to prevent this…

  3. The Diamond Z
    August 5, 2011 at 3:41 pm

    Does anyone know when the algorithms will be updated?
    It seems to happen mainly on servers with a lot of disk/file-intensive activity.

  4. laderio
    August 8, 2011 at 10:04 am

    We have a 2008 R2 2-node active/passive cluster here with 8 GB installed, using 7.4 GB. So far there isn’t any performance impact, so I think everything is fine.

    • laderio
      August 8, 2011 at 10:08 am

      We have 7 of 22 Services online with 1-5 million files per service.

  5. emenius77
    September 23, 2011 at 8:34 am

    Running into the same problem on multiple Windows Server 2008 R2 servers. It certainly appears that running any type of application on Win2008R2 which uses a large number of files is a bad idea. Applications I’ve seen problems with include Hyland OnBase, ArcGIS, and simple file servers. Buyer beware! I hope MS stops denying and fixes this problem soon. You can only put so much memory into one machine.

  6. Bill
    October 6, 2011 at 12:07 pm

    I have the same issue: 16 GB of memory soon becomes 15.5 GB of file cache. I have to reboot the server weekly to clear it up. It appears that once memory is allocated to the cache it is not released until a reboot. I am underwhelmed!

  7. Alysson
    January 13, 2012 at 10:22 am

    Hi all.
    The problem is fixed here; it has to do with MS not having implemented a Dynamic Cache Service for Windows 2008 R2. I recompiled it for Windows 2008 R2 and installed it as a service. With a few registry settings I can set a maximum amount of memory the Metafile may use, thus controlling maximum memory usage.
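    For anyone curious what such a service boils down to: the Dynamic Cache Service caps the system cache working set through the Win32 SetSystemFileCacheSize API. Below is a minimal sketch of that call from Python via ctypes; the 2 GiB cap is an arbitrary example value, it must run elevated on Windows with SeIncreaseQuotaPrivilege, and it is not the recompiled service mentioned above:

```python
import ctypes
import sys

FILE_CACHE_MAX_HARD_ENABLE = 0x1  # enforce the maximum as a hard limit

def cache_cap_bytes(gib):
    """Convert a cap expressed in GiB to the byte value the API expects."""
    return int(gib * 2**30)

def cap_system_file_cache(max_bytes):
    """Cap the system file cache (where Metafile lives) at max_bytes.
    Windows-only; requires an elevated token with SeIncreaseQuotaPrivilege."""
    if sys.platform != "win32":
        raise OSError("SetSystemFileCacheSize is a Windows-only API")
    ok = ctypes.windll.kernel32.SetSystemFileCacheSize(
        ctypes.c_size_t(0),          # no minimum enforced
        ctypes.c_size_t(max_bytes),  # hard maximum
        FILE_CACHE_MAX_HARD_ENABLE,
    )
    if not ok:
        raise ctypes.WinError()

# e.g. cap_system_file_cache(cache_cap_bytes(2))  # 2 GiB hard cap
```

    Note that DynCache itself re-evaluates the cap periodically against available memory; a static hard cap like this is only the simplest version of the idea.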

    • Walid
      January 21, 2012 at 1:04 pm

      Could you please share your binaries?

      • TontonLud
        February 10, 2012 at 2:47 am

        Yes please !

      • Dom
        February 23, 2012 at 8:37 am

        I’m experiencing the same problem – also interested in the binaries

      • February 29, 2012 at 4:38 pm

        were you able to resolve the issue?

    • Salman
      January 24, 2012 at 11:23 am

      Yeah, can you please share the updated code and binary? Would really appreciate it!

    • February 29, 2012 at 4:37 pm

      I’m seeing the same issue here and was hoping to implement your fix. How did you recompile the binaries for 2008 R2? I have installed one of the dyncache.exe builds as a service but haven’t started it yet. Please assist.

    • IB
      April 3, 2012 at 1:10 pm

      Any solutions? badding, do you have files you can share?

  10. Eugene
    April 24, 2012 at 8:56 am

    I have found that after changing the server to handle processor scheduling for Programs instead of Background services (right-click ‘Computer’, Properties, Advanced system settings, Advanced tab…), memory no longer gets locked and becomes standby for mapped files, which is good when you don’t want SQL to get bogged down by file shares on a SQL server. I need others to confirm. Microsoft still has not confirmed this issue, despite providing DynCache on one of our clients’ servers and not charging us for the call, which would typically indicate that the issue was Microsoft’s, not the customer’s.

    I am unable to locate any documentation that indicates what kind of performance difference running the server configured for Background services makes, but so far the fix is working for us and may be applied to any and all of our servers that are both sharing files and running SQL Server 2008 R2. SQL was the victim here, as it was unable to obtain more memory until the server was rebooted, and that was an unacceptable workaround. So far, since the change, memory allocated to “Mapped file” in RamMap is now in “Standby”, which means the OS can release it as needed. Even though the server is now using most of its 12 GB of RAM, 7 GB of that is available as “Standby”. Hope this helps others until Microsoft releases a better widespread fix.

    • laderio
      June 8, 2012 at 3:05 am

      Hello Eugene,
      nice hint, but I don’t think I can configure this on our production machines.
      Do you have any new information on this?

  11. Hugo Proost
    August 1, 2012 at 7:00 pm

    Have a look at: http://support.microsoft.com/kb/976618/en-us
    A download is available at: http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=9258
    For Windows 2008 R2, contact Microsoft Support as described in the first link (KB976618).

    • laderio
      September 12, 2012 at 10:11 am

      But in this Microsoft article Windows Server 2008 R2 is not mentioned.

    • laderio
      September 12, 2012 at 10:15 am

      Also interesting: the mail for this post is from 2012-09-12, but the post itself says 2012-08-01?

  12. TontonLud
    September 12, 2012 at 9:04 am

    Hugo Proost :
    Have a look at: http://support.microsoft.com/kb/976618/en-us
    A download is available at: http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=9258
    For Windows 2008 R2 contact Microsoft Support as described in the first link. KB976618

    N/A to W2008 R2…

    • Joe F.
      February 7, 2013 at 5:38 pm

      I had this same problem, and it now looks like this solution is applicable to W2008 R2.

  13. NDLunchbox
    March 12, 2013 at 12:05 pm

    I have this issue with a brand new Citrix Provisioning Server (which requires me to regularly copy 20+ GB files to and from it to keep it in sync). The strange thing is that it’s a small number of files (2 or 3 VHDs), but the same issue happens: I quickly jump up to using about 26 GB of RAM (out of 128 GB). The system becomes basically unusable; you can’t browse folders or reach shares on the server. I need to reboot the server to get it working, which usually requires me to go into iLO and hard-cycle it. Unlike most people here, I am getting this on a physical system.

    I’m trying to get the Dynamic Cache Service for 2008 R2 and am on with an MS agent now. She is setting up a case; since it is a documented issue, it appears there will not be a charge.

    • June 10, 2013 at 10:06 am

      @NDLunchbox, Did you get anywhere with the MS agent on 2008 R2? Having the same issue. Thanks.

