
Task Manager shows my total memory usage at 90% of my 6 GB total, but no single process is using more than 250 MB RAM, and the sum of RAM use of all running processes is less than 2 GB. I've tried:

  • Looking at the numbers in the "Memory" column on the "Processes" tab of Windows 8 Task Manager.
  • Looking at the "Working Set", "Private Working Set", "Shared Working Set", and "Commit Size" columns on the "Details" tab of Task Manager.
  • Looking at similar memory-related columns in Process Explorer.
  • I've tried running Sysinternals RAMMap, but while I'm having the low-memory crisis, it crashes at launch. Once I resolve the problem, RAMMap runs normally, but at that point it's too late.

All show a pretty small amount of memory being used.

There are lots of people asking variants of this question, with various versions of Windows, all over the Internet. Some of them manage to solve their low-memory problems, often by re-installing software; sometimes by re-installing Windows from scratch. I'm looking for an answer to the general questions that these all share, and that never seem to get answered elsewhere:

  1. Why is total used memory much higher than the memory used by all listed processes, no matter how I try to count them?
  2. How could Windows "know" that memory is used without knowing what program is using it?
  3. What processes might possibly use up memory but not show up on the list?
  4. Is there any software out there that can give more information about used memory?

Details specific to my own problem: Since upgrading to Windows 8.1, the problem occurs as soon as I log in, and I run out of memory as soon as I run any program. I noticed in Process Explorer that several instances of iexplore.exe were running, apparently started automatically. One particular instance was only using a few MB of RAM, but showed hundreds of millions of page faults. On a whim, I killed that specific process, and memory usage immediately dropped by 70%.

Leading to one specific question:

  • How could killing one process that supposedly only used a few MB free up several GB?

And a (presumably hard) bonus question:

  • Short of re-installing Windows, how might I avoid having to go through this every time I reboot my computer?

1 Answer


By the way, you should try not to use the term "memory". It creates a lot of confusion. If you mean physical memory, say "physical memory", or "RAM". If you mean virtual memory, say so. If you mean backing store, say so.

Why is total used memory much higher than the memory used by all listed processes, no matter how I try to count them?

Because the operating system doesn't waste physical memory (RAM) unless it has no choice.

How could windows "know" that memory is used without knowing what program is using it?

Because no program is using it. Consider, for example, memory that contains the code for a program that just terminated. No program is using it. But that memory is used, since it is not free and contains data that might be useful (in case the program runs again).
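You can see this directly in the system-wide counters. The sketch below is my own addition (not part of the original answer): it calls the Win32 GetPerformanceInfo function to print the system cache size next to total and available physical memory. The system cache (the system working set plus the standby list) is RAM holding data that no running process is charged for, so it never shows up in any per-process column, yet it is not free memory either.

    /*
     * Sketch: print system-wide memory counters via GetPerformanceInfo.
     * SystemCache (system working set plus standby list) is RAM holding data
     * that no running process is charged for: "used", but owned by nobody.
     * Build example (MinGW): gcc meminfo.c -o meminfo -lpsapi
     */
    #include <windows.h>
    #include <psapi.h>
    #include <stdio.h>

    int main(void)
    {
        PERFORMANCE_INFORMATION pi;
        pi.cb = sizeof(pi);
        if (!GetPerformanceInfo(&pi, sizeof(pi))) {
            fprintf(stderr, "GetPerformanceInfo failed: %lu\n", GetLastError());
            return 1;
        }
        unsigned long long page = pi.PageSize;   /* counters are in pages */
        printf("Physical total     : %llu MB\n", pi.PhysicalTotal     * page >> 20);
        printf("Physical available : %llu MB\n", pi.PhysicalAvailable * page >> 20);
        printf("System cache       : %llu MB\n", pi.SystemCache       * page >> 20);
        printf("Commit charge      : %llu MB\n", pi.CommitTotal       * page >> 20);
        printf("Commit limit       : %llu MB\n", pi.CommitLimit       * page >> 20);
        return 0;
    }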

What processes might possibly use up memory but not show up on the list?

It's not used by processes.

Is there any software out there that can give more information about used memory?

RAMMap can do this.

There are only two possibilities: RAM can be used or it can be wasted. Obviously, the first is better. RAM that sits free is wasted forever; a 4 GB machine can't leave 2 GB unused today in order to have 6 GB available tomorrow. If you're thinking "I want it free now so I can use it later", forget that. You can use it now and use it later.

How could killing one process that supposedly only used a few MB free up several GB?

You are running low on backing store, not physical memory. You have plenty of free physical memory but insufficient backing store for the OS to keep allocating virtual memory that might require backing.

The process was only using a few MB of physical memory, but the OS might have had to commit several GB of backing store for it. For example, suppose a process creates a writable, private memory mapping of a 2 GB file. The OS must commit 2 GB of backing store for that mapping, because the process might write to every single byte of it; yet it might never write to any of them, in which case almost no physical memory is ever needed. This is why you need a good-sized paging file.
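To make the distinction concrete, here is a small illustration of my own (not from the original answer, and assuming a 64-bit build): it commits a large block of virtual memory with VirtualAlloc but never writes to it. The system's commit charge jumps by the full amount immediately while the process's working set stays tiny, and if the commit limit has already been reached, the allocation simply fails.

    /*
     * Sketch: commit 2 GB of virtual memory without ever touching it.
     * The commit charge (backing store promised) rises by ~2 GB at once,
     * but almost no physical memory is used until the pages are written.
     */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        const SIZE_T size = (SIZE_T)2 << 30;   /* 2 GB */

        void *p = VirtualAlloc(NULL, size, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
        if (p == NULL) {
            /* This is what "out of memory" looks like when the commit limit
               is exhausted, even with plenty of free RAM. */
            fprintf(stderr, "VirtualAlloc failed: %lu\n", GetLastError());
            return 1;
        }

        printf("Committed 2 GB at %p; the working set is still tiny,\n", p);
        printf("but the system commit charge just grew by ~2 GB.\n");
        printf("Press Enter to release it...\n");
        getchar();                              /* pause to inspect Task Manager */

        VirtualFree(p, 0, MEM_RELEASE);
        return 0;
    }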

Modern operating systems write lots of checks (commit backing store) that will never be cashed (the memory is never actually written, so it never needs RAM or paging-file space). But even with plenty of money in the bank (free RAM), the OS can't keep writing checks once the checks already outstanding add up to everything it has (the commit limit is reached). Paging files add backing store, raising that limit and allowing the OS to keep writing checks.
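The figures behind this analogy are easy to read on your own machine. The sketch below (again my addition, not from the original answer) uses GlobalMemoryStatusEx to print free physical memory next to the remaining commit; when the second number approaches zero, allocations start failing no matter how large the first one is, which matches the symptoms described in the question.

    /*
     * Sketch: free RAM ("money in the bank") vs. remaining commit
     * ("checks that can still be written"), via GlobalMemoryStatusEx.
     */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        MEMORYSTATUSEX ms;
        ms.dwLength = sizeof(ms);
        if (!GlobalMemoryStatusEx(&ms)) {
            fprintf(stderr, "GlobalMemoryStatusEx failed: %lu\n", GetLastError());
            return 1;
        }
        printf("Memory load          : %lu%% of RAM in use\n", ms.dwMemoryLoad);
        printf("Free physical memory : %llu MB\n",
               (unsigned long long)(ms.ullAvailPhys >> 20));
        printf("Remaining commit     : %llu MB (of a %llu MB limit)\n",
               (unsigned long long)(ms.ullAvailPageFile >> 20),
               (unsigned long long)(ms.ullTotalPageFile >> 20));
        return 0;
    }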