When I look at the "Performance" tab of my Task Manager, I see the following values:
Total: 4014
Cached: 1299
Available: 2697
Free: 1532
Hmm, what is the difference between Available and Free memory?
Shouldn't they be the same?
And what does "Cached" mean?
From my point of view there should just be a value for "occupied by running programs" and one for "free".
Peter
Available is the only one that matters. It shows how much memory can be used by programs without paging other, lower-priority processes out of memory. It is a combination of both Cached and Free.
Plus, the current memory handling architecture in Win 7 tries to front-load possibly needed libraries into memory to help speed things up. The Free part is basically the memory that doesn't hold any front-loaded material at all. For the most part, as Logicearth pointed out, Available is the one you should trust more, as the front-loaded libraries are readily discarded to make room for memory that is actually needed.
As others have said, Windows 7 will preload things into RAM when your computer isn't busy; this way, when you do need or want them, they launch much faster.
In your chart above, the Free (totally unused) plus the Cached (that which holds preloaded data) should just about equal the Available amount. Available will show just a bit less than these two summed together.
In any instance where you need more RAM than what is Free, Windows 7 will just use the cached memory and dump what is there.
If you have the memory, might as well use it. It does no good if it sits there totally unused all day long.
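As a quick sanity check, the arithmetic above can be run against the numbers from the first post (a rough sketch; Windows' internal bookkeeping accounts for the small difference):

```python
# Values (in MB) from the Task Manager readings in the first post.
total = 4014
cached = 1299
available = 2697
free = 1532

# Per the explanation above, Available is roughly Free plus Cached,
# with Available coming out just a bit less than the two summed.
print(free + cached)               # 2831
print(available <= free + cached)  # True, short by 134 MB
```

The small gap is likely memory that is cached but not immediately reusable, e.g. modified pages that still need to be written back to disk.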
Unfortunately there is a bug in Windows 7 related to NUMA (non-uniform memory architecture) that will cause issues on certain platforms when the amount of Free memory goes down. For instance, on my Thinkpad T410 with 8GB of RAM the Free memory will approach zero when I load a large VM. When I stop the VM and exit VMWare, the Free memory does not recover - the memory remains allocated by the cache.
The problem with NUMA is that it prefers to allocate memory from banks that are attached to a CPU core (hence non-uniform, i.e. not all memory is considered equal). Unfortunately, the Windows 7 NUMA bug prevents memory from being allocated that is in the Available pool in this scenario. Since I have no Free memory, the machine begins to swap madly when I restart the VM (the same or another one does not matter) and freezes up for minutes - even though there are over 4GB "Available".
So, despite what the others have said, the amount of "Free" memory is more important than what's in the "Available" pool.
There is a hotfix for this issue:
Poor performance occurs on a computer that has NUMA-based processors and that is running Windows Server 2008 R2 or Windows 7 if a thread requests lots of memory that is within the first 4 GB of memory
Look in Resource Monitor > Memory tab. That gives you a better picture; the numbers in Task Manager can be confusing. Everything you see in blue is up for grabs by more programs/processes/data. The orange is temporary and still needs to be written back to disk, the green is your running processes/data, and the grey is hardware reserved (e.g. for graphics).
When I look at the "Performance" tab of my Task Manager, I see the following values:
Total: 3987
Cached: 2305
Available: 2652
Free: 395
Are these values normal?
Looks normal to me. You must have been running a lot of programs since the system was booted; that's why 'Cached' is relatively high - but that's OK. You might as well make use of all the RAM; 100% usage of the RAM is the best case.
The "cached" value just means "RAM with stuff in it that can be freed if needed"; your numbers look fine.
Guys... I believe I have a problem:
Total: 4024
Cached: 1721
Available: 1664
Free: 1
Recently I have been having "MEMORY_MANAGEMENT" blue screens while gaming or using many tabs in Firefox... is this the reason? I always seem to have about 0-160 in the Free category, and performance has become sluggish lately. Should I buy new RAM, or should I just flat-out replace what I have? Like I said, I have been having MEMORY_MANAGEMENT BSODs and I think this is the reason. I am thinking about buying another 4 GB of RAM for my system. Good idea? Please help.
P.S. I am running 1333 MHz RAM, btw.
Where are you getting these numbers from? Look in Resource Monitor > Memory tab. The colored bar will give you the full story.
I was looking at Task Manager. But even in Resource Monitor it shows limited free RAM.
But wait, that doesn't matter, right? Only "Available" matters, right?
The 'Standby' and 'Free' in Resource Monitor are both 'available'. The standby is RAM where previously run programs are cached. The OS fetches the program from there in case you use it again. That is the fastest access possible. But that RAM will be used if a new program needs the space.
So you have to add the 2 blue areas together to obtain available RAM.
The reason is not a lack of RAM. Running out of RAM does not cause a blue screen. Instead it could be an error in the system, for example bad memory.
For monitoring system resources, I find Resource Monitor better than Task Manager. Click Start > type Resource Monitor and there it is. It may make things a little easier to understand.
Memory management in Windows is complex and follows principles that are not obvious and may at times seem to make no sense. But there is method in the apparent madness.
Free memory: This is memory that contains no useful data. It is expensive fast memory just sitting there, consuming power and providing nothing in return. You don't want it and you don't need it. The only good thing about it is that it can immediately be taken out of unemployment and set to work. The ideal would be for free memory to be zero at all times but we aren't there yet.
Available memory: This is the sum of memory on the free list (bad) and the standby list (good). Memory on the standby list can also be immediately put to work for any application. But what sets it apart from free memory is that it contains useful data and code, it just hasn't been recently used. This memory serves a dual role. As mentioned, it can be given to any application. In addition, as it still contains the original code or data, it can be given back to the original application it belonged to. In the Resource Monitor of Vista and later you can see the values for free and standby memory. The value for standby memory should be high and free as small as possible. Standby memory also exists in XP but it takes some sophisticated tools to see its value.
Cache memory: This is the sum of the file cache and the standby list. The file cache contains a copy of file data that has been recently accessed. If it is needed again, as often happens, it can be read more quickly from RAM than from disk. The file cache and the standby memory are both a form of cache, so it makes sense to show them together. A high value is a good thing. If an application needs more memory, the size of the file cache will be trimmed if necessary.
Please note that the above is a highly simplified description of what is really a very complex process. The system memory manager will always try to assign memory where it will do the most good, and keep the unemployed free memory as low as possible.
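The list-based bookkeeping described above can be sketched in a few lines of Python. The free and available figures below are taken from the second poster's readings in this thread; the standby and file-cache split is back-solved from them and is therefore illustrative, not measured:

```python
# Hypothetical page-list sizes in MB. Free (395) and the derived
# standby figure are consistent with the second poster's readings.
free_list = 395      # "unemployed" pages holding no useful data
standby_list = 2257  # cached code/data, immediately reusable by anyone
file_cache = 48      # recently accessed file data kept in RAM

# Task Manager's counters, per the description above:
available = free_list + standby_list  # free list + standby list
cached = file_cache + standby_list    # file cache + standby list

print(available)  # 2652, matching the reported Available
print(cached)     # 2305, matching the reported Cached
```

This also shows why Available and Cached overlap: both counters include the standby list, so they cannot simply be added together.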
I don't know if you're still there, 2 years after this post...
I have a Thinkpad T410 with Win7 64-bit, 5GB RAM, and the same issue with virtual machines, both VirtualBox and VMWare Player: lots of available memory, but VMs not starting, with diagnostics reporting not enough memory.
3GB available memory and VM configured for 512MB RAM.
All was perfectly working on the same PC when it was WinXP with even less RAM (3GB).
Is there any news from this front?
I have a corporate PC, so I cannot install hotfixes.
Has Win7 really not resolved this kind of issue after 2 years?
Hope to find some help; I really need to run VMs.
Thanks in advance
ViSco
This problem has driven me crazy ever since I have installed Windows.
Quote:
Well, I have Windows 7 64-bit and 4GB of RAM (512MB being used by the on-board Radeon) and often have Free memory around 15-20MB, sometimes single digits. What I can't figure out (if anyone can shed some light on this) is why, with anywhere from about 900MB to over 1GB available, I am getting constant 'Out of memory' warnings while running Google Chrome, and also the old 'Low memory' warning telling me I should close 'the program' - it usually lists 'Google Chrome' or 'Desktop Gadgets' or some other offending, memory-hogging program. I don't see why it would warn me that I am running out of memory, considering I still have close to 30% of the memory available.
Posting a screenshot of the Task Manager Performance tab when the problem occurs would be a big help.
I've seen in other forums that this is about the release of "free" memory in Win7; Google Chrome seems to be on the blacklist of applications that don't play fair and don't release this kind of memory as they should.
Open Chrome as normal, then press CTRL+SHIFT+ESC to open the Chrome Task Manager. Now examine the memory being used to see if the problem lies with Chrome.
https://support.google.com/chrome/answer/95672?hl=en
Thanks to all who replied
LMILLER7 - I'll have to reproduce the problem and then post a screenshot of the Task Manager for you, as I have been keeping programs closed and trying to keep open browser tabs to a minimum so that it DOES NOT occur.
Golden - CTRL+SHIFT+ESC did not open any Chrome task manager
To all - I think, if I remember correctly, that I had the same problem when I was running 64-bit Waterfox exclusively for browsing the web. It wouldn't be hard to believe that Chrome doesn't play well with releasing memory. But if Windows Task Manager says there is still around 1GB of memory available, then I shouldn't be getting 'Out of memory' warnings: from what I understand, 'Available' memory is as ready to use as 'Free' memory, and most forums state that since Windows 7 preloads frequently used programs for faster launching, your Free memory should always be in the single or at most double digits.
1. Click the Chrome settings icon
2. Now navigate to Tools > Task manager
WOW... CTRL+SHIFT+ESC brought up Windows Task Manager after I rebooted, but SHIFT+ESC brought up the Google TM. I didn't know that Google had this feature; it is very helpful. Thanks, I will have to look at this closer.
What I find interesting is that I had Chrome close several times because of low memory. When closed, I had in excess of 2GB available. After opening, it would use almost all of that and leave around 150-200MB available. After rebooting and opening Chrome to the last state, with the same 120 tabs that were open before, adding up what is in the Google Task Manager I find it is only using 309MB, and the system still has 1658MB available. It appears that as the system stays up over many days, there is some kind of memory leak, or Chrome isn't releasing memory it doesn't need anymore.
Now my question is, what use is Google Chrome? For its first year or two it offered faster browsing with its smaller code and footprint. But just today I opened IE 9 and it opened web pages much faster than Chrome. So now, with Google's mega-conglomerate advertising super-machine and the changes to their AdWords and such (which is probably what is slowing it down), I see it as a has-been, and I think I am going back to Waterfox as my browser of choice.
Sorry, I don't have these problems with Chrome... it's way faster than IE for me.
The only thing I can suggest is to examine the services and programs in your startup.
Hi there... Part of your problem is that you have far too many tabs open. I cannot imagine why you would need that many open. Cut down on the tabs and it will use less memory. And don't leave your computer on for days on end.
However, the Startup tab under msconfig has nothing in it that I didn't put there.
Services only has one entry that I don't recognize, which is "QosServM.exe", but upon research I found that this executable is usually linked with the Avaya IP softphone, whose software I did install, so that doesn't appear to be out of the ordinary, and I can't find anything else in Services that is out of the ordinary.
Maxie, are you serious? I do not agree with this; a person should be able to do with a computer anything that a computer is capable of doing, without resigning himself to a mode of limited use. In the days when I was running Windows XP and we didn't even have tabbed browsing (which is supposed to be easier on system resources), I could routinely have 200 browser WINDOWS open without having memory issues, and that was all with 384MB of RAM. And the reason people have more than the 2-3 browser tabs you apparently allow is that it's a browser capability: when one is doing research and constantly referring back and forth, it saves having to open and close tabs 150 times. Shutting down a computer every day? This is also not necessary. I have worked in corporate environments where the PCs rarely get shut down and are on 24/7. I have had my own computer on for 9 months without being shut down, with no slowing down or other issues. Of course I am always on the machine, but the only reason a PC would need to be shut down is something requiring a reboot, or to reset poorly designed software that has issues and needs the system reset. But then that isn't the PC's fault, it's the software designer's.
On my Lenovo T420i, Task Manager shows:
Total: 3979, Cached: 2156, Available: 2105, Free: 1 (MB)
Everything seems to slow right down if I try to print something from Adobe Reader or Word.
If I run Sysinternals Process Explorer, I see the page faults jump from under 500 to about 27,000, so I'm guessing that if I add another 4GB of memory I will create a lot more free memory and reduce the amount of page faults/swapping, which I presume is at the root of the slowdowns.
Welcome to the Seven Forums.
Have you applied this hotfix?
http://support.microsoft.com/kb/2155311/en-us
edit: see this post for more info.
coghlan
Did you have this problem before Windows Enterprise was installed?
Where did you get Enterprise from?
I don't see this hotfix under installed updates, but it's not clear if my processor (i3-2350M) is NUMA-based and therefore affected.
Is it likely that my machine suffers from this problem regardless?
I only ran Win XP for a short while after I received this laptop, so I can't be sure of the genesis of this problem. All I know is that, when I start up a print job or a new app, it's a pig, with 'Not Responding' showing up in various title bars.
The reason I asked, coghlan, is your specs:
OS Windows 7 Enterprise 64
Have you tried to get help from your desktop support people?
With a corporate or government agency, it's best to use your I.T. department for such repairs. They normally don't like or want others messing with their computers and the way they are set up.
Of course, more RAM can reduce battery life :-(
Having a significant amount of free memory is neither necessary nor desirable. The OS was designed to operate with little or no free memory, this in fact being the optimum situation. The important number is available memory and this seems to be quite adequate. Having a large number of page faults in itself is not a bad thing as most are likely to be soft faults which require no disk access. Based on the information provided there seems little reason to believe the problem is memory related.
Here's a screenshot of over 700,000 page faults/sec. However, I notice that the peak commit charge (apparently RAM + paging file used) is 2.4GB, so I presume nothing is being paged to disk.
Is it safe to assume that if peak is below the amount of physical RAM, adding more memory isn't going to help prevent the extreme sluggishness I see at various times during the day?
The idea that if peak commit charge is less than RAM size then no paging will occur is one that is often seen. But the concept is based on a flawed understanding of what commit charge means and has no real validity. Commit charge is NOT a measure of RAM usage, pagefile usage, or any combination of the two. There really is no direct relationship between commit charge and performance. It is possible to have a high commit charge with good performance. It is also possible to have a low commit charge and poor performance. The commit charge is important but all you really need to know is to ensure that the commit peak is well below the commit limit. A high commit charge could also indicate a memory leak but there is no evidence of that here. Understanding the commit charge is complex and I won't attempt to explain it here.
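To make the rule concrete, here is a minimal sketch of the commit-limit check with invented numbers (a simplification; real Windows can also grow the pagefile, which raises the limit):

```python
# All values in MB, invented for illustration.
physical_ram = 4024
pagefile_max = 4096

# The commit limit is roughly RAM plus the maximum pagefile size.
commit_limit = physical_ram + pagefile_max  # 8120

def commit_ok(current_commit, request):
    """An allocation succeeds only if it keeps the commit charge
    under the limit - how much RAM is free is irrelevant."""
    return current_commit + request <= commit_limit

print(commit_ok(7900, 100))  # True: 8000 <= 8120
print(commit_ok(7900, 500))  # False: 8400 > 8120
```

This is why an "Out of memory" error can appear while Task Manager still shows plenty of Available RAM: the commit charge hit the limit, not the physical memory.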
The number of page faults is very high but clearly they are of the soft variety and do not involve disk access.
This is something that really needs to be looked into by the IT staff.
I think I'm on my own w.r.t. this issue. We don't get much more than component swapping from our desktop IT people.
I was recently given a copy of this Process Explorer tool, so when I start seeing performance degrade I will try to get a better idea of what is happening. I know that CPU usage is not excessive (<50%), and things tend to go south when I start a new activity (printing a PDF, opening Word, etc.), so it's somehow related to things being loaded into RAM.
...mcshield.exe?
165,000,000 (yes, million) page faults today.
Can this thing be locked in memory somewhere???
As I indicated in another post, mcshield.exe seems to generate a HUGE number of (soft) page faults, but it seems that the counts for page faults for this service are also way down. I'll have a better idea after I've had the chance to use my computer for a full day, but first impressions are very positive.
LMiller7
So like I said... when I get 'Out of memory' warnings I have about 1GB of 'AVAILABLE' memory. I was not talking about Free memory. So, as I understand from what you're saying, I should not be getting these warnings in this instance. Correct?
I really dislike that "Out of memory" error. It seems to almost go out of its way to cause confusion. I know this is counterintuitive, but such errors are rarely due to a lack of physical memory (RAM). With 1GB of RAM available, that would seem very unlikely. The amount of "Free" memory is largely irrelevant. These errors are usually caused by the commit charge hitting the commit limit, or by exhaustion of the process's private virtual address space.
Well, I have to agree that everything fits into 4GB; however, after the addition of 4GB more RAM, my Win 7 Enterprise machine is noticeably faster. My initial guess is that there is a lot less virtual memory management (moving pages in/out of address space) happening with the extra RAM.
How much RAM did you have before adding the 4GB, or how much total? I had 4GB and added 2GB more, totaling 6GB, and it didn't make a difference in my 'out of memory' alerts. At least no difference that I could quantify.
Here is what I found from Wikipedia:
As of 2011, ccNUMA systems are multiprocessor systems based on the AMD Opteron processor, which can be implemented without external logic, and the Intel Itanium processor, which requires the chipset to support NUMA. Examples of ccNUMA-enabled chipsets are the SGI Shub (Super hub), the Intel E8870, the HP sx2000 (used in the Integrity and Superdome servers), and those found in NEC Itanium-based systems. Earlier ccNUMA systems such as those from Silicon Graphics were based on MIPS processors and the DEC Alpha 21364 (EV7) processor.
I really dislike that "Out of memory" error. It seems to almost go out of its way to cause confusion. I know this is counterintuitive, but such errors are rarely due to a lack of physical memory (RAM). With 1 GB of RAM available that would seem very unlikely. The amount of "Free" memory is largely irrelevant. These errors are usually caused by the commit charge hitting the commit limit, or by exhaustion of the process's private virtual address space.
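To illustrate what I mean about the commit limit, here's a little Python sketch. This is my own toy model, not anything Windows actually runs, and the numbers are made up; the point is just that Windows refuses an allocation when the commit charge (memory it has promised to processes) would exceed the commit limit (roughly RAM plus page file), no matter how much physical RAM is still free.

```python
# Toy model (hypothetical, not a real Windows API): why "Out of memory"
# can fire even with plenty of free RAM. An allocation is refused when
# commit charge + request would exceed the commit limit (RAM + page file),
# regardless of how much physical RAM is currently free.

def try_commit(request_mb, commit_charge_mb, commit_limit_mb):
    """Return True if the allocation can be committed."""
    if commit_charge_mb + request_mb > commit_limit_mb:
        return False  # -> "Out of memory", even if free RAM remains
    return True

# Say 4 GB RAM + 2 GB page file = 6144 MB commit limit, with 5632 MB
# already committed. A 1 GB request fails even if 1 GB of RAM is "Free".
print(try_commit(1024, 5632, 6144))  # False
print(try_commit(256, 5632, 6144))   # True
```

That's why enlarging the page file (raising the commit limit) can make these errors go away even though the amount of installed RAM hasn't changed.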
Unfortunately there is a bug in Windows 7 related to NUMA (non-uniform memory access) that will cause issues on certain platforms when the amount of Free memory goes down. For instance, on my Thinkpad T410 with 8 GB of RAM, the Free memory will approach zero when I load a large VM. When I stop the VM and exit VMware, the Free memory does not recover; the memory remains allocated to the cache.
The problem with NUMA is that it prefers to allocate memory from banks that are attached to a CPU core (hence non-uniform, i.e. not all memory is considered equal). Unfortunately, the Windows 7 NUMA bug prevents memory from being allocated that is in the Available pool in this scenario. Since I have no Free memory, the machine begins to swap madly when I restart the VM (the same or another one does not matter) and freezes up for minutes - even though there are over 4GB "Available".
So, despite of what the others have said, the amount of "Free" memory is more important than what's in the "Available" pool.
There is a hotfix for this issue:
Poor performance occurs on a computer that has NUMA-based processors and that is running Windows Server 2008 R2 or Windows 7 if a thread requests lots of memory that is within the first 4 GB of memory
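To make the node-local preference concrete, here's a conceptual sketch in Python. This is hypothetical, nothing like the actual kernel code; it just shows the idea that a NUMA-aware allocator prefers the memory bank local to the requesting CPU's node, and that the bug described above behaves as if the fallback to other nodes' Available memory were skipped, so the system hits the page file instead:

```python
# Conceptual sketch (hypothetical, not Windows kernel code) of NUMA-aware
# allocation. The allocator prefers pages local to the requesting node;
# with the fallback disabled (modeling the Windows 7 bug), it pages to
# disk even though another node still has free RAM.

def numa_allocate(pages_mb, node, free_per_node, allow_fallback=True):
    """Return the node the pages came from, or 'pagefile' if none fit."""
    if free_per_node[node] >= pages_mb:
        free_per_node[node] -= pages_mb
        return node                       # fast path: node-local memory
    if allow_fallback:
        for other, free in free_per_node.items():
            if other != node and free >= pages_mb:
                free_per_node[other] -= pages_mb
                return other              # slower: remote node, still RAM
    return "pagefile"                     # buggy path: swap despite free RAM

nodes = {0: 0, 1: 4096}                  # node 0 exhausted, node 1 has 4 GB
print(numa_allocate(1024, 0, dict(nodes)))                        # 1
print(numa_allocate(1024, 0, dict(nodes), allow_fallback=False))  # 'pagefile'
```

In the buggy case, Task Manager can still show gigabytes "Available" (on the other node) while the machine swaps madly, which matches what I see with the VM.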
I'm not sure which CPU chip sets are NUMA based.
I know what you mean about MS info not helping much...
...but in this case, I'm not much help either :-(
As for me, myself and I, I think my problem started when I changed my VM setting from 'Windows Managed' to a set number, like I used to do with XP, so that it wouldn't expand the page file. That in itself should not have been a problem, but I find these days that there are far too many memory leaks and memory-leaking applications. After I started using MS Voice Rec software, I found that after one boot it was using a massive 10-12 GB of VM. I only had a total VM of 18 GB, so with the RAM I was using that put my usage at 15 GB. I guess you either have to close and reopen memory-hogging apps, or set your VM to 'System Managed', let apps tell it they need more memory, and have it grow your page file to an astronomical size.
Because I'm simple I will keep it simple.
Windows 7 is not XP.
You do not need to do all those little dings, dongs, tweaks, etc. in Windows 7.
If you let Windows 7 control your memory, and have enough memory, you won't have these things come up.
Don't try to outsmart Windows 7; learn how to use Windows 7.
I have 8 GB of RAM, and I have about 2 GB free. This is a value that can be set, right? Since my memory usage rarely exceeds 30%, should I raise that free value? What would that do? Or should I lower it? I'm not sure what the parameter is for.
Quote:
I have an 8gb of RAM, and I have about 2 gb free. This is a value that can be set, right?
The idea that a large amount of free memory is beneficial is hopelessly outdated. All modern operating systems try to find some use for as much memory as possible, even if it is only of trivial value. Unused memory is wasted memory. The ideal would be zero free memory at all times. Unfortunately we are not there yet.
This is not some new idea but has been in use in computers for many years, the basic principles dating back to the 1950s. All Microsoft operating systems have followed these principles for more than 20 years. It would have been longer, but the early hardware was too primitive to support such a sophisticated operating system. Linux and Mac OS follow similar principles.
You do not have direct control over the amount of free memory. There are misconceived programs that can increase it but they are harmful and should be avoided.
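If it helps, here's a toy model in Python of how the counters relate. This is my own simplification, not Task Manager's real internals (the real counters overlap in subtler ways), but it captures the key idea: "Available" is roughly Free plus Standby (cached), and an allocation consumes truly free pages first, then repurposes cached ones.

```python
# Toy model (a simplification, not Task Manager's real accounting):
# Available = Free + Standby (cached); allocations consume Free pages
# first, then repurpose Standby pages by dropping their cached contents.

class MemoryCounters:
    def __init__(self, free_mb, standby_mb, in_use_mb):
        self.free = free_mb        # zeroed pages, completely unused
        self.standby = standby_mb  # cached file data, reclaimable instantly
        self.in_use = in_use_mb    # working sets of running programs

    @property
    def available(self):
        return self.free + self.standby

    def allocate(self, mb):
        if mb > self.available:
            raise MemoryError("not enough available memory")
        from_free = min(mb, self.free)
        self.free -= from_free
        self.standby -= mb - from_free   # drop cached data to make room
        self.in_use += mb

# Rough numbers from the original post: Total 4014, Cached 1299, Free 1532.
m = MemoryCounters(free_mb=1532, standby_mb=1299, in_use_mb=1183)
print(m.available)        # 2831 in this toy model (Task Manager showed
                          # 2697; the real counters don't sum so neatly)
m.allocate(2000)          # uses all 1532 MB free, then 468 MB of cache
print(m.free, m.standby)  # 0 831
```

The point of the model is that the cached portion costs a program nothing to claim, which is why "Available" rather than "Free" is the number worth watching.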
Thanks, all. I'm not doing anything.
My related question is how can I tell from the task manager if I really would profit from doing an install of win 7 64bit and adding more ram. Because I don't really fancy doing a reinstall of windows if I can avoid it.
Thanks
Unless you are crowding that number down toward zero, more memory wouldn't help. If it reads "1000" available and you add 4 GB more RAM, it would read about "5000" under the same workload. Instead of having 1000 MB of unused RAM, you'd have 5000 MB unused. No benefit.
It's interesting. I started replying and had 251 available. I thought ok I'll add excel and maxview to reproduce a typical heavier load, and actually available went UP a touch to 280 (30 free).
Either way, with <10% available it sounds as if more RAM would be good, do you agree?
Thanks
The Windows memory manager always maintains control over memory management, always with the goal of maximizing overall system performance. It will always try to maintain what it considers a reasonable amount of available memory under the current circumstances. This is all very complex, so I will not go into the details. It is desirable that Windows does not have to work too hard to accomplish this. I would like to see a minimum of about 40% memory available; this is just a very rough guide. The lower this value becomes, the harder the memory manager has had to work. At 10% available and lower, it has probably had to resort to some of the more drastic methods. At that level you really do need more memory.
It would help if we knew how much memory you have. Understand that with a 32-bit OS, 4 GB is the maximum.
I currently have 4 GB fitted, of which Win7 32-bit uses 3.
task manager performance screenshot attached
You're crowding it pretty hard.
Go to the Resource Monitor from its button in Task Manager and post a screen shot of the memory tab showing the colored bars at the bottom of the window. You may have a considerable amount of memory devoted to hardware reserved.
You might be a good candidate for an upgrade to 64-bit and 8 GB of RAM, but it's hard to say whether you would necessarily notice the difference. Is anything annoyingly slow with just 4 GB?
herewith resource monitor
Is it slow? Not so bad today; I'm not beset with slow-running scripts today, don't know why. I'm using this machine as an example to educate myself (and maybe others); there are a couple of laptops, Tosh R600s with 3 GB fitted, which are often worse. I don't notice huge paging delays like you used to get years ago with XP/2K, but I think that's at least partly because Win 7 memory management is more sophisticated (and of course because everyone has at least 3 GB fitted).