Memory Usage Dialog
Revision as of 00:45, 15 July 2009
The dialog described here is new to Analytica 4.2.
The size of a computation possible within Analytica is limited by the amount of memory required to complete it, relative to the maximum amount of memory available. The Memory Utilization pane in the expanded view of the Memory usage dialog reflects this information.
- In use
- Amount of memory currently being used by Analytica (this process), including RAM and Virtual Memory.
- Max available
- Maximum amount of memory available to Analytica.
- In Analytica 32-bit, like any 32-bit application, this maximum is usually controlled by the 32-bit address space: it is either 2 GB, 3 GB or 4 GB, depending on which operating system you are using, and whether you have configured your boot.ini file with the /3GB flag to allow 32-bit processes to utilize more than 2 GB of memory. If you have less than that amount of physical RAM, Analytica can supplement it with virtual memory on the hard disk, but the total available is still limited by the address space.
- For Analytica 64-bit, the maximum memory available is the total virtual memory allocated less the memory used by other processes, including the Windows Operating System. When the total system virtual memory reaches capacity, Windows will usually attempt to allocate more space on disk for the page file (depending on system settings). This means that your model will not necessarily run out of memory when it reaches this maximum -- instead, all applications on Windows will appear to hang for a while (usually 1 to 2 minutes) while it expands the page file. When it comes back to life, the maximum available will be larger. System administrators can adjust the initial and maximum page file sizes in the Control Panel → System → Advanced → Performance Settings → Advanced → Virtual Memory → Change.
- Total System Virtual
- The amount of virtual memory available given the allocated page file size. This memory is shared among all processes running on your computer.
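The two limits above can be sketched as simple arithmetic. This is an illustrative sketch, not Analytica's actual accounting: the helper names and GB figures are hypothetical assumptions.

```python
GB = 2**30

def max_available_32bit(address_space_gb=2):
    """32-bit process: capped by the address space (2, 3, or 4 GB),
    regardless of how much physical RAM or page file exists."""
    return address_space_gb * GB

def max_available_64bit(total_virtual_bytes, other_processes_bytes):
    """64-bit process: total system virtual memory minus what other
    processes (including the Windows OS itself) are using."""
    return total_virtual_bytes - other_processes_bytes

# 32-bit with the /3GB boot flag:
print(max_available_32bit(3) / GB)                  # 3.0
# 64-bit: 24 GB total virtual memory, 6 GB used by other processes:
print(max_available_64bit(24 * GB, 6 * GB) / GB)    # 18.0
```

Note that in the 64-bit case the figure can grow during a run, since Windows may expand the page file on demand as described above.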
Like any Windows application, Analytica can use virtual memory -- meaning that, if it runs out of RAM (random access memory chips) it may supplement it with page file memory, usually on the hard disk. RAM is typically 1,000 times faster than page file memory, so using a page file can dramatically slow down computational speed.
If you are running a standard 32-bit edition of Analytica, it is limited to a maximum of 3 GB (gigabytes) of memory, like any 32-bit application. If your computer has 4 GB of RAM, it may still use virtual memory for a large model if other Windows applications and the operating system are using a lot of RAM.
If you are running a 64-bit edition of Analytica, it can use up to 128 GB of memory, depending on how much memory your version of Windows supports. For example, the 64-bit Home edition of Windows Vista supports only 8 GB.
If your computer has 8 GB of RAM but you have 20 processes running, the Windows operating system must decide how much of the 8 GB each process is allowed to utilize. The amount granted to a process is called its working set, and the sum of all working set sizes across all processes never exceeds the amount of physical RAM. The working set size is a slightly different concept from the amount of RAM actually used by the process at any given moment. For example, Windows may grant your Analytica process a working set size of 500 MB, even though at the moment it is utilizing only 100 MB. This might give Analytica the flexibility to quickly allocate intermediate values without having to swap to page file memory.
Windows continually balances the working set sizes of all running processes to match their computational needs. To achieve acceptable performance, Analytica requires a working set of about 3 times the size required to hold the largest array (including the Run index for uncertain samples). However, if you have thousands of different arrays in your model, the working set can be much less than the total memory in use without substantially impacting evaluation speed. A cap on the maximum working set for a process can ensure that other processes don't get swapped out of RAM during long, intense model evaluations, keeping the rest of your computer responsive. On operating systems that allow caps on the maximum working set size, Analytica 64-bit limits its maximum working set size for this reason, to keep other applications responsive. For information on how to adjust this, see Working_Set_Size.
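The "about 3 times the largest array" rule of thumb lends itself to a quick worked estimate. This sketch assumes 8 bytes per numeric cell, which is an illustrative assumption rather than Analytica's exact storage model, and the function names are hypothetical.

```python
MB = 2**20

def largest_array_bytes(index_sizes, sample_size, bytes_per_cell=8):
    """Bytes needed for an array spanning the given indexes plus the
    Run index (sample_size). Assumes 8-byte numeric cells."""
    cells = sample_size
    for n in index_sizes:
        cells *= n
    return cells * bytes_per_cell

def working_set_needed(index_sizes, sample_size):
    """Rule of thumb from the text: about 3x the largest array."""
    return 3 * largest_array_bytes(index_sizes, sample_size)

# A 500 x 200 array with a Monte Carlo sample size of 1000:
print(largest_array_bytes([500, 200], 1000) / MB)   # about 763 MB
print(working_set_needed([500, 200], 1000) / MB)    # about 2289 MB
```

Because the Run index multiplies every probabilistic array, doubling the sample size roughly doubles both figures, which is the linearity noted under Sample Size below.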
- RAM Usage
- The RAM currently being consumed by your Analytica process. The difference between this number and In use is the amount of the process's memory currently residing in page-file memory.
- Working set size
- The amount of RAM currently allotted by the operating system to the Analytica process. When this number is larger than the number shown for RAM Usage, the process has space to rapidly allocate new memory. If RAM usage passes this number, either the operating system will increase the working set size allotted to this process, or some memory will need to be swapped into page file memory.
- Peak working set
- The largest working set size that has been allotted to the process since the Analytica process was started up. This generally coincides with the peak RAM usage (but not peak memory usage).
- Max working set
- Shown on operating systems that support a maximum cap on working set size, this shows the cap that has been placed on the largest working set that the operating system should allot to this Analytica process.
- Keeping this number less than the total system RAM can help to ensure that other applications remain responsive when you launch a long memory-intensive model evaluation. See Working Set Size.
- Total System RAM
- The total physical RAM installed on your computer.
- Unused RAM
- RAM not currently in use by any running process.
- System RAM load
- This value, between 0% and 100%, indicates the level of memory demand among all processes on the system relative to the amount of available RAM. When it is near 100%, all available RAM is in use and the page file must be used to keep things running, so performance may degrade.
- Page fault rate
- A page fault occurs when an application tries to access a block of memory, finds that it currently resides in page file memory, and so needs to page it back into RAM. The page fault rate is the number of page faults per second, averaged over the previous 10 to 15 seconds. Page faults seen here can occur from two sources. First, every page of memory used by the process is initially faulted into existence. This is a very inexpensive operation that occurs frequently during initial start up, and as your model evaluation begins before RAM utilization is high. Later, when it is necessary to make use of page file memory, a page fault occurs whenever a required memory page happens to currently be in page file memory. These page faults are expensive: they usually require the least recently accessed page to be written to the page file, and then the required page to be brought into RAM. When the page fault rate becomes too high, we say that ''thrashing'' occurs. At that point, the CPU utilization of Analytica as seen in Task Manager may drop to much less than a full processor. A large page fault rate sustained over a long period indicates thrashing.
- Objects in use
- The number of objects currently in use by your model. Each shape (except arrows) that appears on a diagram, including each input or output node, counts as one object. In addition, each local index and each graph template counts as an object. A model may contain at most a bit over 31,000 objects.
- Sample Size
- The Uncertainty Settings sample size, used for Monte Carlo simulation, etc. Total memory usage for probabilistic models is roughly linear in this sample size, as is the working set size required for acceptable performance.
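The thrashing behavior described under Page fault rate above can be illustrated with a toy least-recently-used (LRU) pager. This is a deliberately simplified model, not Windows's actual paging algorithm: once the set of pages a workload cycles through exceeds the RAM frames available, every access becomes an expensive fault.

```python
from collections import OrderedDict

def count_page_faults(accesses, frames):
    """Count accesses that miss RAM and must be brought in from the
    page file, evicting the least recently used page when full."""
    ram = OrderedDict()  # page -> None, ordered oldest to newest use
    faults = 0
    for page in accesses:
        if page in ram:
            ram.move_to_end(page)        # mark as recently used
        else:
            faults += 1
            if len(ram) >= frames:
                ram.popitem(last=False)  # evict least recently used
            ram[page] = None
    return faults

# Cycling through 5 pages, 100 accesses in total:
accesses = list(range(5)) * 20
print(count_page_faults(accesses, frames=5))  # 5: only the initial faults
print(count_page_faults(accesses, frames=4))  # 100: every access faults
```

Shrinking the working set by just one frame below what the cycle needs turns 5 faults into 100, which is why a sustained high page fault rate is the signature of thrashing.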
- Show object being evaluated
When checked, the object currently being evaluated is displayed here, along with the Dynamic context of the evaluation. Checking this option slows computation dramatically, in many cases by a factor of 10 to 100. Typically the identifiers fly by in a blur during evaluation; the value of this option may be in noticing those variables that linger for extended periods of time. You may also find it educational to watch the evaluation proceed, especially with Dynamic models.
- Expanded view
When unchecked, the terse memory dialog is displayed. When checked, the expanded view is displayed.
- Stop Computing
This button appears only when a computation is in progress. Pressing the button is equivalent to pressing the CTRL-Break key combination, causing the in-progress computation to abort.
- Information on fields
Jumps to this Wiki page.