Not enough memory


While using the Huygens Software the following error message may occur, which will terminate the deconvolution and restoration:

The operating system did not allow a necessary increase of the size of the image memory pool from 1500 MB to 2300 MB for a 800 MB image with dimensions...(truncated) Failed to allocate heap memory. Not enough memory available.


Some things to check:

  • Is a sufficiently large part of the system RAM available to Huygens? If other software, or other instances of Huygens running on the same machine, have a large part of the memory in use, the current Huygens instance may not be able to obtain enough.
  • Is enough RAM available? Some operations (see below) may need a surprisingly large amount of RAM. If you use these often, enlarging the amount of available system RAM may be the best option.
  • Is swap space (virtual memory) available? Modern operating systems automatically allocate some amount of virtual memory. If you need a large amount of memory only occasionally, it can help to temporarily increase it (see below). It can also help to have a dedicated SSD for this.
  • Is there a limitation on the amount of memory that a single process may use? In the past, 32-bit programs could never use more than 4 GB of memory at a time, even when the system had more RAM installed in total. All current versions of Huygens are 64-bit, and on a regular single-user system there are usually no such restrictions by default. On a server, batch processing system, or cluster, additional limits may be enforced that prevent Huygens from using all of the installed memory; a quick way to check this on Linux is sketched below.
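
A quick way to check the first and last points on a Linux system is sketched below. It reads the MemAvailable figure from /proc/meminfo and the per-process address-space limit (RLIMIT_AS) using Python's standard resource module; on other platforms, different tools apply.

    import resource

    def available_memory_kb():
        """Return the MemAvailable figure from /proc/meminfo in kB (Linux only)."""
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("MemAvailable:"):
                    return int(line.split()[1])
        return None

    # Per-process limit on the total address space; "unlimited" is the usual
    # default on single-user systems, but clusters often set a finite value.
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)

    print("MemAvailable: %s kB" % available_memory_kb())
    print("RLIMIT_AS (soft/hard):",
          "unlimited" if soft == resource.RLIM_INFINITY else soft,
          "/",
          "unlimited" if hard == resource.RLIM_INFINITY else hard)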

Deconvolution requirements


This memory problem can be very annoying when you try to restore a 300 MB image while you have, for example, 4 GB of RAM installed: that is more than thirteen times the size of the image!

A 300 MB image may not appear large (it is a typical size for a Multi Channel 3D 16 bit image, see Bytes And Bits to estimate file sizes), but when deconvolving such an image many intermediate images have to be created, such as the bricks and the PSF. All of these intermediate images are created in 32 bit format, which is twice the size of your original 16 bit TIFF image. The final restoration result will also be 32 bit if there is enough memory, but it can be less precise if Huygens cannot allocate the necessary memory.

Huygens Deconvolution internally runs in 32 bit float mode to make use of a wider Dynamic Range, one of the benefits of deconvolution. In an ideal situation, without any memory restriction, an original 16 bit image of 300 MB would be transformed into a 600 MB 32 bit dataset before deconvolution starts. The destination image and the PSF are the same size as this transformed original, which means 600 MB for each. Other, intermediate images (for the FFT and other calculations) are about the same size (600 MB).

600 × 4 = 2400 MB, more than half of the installed memory! The operating system may not allow that allocation: part of the memory may already be in use by other programs, or by the OS itself. Some operating systems (like Mac OS X) have shown themselves to be very conservative: even when the memory is available they will not give a single program that much, presumably to keep the machine usable for other programs and users. In such cases, simply installing more memory is not a solution: the OS will not serve it anyway!
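
As a back-of-the-envelope check, the estimate above can be reproduced with a few lines of Python. The factor of four (transformed original, PSF, destination and intermediate images, all in 32 bit float) follows the reasoning in this section; it is a rough rule of thumb, not the exact figure Huygens computes internally.

    def raw_image_mb(nx, ny, nz, channels, bytes_per_sample):
        """Estimated size of the raw image in MB (see Bytes And Bits)."""
        return nx * ny * nz * channels * bytes_per_sample / 1e6

    def decon_estimate_mb(raw_mb, bytes_per_sample):
        """Rough peak memory during deconvolution: four 32 bit copies."""
        as_float32_mb = raw_mb * 4 / bytes_per_sample   # converted to 32 bit floats
        return 4 * as_float32_mb                        # original + PSF + destination + scratch

    # Example: a 2-channel, 16 bit stack of 1024 x 1024 x 75 voxels (~315 MB)
    raw = raw_image_mb(1024, 1024, 75, channels=2, bytes_per_sample=2)
    print("raw image: %.0f MB, estimated peak: %.0f MB" % (raw, decon_estimate_mb(raw, 2)))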

(Tip: an image can be split and each channel loaded separately. An application like Huygens Essential will deconvolve the channels separately anyway, and having the other channels loaded in memory while deconvolving just one may push things over the memory limit.)

Virtual memory


Operating systems can use a part of the disk, which is intended for non-volatile storage, as extra RAM. On a Linux installation a separate swap partition is typically created when Linux is installed, while Windows and Mac use a file on the system disk that can grow on demand. On Linux it is also possible to temporarily create an extra swap file. If the system contains both SSDs (solid state drives) and rotating magnetic hard disks, the automatically configured virtual memory may not be on the desired disk. Magnetic hard disks have the advantages that they are cheaper per terabyte in large sizes and that they do not suffer as much from wear. SSDs, and especially NVMe SSDs, are much faster than magnetic disks and can be very suitable for use as virtual memory. Be aware that they do wear out when they are written to very often: look for the "total number of TB written" specification.
The amount of virtual memory, and the disks on which it resides, can be configured in e.g. Mac OS X and Windows 10.
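
On Linux, an extra swap file can be added temporarily with the standard fallocate, mkswap and swapon tools (root access required). The sketch below wraps those commands in Python; the file name /swapfile.tmp and the 8 GB size are just example values, and the file should be placed on a disk with enough free space (preferably a fast SSD, keeping the wear caveat above in mind). Undo it with "swapoff" and delete the file once the large job is done.

    import subprocess

    SWAPFILE = "/swapfile.tmp"   # example path, choose a disk with enough space
    SIZE = "8G"                  # example size

    subprocess.run(["fallocate", "-l", SIZE, SWAPFILE], check=True)
    subprocess.run(["chmod", "600", SWAPFILE], check=True)   # swap must not be world-readable
    subprocess.run(["mkswap", SWAPFILE], check=True)
    subprocess.run(["swapon", SWAPFILE], check=True)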

Save memory


When no more memory can be requested, the alternative is to use the available memory in a smarter way. Huygens will still do the internal calculations in 32 bit numeric format, but if little memory is available it will not transform the original image, to keep memory demands down, and if necessary the final result will not be stored as 32 bit, despite the loss in Dynamic Range.

When even this is not enough, the 32 bit internal calculations are run on small chunks instead of considering the image as a whole. This is called Brick Splitting and can be problematic, especially when the Point Spread Function is very large (as in Wide Field Microscopy) and the contribution of every light source affects the whole image (the blur spreads over a large volume). If all contributions are not considered simultaneously the deconvolution result may not be as good, but when little memory is available it might be the only option. The Brick Splitting routine weighs many variables to find the best compromise between the available memory and the deconvolution requirements. There is a limit, however: if the bricks would become so small that the deconvolution result would be very poor, it will not run. Moreover, if your Microscopic Parameters are not physically realistic this routine can also fail (see Brick Splitting).
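
For illustration only, the sketch below shows the general idea behind cutting a stack into bricks: each brick has to fit a memory budget and is padded with an overlap related to the axial extent of the PSF, so that blur from neighbouring regions is still taken into account. This is a simplified toy plan along Z, not Huygens' actual Brick Splitting routine.

    def plan_bricks(nz, plane_mb, budget_mb, psf_extent_planes):
        """Return (start, stop) Z-ranges of padded bricks that fit budget_mb."""
        # Core planes per brick, leaving room for PSF padding on both sides.
        core = max(1, int(budget_mb // plane_mb) - 2 * psf_extent_planes)
        bricks, z = [], 0
        while z < nz:
            start = max(0, z - psf_extent_planes)          # pad towards the top
            stop = min(nz, z + core + psf_extent_planes)   # pad towards the bottom
            bricks.append((start, stop))
            z += core
        return bricks

    # 75-plane stack, 8 MB per 32 bit plane, 200 MB budget, PSF spanning 10 planes
    print(plan_bricks(75, 8.0, 200.0, 10))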

System limitations


Even then there might be problems, because the OS may report that more memory is available than it will actually provide later. Just asking the OS how much free memory is available is not enough if, when the memory is actually demanded, it serves less than that: the provisions made based on the initial figures (like Brick Splitting) will fail.

Moreover, a typical problem in some operating systems is that they only allow allocation of contiguous memory chunks. The (reported) available memory may be large, but if there is no contiguous piece of the requested size, it will not be provided. Some versions of Windows are known to suffer from memory fragmentation that reduces the size of the largest allocatable chunk. In these cases, the message "not enough memory available" actually means "not enough contiguous memory available".
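
If you suspect fragmentation or a hidden per-process limit, you can probe the largest single contiguous allocation that the OS is currently willing to grant, for example with the Python sketch below. It binary-searches over bytearray sizes; keep the upper bound modest, because the probe really allocates and touches the memory and may cause heavy swapping.

    def largest_allocatable_mb(upper_mb=16384):
        """Binary-search the largest contiguous bytearray (in MB) that succeeds."""
        lo, hi = 0, upper_mb
        while lo < hi:
            mid = (lo + hi + 1) // 2
            try:
                block = bytearray(mid * 1024 * 1024)   # one contiguous allocation
                del block                              # release it immediately
                lo = mid
            except MemoryError:
                hi = mid - 1
        return lo

    print("largest single allocation right now: about %d MB" % largest_allocatable_mb())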

As long as the operating systems in question do not improve their memory management, there is not much that applications can do. Huygens does its best not to be fooled by these OSs (usually this implies being pessimistic about the reported free memory figures and calculating the brick splitting more conservatively), but when large amounts of memory are required we usually suggest using other platforms, such as recent Windows or Linux versions.

Exporting requirements

Sometimes the memory problem appears when saving your datasets. Why does just saving data require extra memory? It depends on the File Format that is being used. A format like ICS stores the data directly without further memory demands, but other formats, like TIFF, require the dataset to be transformed, and for that the program needs to create a new destination image to store the transformed result (see Tiff Scaling). Maybe there is not enough memory to create a full third image, even a 16 bit one!

In such a case, store the results first in ICS or OME, restart the program and try reloading and converting the image to the desired format.
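
To illustrate why such a conversion needs a second copy in memory, the sketch below uses numpy as a stand-in: rescaling a 32 bit float result into the 16 bit range builds a complete new array next to the original, so the peak usage during saving is roughly original plus destination. The actual scaling Huygens applies may differ (see Tiff Scaling).

    import numpy as np

    def to_uint16(img_float32):
        """Linearly rescale a float image into the 0..65535 range (extra copy!)."""
        lo, hi = float(img_float32.min()), float(img_float32.max())
        scale = 65535.0 / (hi - lo) if hi > lo else 1.0
        return ((img_float32 - lo) * scale).astype(np.uint16)

    result = np.random.rand(512, 512, 32).astype(np.float32)  # stand-in 32 bit result
    tiff_ready = to_uint16(result)   # a second, scaled copy exists in memory here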