C++ Out of memory: write to file instead, read data when needed?


I am using C++ to develop tools for wavelet image analysis and machine learning on a Linux machine.
Memory use is limited by the image size, the number of scales with their N orientations and the corresponding filters (up to 2048×2048 doubles), and the additional memory and processing overhead of the machine learning algorithm.

Unfortunately, my Linux system programming skills are superficial at best,
so I am not currently using swap, though I suspect it should be possible to make use of it somehow?

I need to keep the real and imaginary parts of the filtered images for each scale and orientation, along with the corresponding wavelets for reconstruction purposes. I keep them in memory so that processing small images stays fast.

Regarding memory usage, I already:

  • store each item only once,
  • store only what is needed,
  • remove any duplicate entries or redundancies,
  • pass by reference only,
  • use pointers to temporary objects,
  • free memory as soon as it is no longer needed,
  • keep the number of computations to an absolute minimum.
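The "pass by reference" and "free memory early" points above can be sketched in a few lines. This is a minimal illustration, not the questioner's actual code; the `Image` alias and the helper names are hypothetical stand-ins:

```cpp
#include <utility>
#include <vector>

// Hypothetical image type: a flat buffer of doubles standing in
// for a large (e.g. 4096x4096) image.
using Image = std::vector<double>;

// Pass by const reference: no copy of the (possibly huge) buffer is made.
double sum(const Image& img) {
    double s = 0.0;
    for (double v : img) s += v;
    return s;
}

// Free memory as soon as it is no longer needed: clear() alone keeps the
// capacity, so swap with an empty vector (or call shrink_to_fit) to
// actually release the allocation back to the allocator.
void release(Image& img) {
    Image().swap(img);
}
```

The swap trick matters here because `std::vector::clear` does not shrink the underlying allocation, so a "cleared" 4096×4096 image would still hold its 128 MB.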

As with most data processing tools, speed is critical. As long as there is enough memory, the tool is approximately 3 times faster than the same implementation in Matlab.

But once I run out of memory, nothing works anymore. Unfortunately, most of the images I am training the algorithm on are large (4096×4096 double entries of raw data, even larger after symmetric padding), so I often hit the ceiling.

Is it bad practice to temporarily write data that is not needed for the current computation/processing step from memory to disk?

  • Which method/data format is best for doing this?
  • I’m thinking about using rapidXML together with a binary file, writing the data out and then reading back only what is required. Does that work?
  • Do I need a memory-mapped file? https://en.wikipedia.org/wiki/Memory-mapped_file
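For the binary-file idea above, the simplest approach is to dump blocks of doubles with plain `fwrite`/`fread` and reload them on demand. A minimal sketch, with hypothetical helper names and a flat layout (no rapidXML index):

```cpp
#include <cstdio>
#include <vector>

// Dump a block of doubles to a binary file on disk.
bool dump_block(const char* path, const std::vector<double>& data) {
    std::FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    std::size_t n = std::fwrite(data.data(), sizeof(double), data.size(), f);
    std::fclose(f);
    return n == data.size();
}

// Read a previously dumped block back into memory when it is needed.
std::vector<double> load_block(const char* path, std::size_t count) {
    std::vector<double> data(count);
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return {};
    std::size_t n = std::fread(data.data(), sizeof(double), count, f);
    std::fclose(f);
    if (n != count) data.clear();
    return data;
}
```

After `dump_block` succeeds, the in-memory copy can be released; `load_block` brings it back for the next processing step. Note that this raw format is not portable across machines with different endianness or `double` representations.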

I know this incurs a performance penalty, but it is more important that the software runs smoothly and does not stall.

I know there are libraries out there that can do wavelet image analysis, so please spare me the “why reinvent the wheel, just use XYZ” comments. I’m using very specific wavelets, so I need to implement this myself and cannot use external libraries.


Yes, it is generally bad practice to manually write data to disk just to save memory.

You typically don’t need to write data to disk manually unless you hit the limit of the address space you can handle (about 4 GB on 32-bit machines, far more on 64-bit machines).

This is because the operating system already does exactly the same thing: your own solution will most likely be slower than what the operating system does. If you are not familiar with the concepts of paging and virtual memory, read the Wikipedia articles on them.
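If explicit control over which buffers live on disk is still wanted, the memory-mapped file mentioned in the question leverages this same kernel paging mechanism: you address the data as an ordinary array while the OS pages it in and out. A minimal POSIX sketch for Linux, with a hypothetical helper name and deliberately thin error handling:

```cpp
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <cstddef>

// Map a file of `count` doubles into memory, creating/resizing it as
// needed. The kernel pages the data on demand, so the array may be
// larger than physical RAM. Returns nullptr on failure.
double* map_doubles(const char* path, std::size_t count) {
    std::size_t bytes = count * sizeof(double);
    int fd = open(path, O_RDWR | O_CREAT, 0600);
    if (fd < 0) return nullptr;
    if (ftruncate(fd, static_cast<off_t>(bytes)) != 0) {
        close(fd);
        return nullptr;
    }
    void* p = mmap(nullptr, bytes, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);  // the mapping remains valid after the descriptor is closed
    return p == MAP_FAILED ? nullptr : static_cast<double*>(p);
}
```

Writes through the returned pointer eventually reach the file; call `munmap(p, count * sizeof(double))` when finished. With `MAP_SHARED`, the data also survives the process, so intermediate results can be reopened later.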
