I am working with genomic data, which is quite large. Sometimes it is simply too large to handle, but here I am seeing some strange behaviour. I get the message "cannot allocate vector of size 242 Mb", which is strange because if I check Task Manager I can see that I have around 2.4 GB of free RAM available.
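To compare against Task Manager, this is how I check what R itself reports about its memory use (gc() prints what R has currently allocated):

    # gc() triggers a garbage collection and prints R's current memory usage
    # ("used" and "max used", in Mb), which can be compared with what
    # Task Manager reports as free.
    gc()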
If you are on Linux or macOS, run ulimit -a, which shows the resource limits of your shell, including the maximum amount of memory a process may use. These limits are often unlimited by default, but may have been set lower on your system. See man ulimit for how to adjust them.
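A minimal way to check this without leaving your R session (assuming a Unix-like system, where system() hands the command to a shell that has the ulimit builtin):

    # Print the shell's resource limits from inside R (Linux/macOS only).
    # The "max memory size" / "virtual memory" rows are the ones relevant
    # to allocation failures.
    system("ulimit -a")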
I think it could be related to memory fragmentation, i.e. you may have a total of 2.4 GB free, but it is split across smaller regions, none of which alone is large enough to hold the new vector (R needs one contiguous block of memory for a vector).
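A rough sketch of why total free memory is not the whole story (illustrative only; actual fragmentation depends on the OS and on whether R is a 32- or 64-bit build):

    # R stores an atomic vector in one contiguous block, so the request below
    # only succeeds if a single free region of ~242 MB exists, regardless of
    # how much memory is free in total across smaller regions.
    n <- 242 * 1024^2 / 8      # number of 8-byte doubles in a ~242 MB vector
    big <- tryCatch(numeric(n),
                    error = function(e) message("allocation failed: ", conditionMessage(e)))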
I think I've seen this kind of thing with functions that do a lot of internal computation. While running, the function creates several intermediate objects in memory until it runs out. So the last object, the one that fails to be created, may be only 242 MB, but before that R has already allocated 2.39 GB worth of other objects.
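A hypothetical sketch of that pattern (the function and object names are made up): each intermediate result keeps memory alive until the final, largest allocation fails, and dropping intermediates early can buy some headroom:

    # Hypothetical example of intermediates piling up inside a function.
    process <- function(m) {
      a <- m * 2              # first intermediate copy of the input
      b <- a + 1              # second copy; 'a' is still held in memory here
      rm(a); gc()             # optionally drop earlier intermediates before the big step
      b %*% t(b)              # the final, largest allocation is the one that may fail
    }
    # e.g. process(matrix(rnorm(1e6), nrow = 1000))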
If you have a way to monitor the free RAM (e.g. Windows' Task Manager with a fast refresh rate), what I typically see is the following (a small sketch for tracking the same thing from within R follows the list):
I start the function
for several seconds (or minutes) my computer works hard, the fans might turn on, and I see the used RAM increase steadily
the function runs the one-too-many operation and stops with the error "cannot allocate vector of size xx MB"; the RAM is immediately freed back to the level it was at when I started the function, and the fans stop blowing.
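To measure that peak from within R rather than watching Task Manager, gc(reset = TRUE) resets the "max used" counters, so a later gc() shows the peak reached during the call (my_big_function and x below are placeholders for your own call):

    gc(reset = TRUE)                    # reset the "max used" statistics before the run
    result <- try(my_big_function(x))   # placeholder for the call that runs out of memory
    gc()                                # "max used" now shows the peak memory the call reached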