I have to load some really big data frames stored as .rda files. While doing so, I noticed the following behavior of my RStudio Server (the same behavior occurs on a local RStudio installation).
Whenever I load multiple variables, the memory does not seem to be freed after deleting a variable with rm and calling gc. Example:
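(A minimal sketch of what I am doing; the file names and object names are placeholders, and each data frame takes roughly 2 GB in memory.)

```r
# Each .rda file holds one data frame of roughly 2 GB (names are placeholders)
load("big_data_1.rda")   # creates df1
load("big_data_2.rda")   # creates df2
load("big_data_3.rda")   # creates df3

# The first data frame is no longer needed, so drop it and garbage-collect
rm(df1)
gc()
```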
At this point, the session still allocates over 6 GB of RAM. My expectation would be that the memory drops to about 4 GB, as the first loaded variable is no longer required.
Is this a known behavior, or is there another way to free memory? I really need to load all my data sequentially, but I do not have enough RAM to hold it all at once.
OK, it's what I expected: the current "used" is just 14 Mb, but the "max used" recorded was 4099.9 Mb.
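For reference, the gc() output I am referring to reports both columns and looks roughly like this (the cell counts and the Ncells row here are illustrative; only the ~14 Mb "used" and 4099.9 Mb "max used" figures are the ones mentioned above):

```r
gc()
#>            used (Mb)  gc trigger   (Mb)  max used   (Mb)
#> Ncells   500000 26.7     1000000   53.4    800000   42.8
#> Vcells  1800000 13.8   537000000 4099.9 537000000 4099.9
```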
I think the situation is something like this: R gets memory from the OS so that it can allocate it to your objects.
R tracks your objects and will garbage-collect them when you drop them, but it won't release that memory back to your OS; it retains it for other objects (so it doesn't have to go and ask your OS for more memory later).
So this is a double-edged sword.

The good: you can expect to load another file of equivalent size into R successfully, because you really did release that memory as far as R is concerned; the used (Mb) went back down to where it was at the start (14 Mb, up to 4099.9 Mb, and back to 14 Mb), so going back up to 4099.9 Mb should be no issue (see the sketch below).

The bad: it is not released as far as the OS is concerned, so if you are running other applications outside of R, they won't get the memory that R is hanging onto for you...
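To make the practical upshot concrete for the original question, here is a sketch of processing the files sequentially while staying within a fixed memory budget, by dropping each object before loading the next so R can reuse the memory it already claimed. The file names, the assumption that each .rda contains a single data frame, and the per-file work are all placeholders.

```r
# Hypothetical file list; each .rda is assumed to contain one large data frame
files <- c("big_data_1.rda", "big_data_2.rda", "big_data_3.rda")

results <- vector("list", length(files))
for (i in seq_along(files)) {
  env <- new.env()
  load(files[i], envir = env)          # load into a throwaway environment
  df <- get(ls(env)[1], envir = env)   # grab whatever object the file contains

  results[[i]] <- nrow(df)             # placeholder for the real per-file work

  rm(df, env)                          # drop all references to the big object
  gc()                                 # R can reuse this memory for the next file
}
```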
I'm not an expert here and would be happy for someone who is to clarify, and also to speak up if there are methods to force R to release memory back to the OS without closing or restarting the session...
That would mean that as soon as the R session reaches the system's RAM limit, all available RAM is reserved for that R session and cannot be used by other processes until the session is terminated?