I have a complex model that performs an actuarial projection using the parallel package and Rcpp. I'm running into problems in RStudio on Linux because I can't release RAM after each calculation. I tried removing the objects with rm() and then calling gc(full = TRUE, reset = TRUE).
For some objects this actually works, but when I check the performance metrics during the model run, it clearly doesn't for others.
My guess is that in the cases where it appears not to work, you are releasing a copy, i.e. a non-unique reference to an object. Removing such a duplicate yields no memory gain, because the same data is still referenced somewhere else.
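A minimal base-R sketch of that effect (the object sizes are illustrative, not from your model):

```r
x <- runif(1e7)   # ~80 MB of doubles
y <- x            # no copy is made: y is a second reference to the same data
rm(y)             # removing the alias...
invisible(gc())   # ...frees almost nothing, because x still holds the data
rm(x)             # remove the last reference
invisible(gc())   # now the ~80 MB is actually reclaimable
```

You can watch this happen by summing the "(Mb)" column of gc()'s return value before and after each rm() call.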
Have you tried removing all objects and then garbage collecting?
Removing all objects is impossible, because I need to keep some objects in memory to perform the projections. But those aren't the problem. I'll print out what I'm trying to express and post it here.
I'm suggesting clearing everything not as a solution, but as a piece of detective work: it would show you whether there is a genuine memory leak, or whether all the memory is reclaimable as long as you actually remove every reference. If you have an object taking up memory, copy it without changing its internals, measure your memory, release the copy, and measure again, you would expect little or no change. Whereas if you remove all copies, you should return to the baseline from before the first object was allocated.
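That experiment can be scripted. A sketch of the idea, where the `keep` vector is a placeholder for whatever objects you genuinely must retain between runs:

```r
baseline_mb <- sum(gc(full = TRUE)[, 2])   # R heap in use before the run

result <- runif(5e6)   # stand-in for one projection run's output

# drop everything except what you need to keep
keep <- c("baseline_mb", "keep")
rm(list = setdiff(ls(), keep))
after_mb <- sum(gc(full = TRUE)[, 2])

# if after_mb returns to ~baseline_mb, the memory was reclaimable all along;
# if it stays high, something (an environment, a closure, an external pointer
# from C++) is still holding a reference
cat(sprintf("baseline: %.0f MB, after cleanup: %.0f MB\n", baseline_mb, after_mb))
```

If the "after cleanup" number does not return to baseline, that points to a reference being retained rather than a gc() failure.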
Then I ran a function to consolidate the projection results. For example: my memory usage rises to 11.9 GB; I remove the object and call gc() again, and nothing happens. I printed the memory report and the gc() output after removing it.
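One thing worth checking here: gc() reports R's own heap, while the RStudio memory indicator reflects the process's resident set size (RSS) at the OS level. On Linux, glibc's allocator does not always return freed pages to the OS, so RSS can stay high even after gc() has genuinely reclaimed the memory inside R. A Linux-only sketch comparing the two numbers:

```r
# R-level usage as reported by gc()
r_used_mb <- sum(gc(full = TRUE)[, 2])

# OS-level resident set size, read from /proc (Linux only)
status <- readLines("/proc/self/status")
rss_line <- grep("^VmRSS", status, value = TRUE)
rss_kb <- as.numeric(gsub("[^0-9]", "", rss_line))

cat(sprintf("R heap: %.0f MB, OS RSS: %.0f MB\n", r_used_mb, rss_kb / 1024))
```

If the R-heap number drops after rm() + gc() but RSS does not, the memory has been freed as far as R is concerned, and the gap is allocator behavior rather than objects you failed to remove.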