RStudio not opening .RDATA file

Hello,

I am working with some large sequencing data and have been analysing it for some time. I normally save the entire global environment, and eventually it got too large: during the last save, RStudio wrote an RDATA file of 314 GB. When I try to open it, the load either times out and the R session aborts, or it fails with an error (Error: cannot allocate vector of size 3.5 Gb).

I can see RAM usage climb until it is exhausted, and it spills into swap just before the crash/error.

Not really sure how to proceed here. The same computer that was able to run and save the analysis is not able to open it.

Any help would be appreciated. I do have a backup of the key files, but I would like to recover the whole environment, since the supplementary files will otherwise have to be recreated anyway.

Debian 10
250 GB RAM

Hi @aaa_3 -

I'm going to guess you're going to need to start over, unless you want to try uploading that data to AWS or another cloud provider where you can rent a machine with many multiples of the RDATA file's size in RAM.

In general, Posit doesn't recommend using the 'save environment' functionality, for several reasons: mostly because it makes results hard to reproduce, but also because of exactly the kind of out-of-memory (and potential corruption) problem you're running into.
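For future runs, the usual alternative is to save the individual objects you care about with `saveRDS()` rather than the whole environment. A minimal sketch, where `counts` and `fit` are stand-ins for your own objects:

```r
# Save only the objects worth keeping, one file each,
# instead of save.image() on the entire global environment.
saveRDS(counts, "counts.rds")   # e.g. a large count matrix
saveRDS(fit,    "fit.rds")      # e.g. a fitted model object

# A later session reloads only what it actually needs:
counts <- readRDS("counts.rds")
```

This keeps each file small enough to load on its own, and the script that recreates the rest of the environment doubles as documentation of the analysis.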

If you have admin rights on your computer and are comfortable doing sysadmin-type work, you might try creating a very large swap file (1 TB or more, if you have the disk space) with the usual Debian tools (fallocate, mkswap, swapon) and see if that helps. When R exhausts physical RAM while reading the RDATA file, the OS will page to the swap file instead of killing the session. This will be painfully slow, but it might let you get in, clean up your environment, and back up the pieces you need.
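If the load does succeed with the extra swap, one way to do that cleanup (a sketch; "analysis.RData" and "counts" are placeholders for your actual file and object names) is to load into a fresh environment, rank objects by size, and re-save only the keepers individually:

```r
# Load into a separate environment so nothing lands in the global one
e <- new.env()
load("analysis.RData", envir = e)   # placeholder file name

# List objects by size in bytes, largest first
sizes <- sort(sapply(ls(e), function(x) object.size(get(x, envir = e))),
              decreasing = TRUE)
head(sizes)

# Re-save the objects worth keeping, one file each
saveRDS(get("counts", envir = e), "counts.rds")  # placeholder object name

# Drop everything else and free the memory
rm(list = ls(e), envir = e)
```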

Good luck!

Best,
Randy