RStudio (Windows) crashes when approaching 4GB

Hi all,

I have RStudio running some foreach loops, and when the memory footprint of RStudio approaches 4GB, the child processes bomb out. Everything runs fine when running serially and staying below 4GB, and also when running in parallel with a reduced dataset.
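For readers unfamiliar with this setup, here is a minimal sketch of a doParallel-backed foreach loop like the one described above (the original code isn't shown, so the worker count and per-iteration work are placeholders):

```r
library(doParallel)

cl <- makeCluster(2)          # worker count is a guess
registerDoParallel(cl)

# Each %dopar% iteration runs in a separate child R process; if a worker
# dies (e.g. from hitting a memory limit), the master sees a socket error.
results <- foreach(i = 1:4, .combine = c) %dopar% {
  sum(rnorm(1e4))             # stand-in for the real per-chunk work
}

stopCluster(cl)
```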

It's running 64-bit R, but RStudio is listed as a 32-bit application in Task Manager, which makes me suspect the 4GB ceiling is to blame. (Troubleshooting the crash directly is tough, hence the circumstantial sleuthing here.)

I haven't seen a 64-bit RStudio for Windows anywhere... any other thoughts on how to get around the issue, aside from restructuring my code to avoid the memory limit? Any help would be greatly appreciated.

The specific error I get is the following (it doesn't reveal much, but I've included it for the curious):
Error in unserialize(socklist[[n]]) : error reading from connection


The grouping used by Windows 10 (and maybe 8?) in Task Manager makes it look like the R session is included in the 32-bit process's memory, but it isn't.

That said, you could try running your code in the basic "R x64" / Rgui.exe front end to narrow down whether it's RStudio-related or not. It's possible that some interaction with the IDE is causing problems in this case.


Thanks nick... indeed, it turned out not to be an RStudio problem. I was sourcing my functions (which include Rcpp functions) from within the %dopar% loop; when I instead initialized the cluster with clusterEvalQ(clust, source("myfuncs.r")), the problem was solved.
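For anyone hitting the same issue, here is a self-contained sketch of the before/after pattern described above. The real myfuncs.r isn't shown (in the original it defines Rcpp functions, presumably compiled via Rcpp::sourceCpp()), so this demo writes a plain-R stand-in; `myfunc` is a hypothetical name:

```r
library(doParallel)

# Stand-in for the poster's myfuncs.r (the real file defines Rcpp functions).
writeLines("myfunc <- function(i) i * 2", "myfuncs.r")

cl <- makeCluster(2)
registerDoParallel(cl)

# Problematic pattern: source() inside %dopar% runs on every task, so any
# Rcpp compilation in myfuncs.r is repeated over and over on the workers.
# results <- foreach(i = 1:4, .combine = c) %dopar% {
#   source("myfuncs.r")
#   myfunc(i)
# }

# Fix: source once per worker at cluster initialization. clusterEvalQ()
# evaluates the expression in each worker's global environment, so myfunc
# is defined exactly once per worker.
clusterEvalQ(cl, source("myfuncs.r"))
results <- foreach(i = 1:4, .combine = c) %dopar% myfunc(i)

stopCluster(cl)
unlink("myfuncs.r")
```

Sourcing once per worker also avoids the per-task serialization and recompilation overhead, which can itself inflate memory use in the child processes.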