memory.limit() bug?

Thanks for the added detail. ulimit isn't something that can be set in R or RStudio; it comes from the operating system. I'm sorry I gave you an out-of-date fix. Windows' UNIX-like limits are now accessible through PowerShell.

In your case, if you have only 8GB of RAM, increasing ulimit beyond 8067 MB will have no effect; your objects need to fit in RAM, and they obviously can't if you don't have enough.

If adding memory (to, say, 32GB) isn't an option, the best alternative is to rent time on a heavy-duty AWS EC2 instance using one of the provisioned R images. I've heard of people doing this, but I have no personal experience with it.


```r
x <- data.frame(x = runif(1000000000 * 0.5))
```

This will show whether R is using your available 8 GB correctly.
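For reference, here's a rough back-of-the-envelope sketch of why that snippet is a good 8 GB test (assuming base R; each double occupies 8 bytes):

```r
# runif(5e8) produces 5e8 doubles at 8 bytes each: ~4e9 bytes,
# before counting the data.frame wrapper and any copies made during creation.
n <- 1000000000 * 0.5            # 5e8 elements, as in the snippet above
estimated_gib <- n * 8 / 1024^3  # convert bytes to GiB
round(estimated_gib, 2)          # roughly 3.73 GiB

# After building the object, you can measure its actual footprint:
# x <- data.frame(x = runif(n))
# print(object.size(x), units = "GB")
```

So the test object needs roughly half of an 8 GB machine's RAM; if it can't be created, something other than the data itself is eating your memory.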

Thanks for the info. I'll look into cloud computing or simply add more ram to my computer. Thanks again!

I am able to run this code. However, I believe that my computer simply does not have enough memory to run the analysis. I may purchase additional ram or take Richard's advice and look into cloud computing. Thanks for all the help!

I am not sure where you are at on this issue, but know that there is a bug regarding memory.limit(), R 3.6, and the RStudio IDE that has been resolved.


The fix is available in the preview release; see the release notes:

I really don't know if it is related (it seems a bit different), but it's worth a try just in case, and it is good to reference it here.


Fix incorrect memory.limit() result with R >= 3.6.0 (#4986)


It's been a while but I finally got it done.
Amazon's cloud virtual machines did the trick. It turns out I needed at least 80 GB of RAM. Thanks for the help!


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.