RStudio Server cannot save or export files larger than 1GB


I just started using RStudio Server on AWS EC2 (with 32GB of RAM), and I've been constantly running into trouble working with files larger than 1GB. Whenever I try to save an .Rdata file larger than 1GB, I get the following error:

Error in save(brm_violxses4lv, file = "brm_violxses4lv.Rdata") :
error writing to connection

I just read a forum post from a few years back saying that memory for RStudio Cloud is capped at 1GB. Perhaps that limit still applies? Currently, I have several .Rdata files in my ephemeral storage, but I cannot save any of them. Each took me about a day and $14 to fit. Is there any way I can save and download them to my PC? I would really appreciate any workarounds.

Thank you.

RStudio Cloud is not the same as RStudio Server: the former is a cloud service hosted on RStudio's servers, whereas RStudio Server is software that you can install on your own infrastructure (or on cloud computing services like AWS).
Can you clarify which one you are referring to?

Sorry about that. It is RStudio Server. I set it up using Louis Aslett's RStudio Server AMI. (I modified my initial post accordingly.)

platform x86_64-pc-linux-gnu
arch x86_64
os linux-gnu
system x86_64, linux-gnu
major 3
minor 6.0
year 2019
month 04
day 26
svn rev 76424
language R
version.string R version 3.6.0 (2019-04-26)
nickname Planting of a Tree

I've never used that particular AMI, but this issue is very frequently caused by a reverse proxy such as nginx sitting in front of RStudio Server. Can you confirm whether that is the case?
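One quick way to check (a sketch only, assuming a typical Ubuntu-based AMI where the nginx configuration lives under /etc/nginx) is to look for the config file and any existing size limit:

```shell
# Check whether nginx is present and whether a request-size limit is set.
# The paths below are assumptions for a typical Ubuntu-based AMI.
if [ -f /etc/nginx/nginx.conf ]; then
    echo "nginx config found"
    # nginx's client_max_body_size directive caps request bodies; the
    # built-in default (1M) is small, so proxied setups often override it.
    grep -r "client_max_body_size" /etc/nginx/ || echo "no explicit client_max_body_size"
else
    echo "no nginx config at /etc/nginx/nginx.conf"
fi
```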

hmm... In fact, I don't know. I haven't set anything up, but I'll look into it. Thank you.

If the AMI is using a reverse proxy, you have to adjust its settings. For nginx, for example, edit the file /etc/nginx/nginx.conf and add a line like this:

client_max_body_size 2000M;

Then restart the service:

sudo systemctl restart nginx
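For reference, here is roughly where that directive sits in nginx.conf (a sketch only; the actual file on the AMI will contain other settings, and the directive can also be placed in a server or location block):

```nginx
http {
    # ... other existing settings ...

    # Raise nginx's request body limit (the built-in default is 1M)
    # so large transfers through the proxy are not rejected.
    client_max_body_size 2000M;
}
```

After editing, you can run `sudo nginx -t` to validate the configuration syntax before restarting the service.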

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.