I planned on using RStudio Cloud for an upcoming demo. The demo required:
A few R packages installed
Python modules for reticulate to be installed
A data folder ~/.zipline/...more_here where data was required to be stored
I was able to get everything set up in the base project I planned to share with users via a link, but when I opened the link in a private browser to test it, it seems that only R-specific things like the installed packages persisted.
The python modules, virtual environment, and data were all gone!
Is there anything I can do? Is this intended? The idea was for this to be a (way) more user-friendly alternative to Docker for this kind of demo, which requires a lot of upfront work to get going.
I've managed to hack around this by adding the required data files and virtualenv to the /cloud/project/ folder where the other R scripts live. Then the first lines of the demo use fs to copy them to the right places.
# Run this once to copy the Python modules and data into the correct place
library(fs)
dir_copy(".zipline", "~/.zipline")
dir_copy(".virtualenvs", "/home/rstudio-user/.virtualenvs")
Typically, the contents of the home directory are not copied when a derived project is created, which means that any data to be shared needs to live in the /cloud/project directory. This is because the home directory can contain sensitive information (SSH keys, etc.).
In your case, I would look into whether it's possible to change the location of your virtualenv to a path under the /cloud/project directory.
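One way to sketch that with reticulate: reticulate resolves the virtualenv root from the WORKON_HOME environment variable, so pointing it at a folder inside /cloud/project before creating the environment should keep everything within the project. The environment name "demo-env" below is a placeholder, not from the original thread.

```r
# Sketch: keep the virtualenv under /cloud/project so it travels with
# derived copies of the project. "demo-env" is a hypothetical name.
Sys.setenv(WORKON_HOME = "/cloud/project/.virtualenvs")

library(reticulate)
virtualenv_create("demo-env")               # created under WORKON_HOME
use_virtualenv("demo-env", required = TRUE) # bind reticulate to it
```

Setting WORKON_HOME in the project's .Rprofile would make this automatic for anyone who opens a copy of the project.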
Thanks, this is what I figured. I do think I could change the virtualenv location somehow, but I would still need the data to be copied to the right place.
I've got it working with the dir_copy() calls shown above, and I'm happy enough with that! Thanks!