I'm using RStudio Cloud to scrape a bunch of PDFs from a website.
Each PDF is reasonably large (about 20 MB) and there are hundreds of them, so I regularly need to download the scraped PDFs to my local computer and then delete them from the RStudio Cloud project to stay under the 1 GB limit.
I am following Josh's response on exporting here: Exporting datasets from RStudio Cloud. But given how often I need to do this, I'm wondering whether I can improve on doing this download/delete process manually. For instance, is there a way to save to my local computer as I go?
Additionally, if I were to get a paid shinyapps.io account, would this allow me to go over the 1 GB limit, or is the 'you will not encounter these space limits' comment in the Guide just referring to the number of members rather than the storage restriction?
Finally, even when the total size of the files is well under 1 GB (say 500 MB), the workspace seems to crash if I try the manual download described above. But if I grab about 10 of the PDFs at a time, it works, so I'm guessing it's the size that is causing the crash. Is there an alternative to downloading via 'More' -> 'Export' and then deleting via 'Delete' that might avoid this problem?
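For context, here is roughly how I could script the batch-of-10 workaround from the RStudio Cloud terminal instead of clicking through the IDE. This is only a sketch of the workflow I described above; the `pdfs/` directory name, the batch size of 10, and the demo setup are assumptions, and it relies on GNU tar's `--remove-files` to delete originals as they're archived:

```shell
# Demo setup: create a folder of dummy "PDFs" standing in for the scraped files.
mkdir -p pdfs
for i in $(seq -w 1 25); do echo "dummy content" > "pdfs/doc_${i}.pdf"; done

# Split the file list into batches of 10, then bundle each batch into one
# small archive, deleting the originals as they are archived so the project
# stays under the storage limit. Each archive can then be exported on its own.
ls pdfs/*.pdf | split -l 10 - batchlist_
b=1
for list in batchlist_*; do
  tar -czf "batch_${b}.tar.gz" -T "$list" --remove-files
  rm "$list"
  b=$((b + 1))
done

# 25 files in batches of 10 leaves three archives and an empty pdfs/ folder.
ls batch_*.tar.gz
```

Exporting one small `batch_*.tar.gz` at a time would mimic the 10-at-a-time approach that doesn't crash, but I'd still be doing the export step manually, hence the question.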