I've even tried removing various objects (e.g., large dataframes) prior to training the model to see if that helps with memory. It doesn't seem to have an effect. What should I do so that my session doesn't continue crashing? Thanks!
There's a 1GB memory limit on cloud projects. In the other thread the user was exceeding the memory limit (which causes the R session to terminate), and from our logging it appears that your project is hitting that as well. In general, machine learning algorithms on non-trivial data sets tend to be memory intensive, so it's tricky to get that working successfully on the cloud platform.
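For anyone else hitting this, a quick base-R sketch for seeing what is actually resident in the session (this only shows named objects, not the temporary allocations a model-fitting routine makes internally):

```r
# List objects in the global environment, largest first (base R only)
obj_names <- ls(envir = globalenv())
obj_sizes <- vapply(obj_names,
                    function(x) as.numeric(object.size(get(x, envir = globalenv()))),
                    numeric(1))
print(sort(obj_sizes, decreasing = TRUE))

# rm() alone doesn't shrink the footprint right away; run gc() and
# check the "max used" column for the session's peak memory usage
gc()
```

Note that many training routines allocate large temporary copies of the data while fitting, so peak usage can far exceed the sum of the objects you can see, which is why rm()-ing dataframes beforehand may not keep you under the limit.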
Thanks for your response. I typically do my work on a Chromebook, so I'm wondering if there's a way for me to resolve this without using RStudio Desktop. Would it be worth spinning up RStudio Cloud on a virtual machine? Would that allow me a higher memory limit? RStudio Cloud has been great given its ease of getting started, but hitting a snag like this may mean I need to shift gears.
PS: Can I just say I so appreciate how responsive you all are. I'm not sure what your SLA for responding is (or whether you even have one), but I definitely appreciate being able to post a question in the community forums knowing that I'll get a response. Simply, thank you!
You could try our marketplace offering of RStudio Server Pro on your favorite cloud provider (currently AWS, Google and Azure). You would then pay per hour for the compute and software license, and have the ability to specify how much memory and CPU you want.
We don't have an SLA; we are just really excited to share what we are working on with everyone, and we enjoy learning how people are using it. We know the current memory limit can be frustrating, and it is not always evident when it is being hit, so rest assured that we are looking into addressing both aspects.
Yep. That all makes sense. Again, appreciate your response and I'll look more into the RStudio Server Pro option (and price point given this is a hobby for me!). Resolving this conversation chain. Thank you!