Dear community. We have a Standard plan on Posit. I have deployed an app that contains large single-cell datasets for different tissues. The app crashes when I try to load the largest dataset, a 27,000 x 50,000 matrix. What are the runtime memory limits? I cannot find them documented anywhere. Do they depend on the plan? Is it possible to increase them, given that we have a paid account? Thank you in advance.
Hi,
Welcome to the RStudio community!
Here is an article that might answer some of your questions.
That said, I think the more elegant solution would be to rethink how you read and process the data. I don't know what format the data is currently stored in, but you might want to consider one that allows dynamic (on-disk) access instead of loading everything into memory at once. Note that a dense 27,000 x 50,000 numeric matrix alone takes roughly 10 GB of RAM (27,000 × 50,000 × 8 bytes), which is already more memory than most hosted instances provide.

Also, be sure to check where your data is loaded. A large dataset should be loaded in the global environment (i.e., outside the server function) so it only has to be loaded once and is shared by all sessions; see the sketch below.

Additionally, you can consider pre-processing the data (based on what the output will be) so you can avoid expensive computations at runtime.
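To make the first two points concrete, here is a minimal sketch of both patterns together: the matrix lives on disk in HDF5 and only small slices are ever read into memory, and the object is created once in the global environment rather than per session. The file name "expression.h5" and the dataset name "counts" are placeholders for however you would write your own matrix; HDF5Array is a Bioconductor package.

```r
library(shiny)
library(HDF5Array)  # Bioconductor; exposes an on-disk HDF5 matrix as a DelayedArray

# Runs once at app startup, outside server(), so all sessions share it.
# Placeholders: write your matrix beforehand with something like
#   writeHDF5Array(mat, "expression.h5", name = "counts")
# No data is read here -- this object is essentially a pointer into the file.
mat <- HDF5Array("expression.h5", name = "counts")

ui <- fluidPage(
  numericInput("gene", "Gene (row index)", value = 1, min = 1, max = nrow(mat)),
  plotOutput("hist")
)

server <- function(input, output, session) {
  output$hist <- renderPlot({
    # Only the selected row (~50k values) is realized into memory,
    # never the full 27k x 50k matrix.
    expr <- as.numeric(as.matrix(mat[input$gene, , drop = FALSE]))
    hist(expr, main = paste("Expression of gene", input$gene), xlab = "Expression")
  })
}

shinyApp(ui, server)
```

If you prefer to stay closer to the raw HDF5 file, rhdf5::h5read() with its index argument gives you the same kind of slicing without the DelayedArray layer.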
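And for the pre-processing idea, the pattern is to do the heavy work once in an offline script and ship only the small results with the app. A sketch with hypothetical file names and summaries; the right summaries depend entirely on what your app actually displays:

```r
# One-off script, run locally before deployment -- not part of the app itself.
mat <- readRDS("raw_matrix.rds")  # hypothetical: the full 27k x 50k matrix

# Keep only what the app will actually show.
summaries <- list(
  gene_means  = rowMeans(mat),
  cell_totals = colSums(mat)
)

# A few MB instead of ~10 GB; the deployed app just calls readRDS("summaries.rds").
saveRDS(summaries, "summaries.rds")
```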
Hope this helps,
PJ