Hello,
I have a Shiny app, currently deployed locally, that loads several large biological datasets (~2 GB each). The datasets are loaded outside the server function, and the user selects which one to use. The selected dataset is stored in a reactiveVal, which is passed to several modules for analysis.
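Roughly, the structure looks like this (a simplified sketch; the file names and the analysis module are placeholders for my real code):

library(shiny)

datasets <- list(
  a = readRDS("dataset_a.rds"),  # ~2 GB each, loaded once at startup
  b = readRDS("dataset_b.rds")
)

ui <- fluidPage(
  selectInput("dataset", "Dataset", choices = names(datasets)),
  analysisUI("analysis")
)

server <- function(input, output, session) {
  current_data <- reactiveVal()
  # store the selected dataset and hand it to the analysis modules
  observeEvent(input$dataset, current_data(datasets[[input$dataset]]))
  analysisServer("analysis", data = current_data)
}

shinyApp(ui, server)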
When I run the app locally, the datasets load as expected on startup, but after terminating the app and calling mem_used(), memory usage remains at the level it reached while the app was running. If I run the app multiple times, memory usage after termination grows to roughly the total size of the datasets times the number of runs. The memory clears completely only if I restart the R session, or if I run the app as a background job instead of in the console.
I've tried calling gc(), but this does not clear the memory. Next I tried putting env <- environment() outside the server function, and then added the following code to onStop() (onStop() is registered within the onStart argument of shinyApp()):
# `datasets` is a list of all the datasets loaded
rm("datasets", envir = env)
This cleared some of the memory, but not all of it: the amount freed was equal to the combined size of the datasets that had not yet been selected by the user.
I also tried creating a simplified version of the app that just loads the datasets, and its memory cleared upon termination as expected. The original app also behaves as expected when I comment out the modules that use the datasets.
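The modules consume the data along these lines (again simplified; the real modules run heavier analyses):

analysisUI <- function(id) {
  ns <- NS(id)
  tableOutput(ns("summary"))
}

analysisServer <- function(id, data) {
  moduleServer(id, function(input, output, session) {
    output$summary <- renderTable({
      req(data())  # `data` is the reactiveVal passed in from the server
      head(data())
    })
  })
}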
Is there something about how the datasets are used in the modules that could be causing this issue?