Hello. I am currently developing an R Shiny application which I have put into a package. Development has ground to a halt because devtools::load_all() routinely takes over 30 minutes to run and reload my package. I suspect this is due to the amount of data I have included inside the package: I have added large chunks of data at several points during development, and each addition has made the development experience noticeably worse.
This data is used in various places in my Shiny application for analysis and visualization. I have 45 .rda objects stored in my data/ folder, which bring the folder size to ~700 MB. I used the usethis package to store and document this data.
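For context, each dataset was added roughly along these lines (the object and file names below are placeholders, not my actual data):

```r
# Build the object from a raw source file, then let usethis save it
# as a compressed .rda in data/ so it ships with the package.
my_dataset <- read.csv("data-raw/my_dataset.csv")  # placeholder path
usethis::use_data(my_dataset, overwrite = TRUE, compress = "xz")
```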
Has anyone else experienced a significant slowdown in package development when including many datasets? Is there a better way to handle this data?
One option I have considered is creating an external package whose sole purpose is to hold the data (R packages - internal and external data - coolbutuseless). My thinking is that this would speed up development of my Shiny package, since the data objects would no longer live inside it. Because I would rarely update the new "data only" package, a 30 minute devtools::load_all() call for that package would not be a big deal.
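A minimal sketch of what I have in mind, assuming a hypothetical data package named myappdata that sets LazyData: true in its DESCRIPTION (so its datasets can be referenced by namespace):

```r
# In the Shiny app package, DESCRIPTION would declare the dependency:
#   Imports: myappdata
# and the app code would reference the data objects directly, e.g.:
server <- function(input, output, session) {
  output$plot <- shiny::renderPlot({
    # my_dataset is a placeholder for one of the 45 .rda objects,
    # now loaded lazily from the data-only package
    plot(myappdata::my_dataset)
  })
}
```

With this split, devtools::load_all() on the app package would only need to reload code, not the ~700 MB of data. Does this approach sound reasonable, or is there a more standard pattern?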