Hi, we have a set of Shiny apps for internal clients that share a common codebase and are differentiated by config files. The general workflow is (a rough sketch in R follows the list):
1. Retrieve some data from a Redshift database
2. Execute some R code that creates RData files with the necessary objects for the apps
3. Publish the folder (app.R, RData, config file) to shinyapps.io
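For concreteness, here is a minimal sketch of one iteration of that cycle, assuming DBI/RPostgres for the Redshift connection and rsconnect for publishing; the host, credentials, query, and folder/app names are placeholders, not our actual setup:

```r
# Minimal sketch of the extract -> RData -> deploy cycle.
# Hostname, credentials, the query, and folder/app names are placeholders.
library(DBI)

# 1. Retrieve data from Redshift (RPostgres ships a Redshift() driver)
con <- dbConnect(
  RPostgres::Redshift(),
  host     = "my-cluster.abc123.us-east-1.redshift.amazonaws.com",
  port     = 5439,
  dbname   = "analytics",
  user     = Sys.getenv("REDSHIFT_USER"),
  password = Sys.getenv("REDSHIFT_PASSWORD")
)
sales <- dbGetQuery(con, "SELECT * FROM reporting.sales_summary")
dbDisconnect(con)

# 2. Create the objects the app needs and save them as an RData file
#    inside the app folder
save(sales, file = "app_folder/data.RData")

# 3. Publish the folder (app.R, RData, config file) to shinyapps.io
rsconnect::deployApp(
  appDir      = "app_folder",
  appName     = "internal-client-dashboard",
  forceUpdate = TRUE
)
```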
These steps have generally been scheduled and run on an AWS EC2 instance. I am thinking there must be a more scalable way to do this, one consideration being autoscaling of the published app. Aside from shinyapps.io, I am aware that we could host the app on EC2, likely with Shiny Server. Are there best practices anyone uses? This is essentially similar to how one would maintain a BI workflow (e.g. Tableau): schedule a data extract that the dashboard then reads from. Thanks in advance!
If you are looking into a hosting service for your Shiny apps other than shinyapps.io, I highly recommend containerizing your apps with Docker. Regarding the scalability question, you should also look into ShinyProxy (they just released a new version recently).
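As a starting point, here is a minimal Dockerfile sketch for packaging one of these apps on top of the rocker/shiny image; the R version tag, package list, and folder name are assumptions you would swap for your own:

```dockerfile
# Minimal sketch of a Shiny app image built on rocker/shiny.
# The R version tag, package list, and folder name are assumptions.
FROM rocker/shiny:4.3.2

# Install the R packages the app depends on
RUN R -e "install.packages(c('DBI', 'RPostgres'), repos = 'https://cloud.r-project.org')"

# Copy the app folder (app.R, RData, config file) into shiny-server's app directory
COPY app_folder/ /srv/shiny-server/app_folder/

# rocker/shiny already exposes port 3838 and starts shiny-server,
# so no extra EXPOSE/CMD is needed; the app is served under /app_folder/
```

Something like `docker build -t internal-app .` followed by `docker run -p 3838:3838 internal-app` would then serve it locally, and the same image can be pointed at by ShinyProxy for scaling.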
To be honest, this blog (hosting.analythium.io) is a gold mine for anything related to Shiny app deployment. I highly recommend browsing through its articles. Do not hesitate to sign up for it as well; it's free, and some articles require you to sign up before you can read them.