Hi All,
I want to vet an approach I'm using for Git deployment and scheduled data refreshes for Shiny apps on RStudio Connect.
(I'm on Ubuntu 16.04)
A little background:
We were on Shiny Server Pro before and this year switched to RStudio Connect.
We've been super impressed: we already have ~30-40 publishers, and with RStudio Server, our entire company is either using R or consuming R reporting.
However, moving from Shiny Server Pro left us with 30-40 legacy app migrations, each with dependencies on cron and a Git-based deployment strategy.
I needed a way to let some publishers use Git to deploy to RStudio Connect and to keep managing their data via automated scripts on the server.
I found the following solution, which may help others (or which may be a giant mistake). Here are my steps:
- Create a new directory for these kinds of apps, at
/path/to/shiny-server-apps/
and migrate all the old apps there. chown this directory to a user called rstudio-connect-legacy (a rough sketch of this setup step follows the pointer-app example below).
- For each app, create a "pointer app". An example follows (in a new directory, <app_name>_pointer):
app.R
# Change Directory to the old directory
setwd('/path/to/shiny-server-apps/app_1')
# load namespaces of all dependencies
# we don't want to load libraries as that will force a load order
# let the app load its own libraries
loadNamespace('shiny')
loadNamespace('shinydashboard')
# This app has a global.R, so source it before launching
source('global.R')
# Run app in directory
shinyAppDir('.')
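As a rough sketch of the first setup step above (creating the legacy directory and handing it to rstudio-connect-legacy), this is the kind of thing I run once with root privileges; the path and user are just the placeholders from above, and the group name is assumed to match the user:

# One-time setup, run as root (e.g. sudo Rscript) or translate to the equivalent shell commands
legacy_root <- '/path/to/shiny-server-apps'

# Create the directory that will hold the migrated legacy apps
dir.create(legacy_root, recursive = TRUE, showWarnings = FALSE)

# Hand ownership of the whole tree to the legacy run-as user
# (assumes a group with the same name; adjust to your setup)
system(sprintf('chown -R rstudio-connect-legacy:rstudio-connect-legacy %s', legacy_root))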
Now, we deploy this pointer app rather than the original one, and we set rstudio-connect-legacy as the user that runs the app on the server.
Users can deploy just like they would on Shiny Server Pro, by pulling their changes into the app directory. They can also run cron scripts, because they can do whatever their permissions on the server allow. The only time the pointer app needs to be redeployed is when the package dependencies change.
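For example, a publisher might keep a data-refresh script next to the app and schedule it from the rstudio-connect-legacy crontab. The script name, data source, and schedule below are made up for illustration:

# refresh_data.R -- hypothetical data-refresh script living in the app directory
# scheduled from the rstudio-connect-legacy crontab, e.g.:
#   0 6 * * * Rscript /path/to/shiny-server-apps/app_1/refresh_data.R

# Pull fresh data from wherever it lives (this URL is a placeholder)
new_data <- read.csv('https://example.com/source.csv')

# Write it where the app expects to find it on startup
saveRDS(new_data, '/path/to/shiny-server-apps/app_1/data/latest.rds')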
My real concern here is that I was lazy and put all the apps under rstudio-connect-legacy, mainly because creating an account for each app was going to be a hassle. The result is that technically these apps can see one another and potentially interact in negative ways (the reverse is also true: they could interact in positive ways).
However, this way only an RStudio Connect admin can make an app that interacts with these, akin to how we operated on Shiny Server Pro.
Thoughts?
I have a function that generates this:
# Make a pointer app from an app directory
# run deployApp() from the rsconnect package afterwards
make_pointer_app <- function(from = getwd(),
                             # default: create the pointer directory next to the app,
                             # named <app_dir>_pointer (trailing slashes stripped)
                             to_dir = sprintf('%s_pointer', sub('/+$', '', from)),
                             output_to_file = TRUE, ...) {

  files <- list.files(from)

  # A shiny app needs either ui.R + server.R, or a single app.R
  if (!all(c('ui.r', 'server.r') %in% tolower(files)) &&
      !('app.r' %in% tolower(files))) {
    stop('Not a shiny app: ', from)
  }

  if (dir.exists(to_dir)) {
    stop('Directory already exists: ', to_dir)
  }

  message('migrating ', from)

  base <- expression()

  # First statement: change into the legacy app directory
  base[[length(base) + 1]] <- substitute(setwd(x), list(x = from))

  # Find the app's package dependencies
  depends <- rsconnect::appDependencies(from)

  # For each dependency, load its namespace (not the full library)
  # so the pointer app declares the dependency without forcing a load order
  for (i in seq_along(depends$package)) {
    base[[length(base) + 1]] <-
      substitute(loadNamespace(x), list(x = depends$package[i]))
  }

  # If a global file exists, source it before running the app
  if (any(tolower(files) == 'global.r')) {
    base[[length(base) + 1]] <-
      substitute(source(x), list(x = files[tolower(files) == 'global.r'][1]))
  }

  # Finally, run the app from the (now current) legacy directory
  base[[length(base) + 1]] <- substitute(shinyAppDir('.'))

  if (output_to_file) {
    dir.create(to_dir)
    writeLines(as.character(base), file.path(to_dir, 'app.R'))
  } else {
    return(base)
  }
}
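Usage then looks roughly like this (the paths are placeholders, and deployApp() assumes rsconnect is already configured with your Connect server and account):

# Generate the pointer app next to the legacy app directory...
make_pointer_app(from = '/path/to/shiny-server-apps/app_1')

# ...then publish the pointer app (not the original) to Connect
rsconnect::deployApp(
  appDir  = '/path/to/shiny-server-apps/app_1_pointer',
  appName = 'app_1'
)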