Automated workflow for downloading, updating and sharing data


I am looking to "stream" data to a shiny app so it updates weekly.

To obtain the data, I use an API that I have already connected to R with the httr package. Why not just stream it directly into the Shiny app? Because the data is too big and needs to be wrangled before it reaches the app.
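For context, the download-and-wrangle step could look something like this sketch. The endpoint, column names, and output path are all made up; only the use of httr comes from the post:

```r
library(httr)
library(jsonlite)

# Hypothetical API endpoint -- replace with the real one
resp <- GET("https://api.example.com/v1/measurements")
stop_for_status(resp)

raw <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))

# Reduce the data to what the app actually needs (hypothetical columns),
# so the object shipped to Shiny stays small
wrangle <- function(df) {
  aggregate(value ~ week, data = df, FUN = mean)
}

# Save a compact binary file the Shiny app can load with readRDS()
saveRDS(wrangle(raw), "data/clean.rds")
```

The point of `wrangle()` is that all heavy lifting happens once per week in the Action, not on every visit to the app.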

My idea in a nutshell: use GitHub Actions with a simple R script that downloads and updates the data every week in a dedicated repo. Steps:

  1. create a GitHub repo with an R package containing functions to download, clean and save the data
  2. use this repo to re-serve the data to Shiny, e.g. via a link
  3. automate weekly updates with GitHub Actions that query the aforementioned API
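The scheduled trigger in step 3 could be sketched as a workflow file like this (file name, script name, and data path are placeholders; `r-lib/actions` is the community action for installing R on the runner):

```yaml
# .github/workflows/update-data.yaml  (hypothetical name)
on:
  schedule:
    - cron: "0 6 * * 1"   # every Monday at 06:00 UTC
  workflow_dispatch:        # also allow manual runs

jobs:
  update-data:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: r-lib/actions/setup-r@v2
      - name: Download and clean the data
        run: Rscript update_data.R
      - name: Commit the refreshed data
        run: |
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add data/
          git commit -m "Weekly data update" || echo "No changes"
          git push
```

Note the runner is Linux even though you develop on Windows; the R script itself is what has to stay cross-platform.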

Potential problems I can see:

  • limited run time allowed for GitHub Actions
  • no way to share the data between the "wrangling" repo and the Shiny app if the former is private. I was thinking about hosting the data elsewhere but don't know where to start.
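For step 2, if the data repo ends up public, the Shiny app could read the committed file straight from a raw GitHub link. The URL below is a placeholder; `readRDS()` expects a local file path, so the file is downloaded to a temp location first:

```r
# Placeholder raw-content URL -- only works while the data repo is public
url <- "https://raw.githubusercontent.com/<user>/<repo>/main/data/clean.rds"

tmp <- tempfile(fileext = ".rds")
download.file(url, tmp, mode = "wb")  # binary mode matters on Windows
clean <- readRDS(tmp)
```

This keeps the Shiny code untouched apart from one URL, which is the decoupling the post is after; the private-repo case would need authenticated downloads or separate hosting.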

I think this method has the advantage of not touching the Shiny app at all, since the download-and-clean process runs in a different repo. However, I feel there is a more efficient way of doing this. Do you have any advice? Two constraints: I am a Windows user and I am only interested in open-source alternatives.

Thanks in advance!
