Quite often, I find myself making Shiny apps that do something like the following:
library(shiny)
library(DT)

# 'con' is an existing DBI connection set up elsewhere
data <- DBI::dbGetQuery(con, "select * from someTable")

ui <- fluidPage(
  DTOutput("table")
)

server <- function(input, output) {
  output$table <- renderDT({
    datatable(data)
  })
}

shinyApp(ui, server)
So -- query a table in a SQL database, save the data in the app's memory, and display it in a DT. The user can then filter the table using the built-in DT filters, download it, and so on. This works well for smaller datasets, but for a table with, say, 50 columns and 100,000 rows, the initial load of the app becomes slow.
Instead, I want to use some sort of incremental load of the data, inspired by the approach here: upon opening the app, we query only the first 10 rows or so of the table; when the user clicks 'next', we query the next 10 using OFFSET; when the user applies a filter, we append a corresponding WHERE clause to the query; and so on. The user's selections in the app would dynamically determine the query sent to the database.
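Something like this rough sketch is what I have in mind (the status and id columns are made up, 'con' is an open DBI connection, and the LIMIT/OFFSET syntax would depend on the database):

library(shiny)
library(DT)

page_size <- 10

ui <- fluidPage(
  textInput("filter_status", "Status"),
  actionButton("prev_page", "Previous"),
  actionButton("next_page", "Next"),
  DTOutput("table")
)

server <- function(input, output) {
  page <- reactiveVal(0)
  observeEvent(input$next_page, page(page() + 1))
  observeEvent(input$prev_page, page(max(page() - 1, 0)))

  # go back to the first page whenever the filter changes
  observeEvent(input$filter_status, page(0))

  page_data <- reactive({
    where <- ""
    params <- NULL
    if (nzchar(input$filter_status)) {
      # the filter value goes in as a bound parameter,
      # not pasted into the SQL string
      where <- "WHERE status = ?"
      params <- list(input$filter_status)
    }
    sql <- paste(
      "SELECT * FROM someTable", where,
      "ORDER BY id LIMIT", page_size, "OFFSET", page() * page_size
    )
    DBI::dbGetQuery(con, sql, params = params)
  })

  output$table <- renderDT({
    # dom = "t" hides DT's own client-side pagination controls,
    # since the paging happens server-side here
    datatable(page_data(), options = list(dom = "t"))
  })
}

shinyApp(ui, server)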
I can do some of this by taking the same approach as in the link above, but there's a fair amount of work involved in building the dynamic query, setting up the filters, and so on. Has anyone tried something similar -- or maybe written a package that handles this? Any 'good practice' tips or alternative approaches? Basically, I don't care too much about the implementation; I would just like to display data from a large table without loading all of it at once.
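One alternative I've looked at is dbplyr, which keeps the table as a lazy reference and translates filters and row limits into the SQL itself (the status column is again made up):

library(dplyr)
library(dbplyr)

# lazy reference -- nothing is fetched from the database yet
lazy_tbl <- tbl(con, "someTable")

# filter() becomes a WHERE clause and head() becomes a LIMIT;
# only collect() actually runs the query and pulls the rows
first_page <- lazy_tbl %>%
  filter(status == "open") %>%
  head(10) %>%
  collect()

As far as I can tell, though, dbplyr has no direct OFFSET verb, so the paging part would still need raw SQL or a window-function workaround, and I'd still have to wire the table's filter inputs up to filter() calls myself.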