Hello
I am working on a Shiny app (in R) that renders large, complex ggplot2 visualizations from real-time data. The issue: when users interact with filters or change time ranges, the app noticeably lags or freezes for a few seconds, especially once the dataset exceeds 100K rows.
I have tried renderPlot() with req() and isolate(), and even caching parts of the data, but performance still isn't smooth. Has anyone successfully optimized high-volume plot rendering in Shiny?
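For context, here is a simplified, self-contained version of my current approach (I've dropped the isolate() bits). get_live_data() just fabricates data here to stand in for my real-time source, and the input names are placeholders; the caching is Shiny's bindCache() keyed on the time range:

```r
library(shiny)
library(ggplot2)

# Stand-in for my real-time source; in the real app this hits a live feed.
get_live_data <- function() {
  ts <- seq(Sys.time() - 86400, Sys.time(), length.out = 1e5)
  data.frame(timestamp = ts, value = cumsum(rnorm(length(ts))))
}

ui <- fluidPage(
  sliderInput("time_range", "Time range",
              min = Sys.time() - 86400, max = Sys.time(),
              value = c(Sys.time() - 3600, Sys.time())),
  plotOutput("main_plot")
)

server <- function(input, output, session) {
  filtered <- reactive({
    req(input$time_range)
    df <- get_live_data()  # ~100K+ rows in the real app
    df[df$timestamp >= input$time_range[1] &
       df$timestamp <= input$time_range[2], ]
  })

  output$main_plot <- renderPlot({
    ggplot(filtered(), aes(timestamp, value)) + geom_line()
  }) |>
    bindCache(input$time_range)  # reuse rendered plots for repeated ranges
}

shinyApp(ui, server)
```

bindCache() helps when users revisit the same range, but the first render of each new range is still slow, which is where most of the lag comes from.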
I'm wondering if switching to plotly or r2d3, or even rendering to a static image (like a PNG) and embedding it, would reduce the load. I would also love to know whether people have used reactive data sampling or memoization in large dashboards without losing too much accuracy. Performance tuning for Shiny is tricky when visuals are central to the user experience.
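To make the sampling/memoization idea concrete, this is the rough pattern I have in mind. fetch_from_feed() is a stand-in for my real live query, and the 50K row cap and 300 ms debounce are numbers I guessed, not tested values:

```r
library(shiny)
library(ggplot2)
library(dplyr)
library(memoise)

# fetch_from_feed() is a stand-in for my real live query; it fakes data here.
fetch_from_feed <- function(start, end) {
  ts <- seq(start, end, length.out = 2e5)
  data.frame(timestamp = ts, value = cumsum(rnorm(length(ts))))
}

# Memoised wrapper: repeated requests for the same range reuse the result.
get_range <- memoise(fetch_from_feed)

ui <- fluidPage(
  sliderInput("time_range", "Time range",
              min = Sys.time() - 86400, max = Sys.time(),
              value = c(Sys.time() - 3600, Sys.time())),
  plotOutput("main_plot")
)

server <- function(input, output, session) {
  # Debounce so dragging the slider fires one redraw, not dozens.
  time_range <- debounce(reactive(input$time_range), 300)

  sampled <- reactive({
    req(time_range())
    df <- get_range(time_range()[1], time_range()[2])
    # 50K is an arbitrary cap I picked; tune against acceptable accuracy.
    if (nrow(df) > 50000) slice_sample(df, n = 50000) else df
  })

  output$main_plot <- renderPlot({
    ggplot(sampled(), aes(timestamp, value)) + geom_line()
  })
}

shinyApp(ui, server)
```

One caveat I'm already aware of: memoising a live feed serves stale data, so I would probably give the cache a short expiry (memoise accepts a cachem cache with a max_age). Curious whether this pattern holds up in practice.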
I have already read the Shiny article "Scaling and Performance Tuning with shinyapps.io" for reference. A colleague new to R recently asked me what the R programming language is good for, and this app became the perfect example: it shows R's power in analytics, but also the challenges of UI performance at scale.
If anyone has insights or working patterns to reduce plot rendering time, I would really appreciate it.
Thank you!