Best way to read big datasets using googlesheets4?

I have a relatively big dataset (60,000 rows) on Google Sheets that I use to feed a Shiny app. I've been using the googlesheets package, which does the job just fine: it reads the dataset very quickly when the Shiny app loads. However, since googlesheets will stop working in a few weeks, I'm trying to update my app to use googlesheets4. It works just fine with smaller datasets (under 2,000 rows, for example), but it takes a really long time to read the large one. This causes my Shiny app, which is hosted on shinyapps.io, to fail to load and give an error. What's the best way to deal with this? I'm using the function read_sheet("sheet_id"), by the way.
Thanks in advance!

Just to add more information:

Using the googlesheets package, the dataset takes 6.61 seconds to read.
Using the googlesheets4 package, it takes 161.64 seconds.
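In case it helps while this gets investigated, here is a rough workaround sketch, not an official fix. It assumes the sheet is shared as link-viewable (the sheet ID below is a placeholder): skipping auth with gs4_deauth() avoids token overhead for public sheets, and pulling the sheet's CSV export with readr can be much faster than the Sheets API for large data.

```r
library(googlesheets4)
library(readr)

sheet_id <- "your-sheet-id"  # placeholder: replace with your real sheet ID

# Option 1: read via googlesheets4 without authenticating
# (works only if the sheet is publicly viewable)
gs4_deauth()
dat <- read_sheet(sheet_id)

# Option 2: read the sheet's CSV export directly, which in my experience
# can be much faster for large sheets (also requires link-viewable sharing)
csv_url <- paste0(
  "https://docs.google.com/spreadsheets/d/", sheet_id,
  "/export?format=csv"
)
dat <- read_csv(csv_url)
```

Either option could be wrapped in a cached read (e.g. saveRDS() after the first fetch) so the Shiny app doesn't re-download 60,000 rows on every startup.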

Have you filed an issue or discussed this in the googlesheets4 GitHub repo?

Jenny is preparing a CRAN release, and Google is deprecating the old Sheets API in March, so this would definitely be helpful information for her to have:
https://twitter.com/JennyBryan/status/1232195759325372416?s=20
