Good afternoon, I have a question. If I have a table with 40,000 records, how can I optimize my use of the gt library so that editing the table and exporting the final result doesn't take so long?
Attempting to display 40,000 rows will always take time. If you are using gt in an interactive context (HTML), pagination is essential:
library(gt)
library(tibble)
# Simulated data: 40,000 rows of random values
df <- tibble(x = rnorm(40000), y = rnorm(40000))
df |>
  gt() |>
  opt_interactive(use_pagination = TRUE)  # paginate the HTML table
This takes only a few seconds on my laptop, though a data frame with many text columns will obviously take much longer to render.
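On the export half of the question, a minimal sketch: gt's gtsave() infers the output format from the file extension, and HTML is the practical target at this size, since .png and .pdf exports render the table through a headless browser and will be far slower for 40,000 rows. The filename below is just a placeholder:
# Export the finished table; format is inferred from the extension
df |>
  gt() |>
  gtsave("big_table.html")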
An alternative solution is to use DT::datatable(), which is markedly faster. It will complain about your data being too big for a client-side table, but it will still render. For very large data sets, it may be necessary to use server-side processing.
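A minimal sketch of both approaches, assuming the same df as above (pageLength and the output name tbl are arbitrary choices, not requirements):
library(DT)
# Client-side: the full data frame is shipped to the browser in one payload
datatable(df, options = list(pageLength = 25))
For server-side processing the table has to live in a Shiny app, where server = TRUE sends only the rows for the currently visible page to the browser:
library(shiny)
library(DT)

ui <- fluidPage(DTOutput("tbl"))
server <- function(input, output) {
  # Only the visible page of df crosses the wire on each interaction
  output$tbl <- renderDT(df, server = TRUE)
}
shinyApp(ui, server)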