group_split function & sparklyr

Hi all,

I wonder if there is a function like dplyr::group_split() that works in a sparklyr context (method = "databricks") with a Spark DataFrame. ChatGPT says group_split() works fine, but I'm getting the error:

"no applicable method for 'group_split' applied to an object of class "c('tbl_spark', 'tbl_sql', 'tbl_lazy', 'tbl')""

Here's an example:

sc <- spark_connect(method = "databricks")   # my Spark connection

df <- data.frame(V1 = c("a", "a", "b", "b", "c"))

df %>%
  copy_to(sc, .) %>%
  dplyr::group_by(V1) %>%
  dplyr::group_split()
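In case it's useful context, here is the kind of workaround I've been experimenting with: since group_split() has no method for tbl_spark objects, I collect the distinct group keys to the driver and then filter the Spark table once per key. This is only a sketch (the list elements stay as lazy Spark tables, and collecting keys assumes the number of groups is small), so I'm not sure it's the idiomatic approach:

```r
library(sparklyr)
library(dplyr)

sc <- spark_connect(method = "databricks")

# copy the local data frame to Spark
sdf <- copy_to(sc, data.frame(V1 = c("a", "a", "b", "b", "c")),
               name = "df", overwrite = TRUE)

# group_split() is not implemented for tbl_spark, so split manually:
# pull the distinct keys (this collects them locally), then build one
# filtered lazy Spark table per key
keys <- sdf %>% distinct(V1) %>% pull(V1)
splits <- lapply(keys, function(k) sdf %>% filter(V1 == k))
```

Each element of splits is still a remote Spark table, so downstream steps stay distributed until you call collect(). But I'd prefer a native group_split() equivalent if one exists.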

Can anyone help with this?

Thanks a lot!
