group_split() function & sparklyr

Hi all,

I wonder if there is a function like dplyr::group_split() that works in a sparklyr context (method = "databricks") with a Spark DataFrame. ChatGPT says group_split() works fine, but I'm getting this error:

"no applicable method for 'group_split' applied to an object of class "c('tbl_spark', 'tbl_sql', 'tbl_lazy', 'tbl')""

Here's an example:

library(sparklyr)
library(dplyr)

sc <- spark_connect(method = "databricks")   # my Spark connection

df <- data.frame(V1 = c("a", "a", "b", "b", "c"),
                 V2 = c(1, 2, 3, 3, 5),
                 V3 = c(3, 3, 4, 2, 2))

df %>%
  copy_to(sc, .) %>%
  dplyr::group_by(V1) %>%
  group_split()   # <- this is the line that errors
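For now I've been considering a workaround along these lines (a sketch only, untested here since it needs a live Databricks connection; `sdf` and `split_tbls` are just names I made up). It either collects the data locally before splitting, or builds one lazy Spark tbl per group key by hand, but I'd like to know if there's something built in:

```r
library(sparklyr)
library(dplyr)
library(purrr)

sdf <- copy_to(sc, df, overwrite = TRUE)

# Option 1: bring the (small) result back into R, then split locally,
# since group_split() does work on an ordinary grouped tibble.
local_groups <- sdf %>%
  collect() %>%
  group_by(V1) %>%
  group_split()

# Option 2: stay in Spark -- pull the distinct keys, then create
# one filtered (still lazy) tbl_spark per group.
keys <- sdf %>% distinct(V1) %>% pull(V1)
split_tbls <- map(keys, ~ filter(sdf, V1 == .x))
```

Option 2 keeps everything lazy on the cluster, which seems preferable when the groups are large, but it does one filter per key.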


Can anyone help with this?

Thanks a lot!
Filipa
