I'm currently experiencing some issues with sparklyr. I'm on my local Mac connecting to Databricks.
When I call sparklyr::sdf_copy_to, I get a Java heap error, which I believe is related to my driver node. I can copy a small subset of the data frame over to Databricks, but when I copy the full amount (only around 100,000 rows) I get a Java heap space error.
I tried to reconfigure my Spark config to increase the driver memory to 4g and then 8g, but I end up getting an "initiate Hive session" error instead.
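For reference, what I tried looks roughly like this (the memory value and connection method are illustrative; my actual connection details differ):

```r
library(sparklyr)

# Build a config and raise the driver memory before connecting.
# Note: spark.driver.memory must be set at connect time, not on a
# running session; sparklyr.shell.driver-memory is the equivalent
# knob when the driver is launched locally.
conf <- spark_config()
conf$spark.driver.memory <- "8g"
conf$`sparklyr.shell.driver-memory` <- "8g"

# Hypothetical connection call; "databricks" here stands in for
# however the cluster is actually reached (e.g. Databricks Connect).
sc <- spark_connect(method = "databricks", config = conf)

# The copy that fails on the full ~100,000-row data frame:
tbl <- sdf_copy_to(sc, my_local_df, name = "my_table", overwrite = TRUE)
```

It's after adding the driver-memory lines above that the Hive session error starts appearing.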
Any ideas on how to copy data frames over reliably?
Thanks,
Chris