Connecting to a remote Spark cluster from RStudio using the sparklyr package

We installed the MapR client, the sparklyr package, R, and RStudio.

We set the environment variables as below:

1. MAPR_HOME = C:\opt\mapr
2. SPARK_HOME = C:\opt\mapr\spark\spark-2.1.0-bin-hadoop2.7
3. JAVA_HOME = C:\Program Files\Java\jdk1.8.0_162
4. HADOOP_CONF_DIR = C:\opt\mapr\hadoop\hadoop-0.20.2
5. HADOOP_HOME = C:\hadoop-common-2.2.0-bin-master (for winutils.exe)
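The variables above can also be set for the current R session before loading sparklyr; a minimal sketch using the paths from the list above (adjust them for your machine):

```r
# Set the environment variables for this R session only.
# These paths are copied from the list above - they are assumptions
# about your local install, not values sparklyr requires.
Sys.setenv(
  MAPR_HOME       = "C:\\opt\\mapr",
  SPARK_HOME      = "C:\\opt\\mapr\\spark\\spark-2.1.0-bin-hadoop2.7",
  JAVA_HOME       = "C:\\Program Files\\Java\\jdk1.8.0_162",
  HADOOP_CONF_DIR = "C:\\opt\\mapr\\hadoop\\hadoop-0.20.2",
  HADOOP_HOME     = "C:\\hadoop-common-2.2.0-bin-master"  # for winutils.exe
)
```

Setting them in the session avoids depending on Windows system settings, but they must be set before `spark_connect()` is called.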

We are trying to connect to the remote Spark cluster (Hadoop cluster) from RStudio using the sparklyr package.

We are able to connect to local Spark, but unable to connect to the remote Spark cluster.

We used the R code below to connect to the remote Spark cluster:


sc <- spark_connect(master = "")

Here, IP is the remote Spark cluster IP with port number.

We are getting the error below while connecting to the remote Spark cluster:

Error in file(con, "r") : cannot open the connection
In addition: Warning message:
In file(con, "r") :
cannot open file 'C:\Users\admin\AppData\Local\Temp\RtmpWmwyXi\file3ad004c7714b6_spark.log': Permission denied
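Note that the "Permission denied" part of this error is about sparklyr's local log file under the Windows temp directory (`tempdir()` in R), not about the cluster itself. One common workaround, assuming the current user simply lacks write access to that temp folder, is to point R's temp directory at a writable location before starting R (the path `C:/spark-tmp` is only an example):

```r
# tempdir() is fixed when the R session starts, so TMPDIR must be set
# before R launches (e.g. in .Renviron) for this to take effect.
# C:/spark-tmp is an example path - create it and ensure write access.
dir.create("C:/spark-tmp", showWarnings = FALSE)
Sys.setenv(TMPDIR = "C:/spark-tmp")
```

Alternatively, running RStudio with sufficient permissions for the existing temp folder resolves the same symptom.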

Spark does not work this way.

You need to connect to a remote Spark cluster through yarn-cluster, Mesos, or Livy.

If you only want to test the functionality, please try yarn-client or local mode.
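For yarn-client mode, sparklyr does not take the cluster's IP in `master`; it reads the cluster location from the client configuration under HADOOP_CONF_DIR. A minimal sketch (the Livy URL is a placeholder, not a value from this thread):

```r
library(sparklyr)

# yarn-client mode: the driver runs in the local R session, and
# HADOOP_CONF_DIR tells Spark where the remote YARN cluster lives.
conf <- spark_config()
sc <- spark_connect(master     = "yarn-client",
                    spark_home = Sys.getenv("SPARK_HOME"),
                    config     = conf)

# Alternative: connect through a Livy server running on the cluster.
# "http://livy-host:8998" is a placeholder for your Livy endpoint.
# sc <- spark_connect(master = "http://livy-host:8998", method = "livy")
```

With Livy, nothing from the cluster needs to be installed locally, which sidesteps the Windows client setup entirely.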

Yes, we are using yarn-client.

Were you able to make it work? If yes, can you share the connection string that you used?