problem with dbConnect


Hello everyone,
I am facing an issue and I would like your help.

I have an R project with a collect.R file that contains this:

getShipCurrentMonth_test <- function(conn,
                                     start_of_current_month = start_of_current_month,
                                     yesterday_date = yesterday_date) {
  message("ship store current month correct")
  query <- "select * from test"
  data <- dbGetQuery(conn, query)
  return(data)
}

I have a main.R where I call the function, but the call returns this error:

Error in dbSendQuery(conn, query) : Unable to retrieve JDBC result set
  JDBC ERROR: [Databricks][JDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: null, Query: select but***, Error message from Server: org.apache.hive.service.cli.HiveSQLException: Error running query: [_LEGACY_ERROR_TEMP_2196] org.apache.spark.SparkException: Unable to fetch tables of db cds.
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:48)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:694)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
        at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:571)

But I don't get that error when the SQL query is run directly in main.R.

I should mention that the connection with DBI and RJDBC was established successfully.
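Roughly, the setup in main.R looks like this — the driver class, JAR path, and JDBC URL below are placeholders, not the actual values:

```r
library(DBI)
library(RJDBC)

# Placeholder driver class and JAR path -- substitute the real ones
drv <- JDBC(driverClass = "com.databricks.client.jdbc.Driver",
            classPath   = "/path/to/DatabricksJDBC42.jar")

# Placeholder connection URL
conn <- dbConnect(drv, "jdbc:databricks://<host>:443/default")

source("collect.R")  # defines getShipCurrentMonth_test()

data <- getShipCurrentMonth_test(conn)

dbDisconnect(conn)
```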

Do you know why this is happening? I am fairly new to R and the documentation is not clear.
Thanks
