Error-handling Mechanism in Plumber API

Hello everyone,

I'm currently developing an R Plumber API that essentially implements an ETL process. While I've made progress, I've run into some challenges with error handling.

Here is the situation when the API receives an unexpected request:

  1. The error-handling mechanism, a function called main, works as expected when I run it from the console.
  2. While the API is running, I can see the error message being logged correctly on the server side.

However, when I make a request from Postman, all I receive is a generic 'An exception occurred' message. What I actually need is the specific result defined in the error handler.

Could someone provide assistance with this matter?

Please find the details below.

### Main function that returns the requirements
main <- function(inputs){
  
  tryCatch({
    
    ....
    
    results = list(success_code = success_code, 
                   data = res,
                   run_time = as.numeric(end_time - start_time),
                   error_message = NULL)
    
    return(results)
    
  }, error = function(e){
    
    results = list(success_code = -1,
                   data = NULL,
                   run_time = NULL,
                   error_message = e)
    
    return(results)
    
  })
}

Then, I create a Plumber API with the following code:

library(plumber)
library(jsonlite)
library(reprex)

source('./functions.R')

#* @apiTitle ...
#* @apiDescription ...
#* @apiDescription ...
#* @apiVersion 1.0

#* @plumber
function(pr) {
  pr %>%
    pr_set_debug(TRUE)
}

#* Echo back the input
#* @param msg The message to echo
#* @get /echo
function() {
    list(Status = 'CropMap Field Filtering API is working properly!',
         Time = Sys.time())
}

#* Log some information about the incoming request
#* @filter logger
function(req){
  cat(as.character(Sys.time()), "-",
      req$REQUEST_METHOD, req$PATH_INFO, "-",
      req$HTTP_USER_AGENT, "@", req$REMOTE_ADDR, "\n")
  
  forward()
}

#* Get model input
#* @filter /field_filter
function(req, res){
  
  if(grepl("field_filter", req$PATH_INFO)){
    req$data <- tryCatch(jsonlite::fromJSON(req$postBody), 
                         error = function(x){
                           return(NULL)
                         })
    
    if(is.null(req$data)){
      res$status <- 400
      return(
        list(error = "No JSON file is found in the request")
      )
    }
    
    list_of_inputs <- req$data
    
    req$results <- main(inputs = list_of_inputs$inputs)
  }
  
  forward()
  
}

#* ...
#* @post /field_filter/report
function(req){
  
  res <- req$results
  res
}

Finally, based on this Plumber API, I serve the API with the following script:

library(plumber)
library(logger)

log_dir <- "logs"

if (!fs::dir_exists(log_dir)) fs::dir_create(log_dir)
log_appender(appender_tee(tempfile("plumber_", log_dir, ".log")))

convert_empty <- function(string) {
  if (string == "") {
    "-"
  } else {
    string
  }
}

r <- plumb("./plumber.R")


r$registerHooks(
  list(
    preroute = function() {
      # Start timer for log info
      tictoc::tic()
    },
    postroute = function(req, res) {
      end <- tictoc::toc(quiet = TRUE)
      # Log details about the request and the response
      log_info('{convert_empty(req$REMOTE_ADDR)} "{convert_empty(req$HTTP_USER_AGENT)}" {convert_empty(req$HTTP_HOST)} {convert_empty(req$REQUEST_METHOD)} {convert_empty(req$PATH_INFO)} {convert_empty(res$status)} {round(end$toc - end$tic, digits = getOption("digits", 5))}')
    }
  )
)

r$run(port = 8181, host = "0.0.0.0")
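
For reference, this is roughly how a request can be sent from R instead of Postman (a sketch only; the fields inside inputs depend on functions.R and are made up here):

library(httr)

# POST a JSON body of the shape the /field_filter filter expects:
# a top-level "inputs" element holding the model inputs.
resp <- POST(
  "http://localhost:8181/field_filter/report",
  body = list(inputs = list(field_id = 1)),  # "field_id" is a hypothetical input
  encode = "json"
)

content(resp)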

Did you activate debug?

Set the debug value to include error messages when routes cause an error: pr_set_debug • plumber (rplumber.io)

r$run(port = 8181, host = "0.0.0.0", debug = TRUE)

I tried that; however, it does not help.

r$run(port = 8181, host = "0.0.0.0", debug = TRUE)

You should be able to see the error on the server side. I do not know if the error occurs in the filter.
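
One way to narrow that down (just a sketch, not from the original code) is to wrap the work done inside the filter in its own tryCatch, log the condition message, and re-raise it so the normal error handling still applies:

# Sketch: inside the /field_filter filter, replace the call to main() with
req$results <- tryCatch(
  main(inputs = list_of_inputs$inputs),
  error = function(e) {
    # This prints on the server side, so you can tell the failure
    # happened in the filter rather than in the endpoint.
    cat("error inside filter:", conditionMessage(e), "\n")
    stop(e)  # re-raise so plumber still returns an error response
  }
)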

I'm also wondering why you use a dedicated filter to parse the JSON body. Plumber has an internal mechanism to handle JSON request bodies.

With the provided code, I could identify two things.

Use conditionMessage() on e to extract the message from the error condition. Otherwise, the response will fail to convert to JSON, because there is no toJSON S3 method for the condition class.
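
For instance (a quick illustration, not from the original post), serializing the condition object itself fails, while the extracted message is a plain character vector and serializes cleanly:

library(jsonlite)

e <- simpleError("something went wrong")

# toJSON(list(error_message = e)) errors, because jsonlite has no
# serialization method for objects of class "condition".

# Extracting the message first works:
toJSON(list(error_message = conditionMessage(e)), auto_unbox = TRUE)
# {"error_message":"something went wrong"}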

### Main function that returns the requirements
main <- function(inputs){
  results = tryCatch({
     list(success_code = 1, 
          data = res,
          run_time = NULL,
          error_message = NULL)
  }, error = function(e){
    list(success_code = -1,
         data = NULL,
         run_time = NULL,
         error_message = conditionMessage(e))
  })
}

I do not think you need a filter to parse json.

library(plumber)
library(jsonlite)
library(reprex)

source('./functions.R')

#* @apiTitle ...
#* @apiDescription ...
#* @apiDescription ...
#* @apiVersion 1.0

#* @plumber
function(pr) {
    pr %>%
        pr_set_debug(TRUE)
}

#* Echo back the input
#* @param msg The message to echo
#* @get /echo
function() {
    list(Status = 'CropMap Field Filtering API is working properly!',
         Time = Sys.time())
}

#* Log some information about the incoming request
#* @filter logger
function(req){
    cat(as.character(Sys.time()), "-",
        req$REQUEST_METHOD, req$PATH_INFO, "-",
        req$HTTP_USER_AGENT, "@", req$REMOTE_ADDR, "\n")
    
    forward()
}

#* ...
#* @param inputs:list
#* @post /field_filter/report
function(inputs, req){
    main(inputs)
}

I think the last part should be ok.
Good work, you are almost there.

Thanks @meztez,

This solves the issue.

list(success_code = -1,
     data = NULL,
     run_time = NULL,
     error_message = conditionMessage(e))

I handle input parsing within the filter function because it accommodates multiple inputs, some of which may be optional. However, your message has made me reconsider whether this approach is indeed necessary.
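
For instance, the optional inputs could also be expressed directly as endpoint arguments with default values, letting plumber's body parsing do the work (a rough sketch; the parameter names here are made up):

#* ...
#* @post /field_filter/report
function(req, aoi, start_date = NULL, crop_type = "all") {
  # "aoi" is required; "start_date" and "crop_type" are optional and fall
  # back to their defaults when missing from the JSON body.
  main(inputs = list(aoi = aoi, start_date = start_date, crop_type = crop_type))
}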

Thank you very much.
