Can a vetiver endpoint return class probabilities?

I have a binary classification model I've developed and deployed to our enterprise instance of Connect using vetiver_deploy_rsconnect().

Everything works as expected: running the code below returns a 10x1 tibble with one column called .pred_class.

predict(
  object = endpoint,
  new_data = data %>% sample_n(10),
  httr::add_headers(Authorization = paste("Key", apiKey))
)

I am trying to get the class probabilities from my model as well, and I thought adding type = "prob" inside predict() would return two columns, .pred_0 and .pred_1, but that's not the case.

Is there a way I can specify the type of prediction I want the vetiver endpoint to return?

Yes, you'll need to specify what kind of predictions you want when you create your API, not when you call the API. This is because when you create your API, you are specifying what code will be run when you call it later. So something like this:

library(vetiver)
library(plumber)

mtcars_glm <- glm(mpg ~ ., data = mtcars)
v <- vetiver_model(mtcars_glm, "cars_glm")

pr() |> vetiver_api(v, type = "response")
#> # Plumber router with 4 endpoints, 4 filters, and 1 sub-router.
#> # Use `pr_run()` on this object to start the API.
#> ├──[queryString]
#> ├──[body]
#> ├──[cookieParser]
#> ├──[sharedSecret]
#> ├──/logo
#> │  │ # Plumber static router serving from directory: /Library/Frameworks/R.framework/Versions/4.3-arm64/Resources/library/vetiver
#> ├──/metadata (GET)
#> ├──/ping (GET)
#> ├──/predict (POST)
#> └──/prototype (GET)

Created on 2024-01-08 with reprex v2.0.2

If you are using vetiver_deploy_rsconnect() or vetiver_prepare_docker(), you can pass this through via predict_args = list(type = "response").
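For the binary classification case in the original question, the same idea applies with type = "prob". A minimal sketch, assuming a fitted tidymodels classification workflow (the object name my_fitted_workflow, the model name, and the board are placeholders):

```r
library(vetiver)
library(plumber)

# `my_fitted_workflow` stands in for your fitted binary classification model
v <- vetiver_model(my_fitted_workflow, "my_classifier")

# For a local API, request class probabilities at creation time, so the
# endpoint returns .pred_0 and .pred_1 instead of .pred_class:
pr() |> vetiver_api(v, type = "prob")

# When deploying to Connect, pass the same argument through `predict_args`
# (board and name are assumed to match your pin):
vetiver_deploy_rsconnect(
  board = board,
  name = "user/my_classifier",
  predict_args = list(type = "prob")
)
```

After redeploying, the same predict() call from the question should return the probability columns without any type argument on the client side, because the prediction type is baked into the API.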

Perfect! Thank you so much.


Have you checked the configuration of the deployed endpoint using vetiver_deploy_rsconnect to ensure it supports returning class probabilities?
