Using Ollama with {mall}

Hi,

I am trying to run an LLM locally using Ollama and {mall}.

Any ideas why I might be getting the error below?

library(tidyverse)
library(mall)

# Download Ollama from https://ollama.com/

ollamar::pull("llama3.2")
#> <httr2_response>
#> POST http://127.0.0.1:11434/api/pull
#> Status: 200 OK
#> Content-Type: application/x-ndjson
#> Body: In memory (1006 bytes)

llm_use(
  backend = "ollama",
  model = "llama3.2"
) 
#> 
#> ── mall session object
#> Backend: ollama
#> LLM session: model:llama3.2
#> R session:
#> cache_folder:C:\Users\david\AppData\Local\Temp\RtmpWuAXUO\_mall_cache41483262585a

data("reviews")

reviews |> 
  mall::llm_sentiment(review)
#> Error in `mutate()`:
#> ℹ In argument: `.sentiment = llm_vec_sentiment(x = review, options =
#>   options, additional_prompt = additional_prompt)`.
#> Caused by error in `httr2::req_perform()`:
#> ! HTTP 500 Internal Server Error.

Created on 2025-01-27 with reprex v2.1.1
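An HTTP 500 means the request reached the Ollama server but generation failed on the server side, so it helps to rule out the server and model before suspecting {mall}. A minimal check using {ollamar} (the `generate()` call is a sketch to isolate the failure outside {mall}; argument names follow the ollamar package):

```r
library(ollamar)

# Confirm the local Ollama server responds (default http://127.0.0.1:11434)
test_connection()

# List the models the server has pulled; "llama3.2" should appear here
list_models()

# Try a one-off generation directly, bypassing {mall} entirely
resp <- generate("llama3.2", "Say hello", output = "text")
print(resp)
```

If `generate()` also returns a 500, the problem is on the Ollama side (model not fully pulled, out of memory, or a stale server process) rather than in {mall}.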

Update your libraries (e.g. Rcpp).
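A quick sketch of that fix, assuming you want to refresh any outdated compiled packages rather than just Rcpp:

```r
# Refresh all outdated packages without prompting for each one
update.packages(ask = FALSE)

# Or reinstall just the package mentioned above
install.packages("Rcpp")
```

Restarting the R session afterwards ensures the newly installed binaries are the ones actually loaded.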
