The only out-of-the-box ordinal regression models I've found in R seem to be linear (vglm in VGAM and polr in MASS), but I'm looking for something that better incorporates non-linear effects, like a random forest or maybe a neural network.

Any suggestions or code examples? All advice is appreciated.

The vgam function in the VGAM package can easily include nonlinearity. There is also rpartScore for ordinal CART models.
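For instance, a cumulative-link (proportional odds) model with a smooth term might look like this. A minimal sketch; the data frame df, the response y, and the predictor x are made up for illustration:

```r
library(VGAM)

# Hypothetical data: an ordered 3-level response driven nonlinearly by x
set.seed(1)
df <- data.frame(x = runif(200))
df$y <- ordered(cut(df$x^2 + rnorm(200, sd = 0.1), 3,
                    labels = c("low", "mid", "high")))

# Cumulative-link model with a smooth (spline) term for x
fit <- vgam(y ~ s(x), family = cumulative(parallel = TRUE), data = df)

head(predict(fit, type = "response"))  # per-class probabilities
```

The s(x) term lets the effect of x on the cumulative logits be nonlinear rather than a straight line.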

Otherwise, you can convert the classes to unordered factors, fit models, then post-process the predicted probabilities to be monotonic using techniques like the pool adjacent violators algorithm (best name ever) or monotonic regression.

Beyond that, you're stuck with using splines inside a formula method, AFAICT.
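As a sketch of that spline-in-a-formula approach, here is polr from MASS with a natural cubic spline basis from the splines package (the data frame and variable names are again made up):

```r
library(MASS)      # polr()
library(splines)   # ns()

# Hypothetical data: ordered 3-level response, nonlinear in x
set.seed(2)
df <- data.frame(x = runif(200))
df$y <- ordered(cut(df$x^2 + rnorm(200, sd = 0.1), 3,
                    labels = c("low", "mid", "high")))

# Proportional-odds model; ns(x, df = 3) gives a flexible curve in x
fit <- polr(y ~ ns(x, df = 3), data = df, Hess = TRUE)

head(predict(fit, type = "probs"))  # per-class probabilities
```

This keeps the ordinal likelihood but only captures nonlinearity through whatever basis you put in the formula, which is why it feels like a stopgap compared to a true tree or network model.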

Suppose you fit a classification model and treat the classes as unordered. Suppose you have four classes (A through D) and you get class probabilities for a data point that look like:

   A    B    C    D
0.10 0.35 0.20 0.35

Since the classes are ordinal, you might expect the class probabilities not to have an up-and-down pattern; they should be monotonically increasing or decreasing across the class order. You can post-process these results to enforce that constraint.

A simple PAVA algorithm finds where the assumption is violated and adjusts those values. Here is an example using the isotone package:
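A minimal sketch using the probabilities from the table above; gpava takes the predictor order as its first argument and the values to monotonize as the second:

```r
library(isotone)

# Class probabilities from the unordered model above
probs <- c(A = 0.10, B = 0.35, C = 0.20, D = 0.35)

# Isotonic (monotone non-decreasing) fit over the class order 1:4
fit <- gpava(z = 1:4, y = probs)

fit$x
# The violating pair (B = 0.35, C = 0.20) is pooled to its mean, 0.275,
# giving the monotone sequence 0.100, 0.275, 0.275, 0.350
```

Note that mean-pooling with equal weights preserves the total, so these still sum to 1 here; in general you may want to renormalize the adjusted probabilities.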

Thanks for the reply! And your earlier reply wasn't too jargony -- I'm just not that well-versed in stats talk. Good to know, though -- I guess using ordinal regression vs. just multi-class is largely a philosophical choice in most circumstances.

Do you know if it's possible for the fitted values in gpava to be shown with more than 2 digits? I didn't see anything about that in the documentation, so I thought maybe you'd know.