In the book TMWR, section 18.2 (Local Explanations), the 2nd paragraph states:
> There are multiple possible approaches to understanding why a model predicts a given price for this duplex. One is a break-down explanation, implemented with the DALEX function `predict_parts()`; it computes how contributions attributed to individual features change the mean model's prediction for a particular observation...
How does this explanation change for a classification problem? Can I say:

> it computes how contributions attributed to individual features change the model's predicted probability for a particular observation...
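To make the question concrete, here is a minimal hand-rolled sketch of the break-down idea for a classifier. This is not DALEX's `predict_parts()` implementation; the model, coefficients, and data are made up. It shows that for classification the contributions telescope from the mean predicted probability to the observation's predicted probability, for one fixed ordering of features:

```python
import math

# Toy dataset: each row is (size, age). The "model" below is a hypothetical
# classifier returning the probability of an "expensive" class.
data = [(50, 30), (80, 10), (120, 5), (60, 40)]

def model(size, age):
    # made-up logistic model, purely for illustration
    z = 0.03 * size - 0.05 * age - 1.0
    return 1 / (1 + math.exp(-z))

def mean_pred(fixed):
    # average predicted probability over the dataset,
    # with any features in `fixed` held at the observation's values
    preds = []
    for size, age in data:
        s = fixed.get("size", size)
        a = fixed.get("age", age)
        preds.append(model(s, a))
    return sum(preds) / len(preds)

obs = {"size": 100, "age": 8}

baseline = mean_pred({})                       # mean model probability
after_size = mean_pred({"size": obs["size"]})  # fix size at the observation
after_both = mean_pred(obs)                    # fix size and age

contrib_size = after_size - baseline
contrib_age = after_both - after_size

# Contributions sum from the mean probability to this observation's probability
assert abs(baseline + contrib_size + contrib_age - model(**obs)) < 1e-12
```

So the proposed rephrasing seems consistent with how break-down works: for a classifier, the quantity being decomposed is the predicted probability (of a chosen class) rather than a predicted price.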