Hi,
I'm currently reading Applied Predictive Modeling, specifically the section on the MARS algorithm, and I have a couple of questions I hope someone can help me with.
The book describes hinge functions, which effectively partition the data in two per predictor; a linear regression is then fit on each of the partitions split by the hinge. In that book, and in The Elements of Statistical Learning, the examples (well, the diagrams; I got lost in the notation) show the predictor space being partitioned only in two. I found another website where it is partitioned into multiple regions. To my untrained eye the algorithm looks very similar to regression splines, only with pruning. Is this the case?
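To show where my understanding currently stands, here is a minimal sketch in base R on simulated data. The knot location is fixed by hand purely for illustration (as I understand it, the real algorithm searches over candidate knots), and the data-generating function is made up:

    # A single MARS hinge pair: h1(x) = max(0, x - c) and h2(x) = max(0, c - x)
    # split the predictor at the knot c, and one ordinary linear regression is
    # then fit using both hinge terms as features.
    set.seed(1)
    x <- sort(runif(100, 0, 10))
    y <- ifelse(x < 5, 2 * x, 10 + 0.5 * (x - 5)) + rnorm(100, sd = 0.5)

    knot <- 5                # assumed knot; MARS would search over candidates
    h1 <- pmax(0, x - knot)  # nonzero to the right of the knot
    h2 <- pmax(0, knot - x)  # nonzero to the left of the knot

    fit <- lm(y ~ h1 + h2)   # one regression over both hinge terms
    summary(fit)

Is this picture of a hinge pair correct, and is the similarity to a regression-spline basis more than superficial?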
Does MARS allow the building of smoothing splines?
Finally, Applied Predictive Modeling (p. 149) talks about interactions and how the algorithm adds more hinge functions within the already-partitioned data. From the book (I can remove this quote if it violates copyright):
"The search procedure attempts to find hinge functions C and D that, when multiplied by A, result in an improvement in the model; in other words, the model would have terms for A, A×B and A×C."
My question on this is: what happened to hinge function D?
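For context, here is a minimal sketch of how I have been experimenting with interactions, assuming the earth package (a common MARS implementation in R); the data and the degree = 2 setting are made up for illustration:

    # Interaction terms in MARS arise as products of hinge functions: the
    # forward pass can multiply an existing hinge term (A) by new candidate
    # hinges, yielding model terms like A and A*B.
    library(earth)

    set.seed(1)
    d <- data.frame(x1 = runif(200), x2 = runif(200))
    d$y <- with(d, pmax(0, x1 - 0.5) * pmax(0, x2 - 0.3)) + rnorm(200, sd = 0.05)

    # degree = 2 allows second-order interactions (products of two hinges)
    fit <- earth(y ~ x1 + x2, data = d, degree = 2)
    summary(fit)  # prints the selected hinge and interaction terms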
It might just be a case of me mixing up apples and oranges, but I would like to grasp the conceptual process of the model before using it in R. If you have any recommendations of blog posts or research papers that focus on the concepts, with lots of examples, I would be very grateful.
Thank you all for your time.
References
ISLR (An Introduction to Statistical Learning), Regression Splines, p. 271
ESL (The Elements of Statistical Learning), Multivariate Adaptive Regression Splines (MARS), p. 321