IWSM 2014: Short Course
The short course on "Boosting for statistical modelling" will be given by Benjamin Hofner, Andreas Mayr and Matthias Schmid (Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany) on July 13, 2014.
Course contents
In recent years, gradient boosting has become an important tool for building regression models in the generalized additive model (GAM) framework. A key feature of gradient boosting is its intrinsic variable selection mechanism, which allows feature selection and GAM estimation to be carried out simultaneously. As a consequence, boosting can conveniently be applied to high-dimensional data with a large covariate space.
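For illustration, a minimal sketch of such a fit with the mboost package is given below; the simulated data and variable names are purely illustrative and not part of the course material.

library(mboost)

## simulated example data: x3 is pure noise and should ideally not be selected
set.seed(1)
n   <- 200
dat <- data.frame(x1 = runif(n), x2 = runif(n), x3 = runif(n))
dat$y <- sin(2 * pi * dat$x1) + 0.5 * dat$x2 + rnorm(n, sd = 0.3)

## additive model with smooth (P-spline) base-learners for each covariate
mod <- gamboost(y ~ bbs(x1) + bbs(x2) + bbs(x3), data = dat,
                control = boost_control(mstop = 100))

## choose the number of boosting iterations by cross-validation (early stopping)
cv  <- cvrisk(mod)
mod <- mod[mstop(cv)]

## which base-learners were selected during the boosting iterations?
table(selected(mod))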
Gradient boosting, however, is not restricted to classical GAMs, but is also one of the most important estimation schemes for "beyond mean regression" methods such as quantile and expectile regression. Furthermore, boosting has been extended to fit generalized additive models for location, scale and shape (GAMLSS).
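As a sketch of how the same machinery extends beyond the mean, the illustrative fit from above could be turned into a boosted median regression simply by switching the loss function:

## boosted median (quantile) regression: same base-learners, different loss
mod_med <- gamboost(y ~ bbs(x1) + bbs(x2) + bbs(x3), data = dat,
                    family = QuantReg(tau = 0.5),
                    control = boost_control(mstop = 200))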
In this tutorial, we introduce the main theoretical characteristics of boosting methods and explain how to use them to fit statistical regression models in practice. Special focus will be placed on the R add-on packages mboost and gamboostLSS, which together provide a comprehensive implementation of model-based boosting algorithms for statistical modelling.
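A similarly hedged sketch of a GAMLSS-type fit with gamboostLSS, modelling both the mean and the standard deviation of the illustrative Gaussian response from above:

library(gamboostLSS)

## separate additive predictors for the distribution parameters mu and sigma
mod_lss <- gamboostLSS(y ~ bbs(x1) + bbs(x2) + bbs(x3), data = dat,
                       families = GaussianLSS(),
                       control = boost_control(mstop = 100))

## estimated effects, returned per distribution parameter
coef(mod_lss)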