Gradient Boosting

In the mid-1990s the machine learning world produced an impressive classification algorithm called AdaBoost. Statisticians at Stanford studied it and concluded that its secret sauce was something called gradient boosting. This sauce could be generalised to many problems and led to Gradient Boosting Machines (GBM) and, more recently, Extreme Gradient Boosting (XGBoost). We take a look at these developments, in particular XGBoost and its capabilities.
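
As a flavour of what working with XGBoost looks like, here is a minimal sketch (not part of the original talk description) that fits a gradient-boosted tree classifier on a toy dataset, assuming the Python xgboost and scikit-learn packages are installed; the dataset and hyperparameters are illustrative choices only.

    # Minimal illustrative example; hyperparameters are placeholders, not recommendations.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score
    from xgboost import XGBClassifier

    # Load a small binary-classification dataset bundled with scikit-learn.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Fit a gradient-boosted tree ensemble.
    model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
    model.fit(X_train, y_train)

    # Evaluate on the held-out split.
    preds = model.predict(X_test)
    print("Test accuracy:", accuracy_score(y_test, preds))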
