XGBoost: The Machine Learning Algorithm That Is Winning Kaggle Competitions

Details

Most Kaggle competitions are won using one of two techniques. The first is deep learning. The second is XGBoost.

This talk is being given by the maintainer of the XGBoost R package.

Title:

XGBoost: A package for fast and accurate gradient boosting

Abstract:

XGBoost is a multi-language library designed and optimized for tree boosting algorithms. The underlying algorithm is an extension of the classic gradient boosting machine. By employing multi-threading and imposing regularization, XGBoost can harness more computational power and produce more accurate predictions than the traditional version. It also provides a friendly user interface and comprehensive documentation. It is now widely used in both industry and academic research: the R package is downloaded from CRAN more than 7,000 times per month on average, and that number is growing rapidly.
The R package won the 2016 John M. Chambers Statistical Software Award. From the very beginning of this work, our goal has been to make a package that brings convenience and joy to its users. In this talk, I will introduce the details of the model, as well as several highlights that we think users will love.
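
To give a concrete feel for the interface described above, here is a minimal sketch in R using the mushroom-classification data bundled with the package; the hyperparameter values are illustrative choices, not recommendations from the talk:

  library(xgboost)

  # Binary classification data shipped with the package
  data(agaricus.train, package = "xgboost")
  data(agaricus.test, package = "xgboost")

  # Train a small boosted-tree model; nthread enables the
  # multi-threading mentioned above, and lambda is the L2
  # regularization term on leaf weights (illustrative values)
  bst <- xgboost(data = agaricus.train$data,
                 label = agaricus.train$label,
                 nrounds = 10,
                 max_depth = 3,
                 eta = 0.3,
                 lambda = 1,
                 nthread = 2,
                 objective = "binary:logistic")

  # Predicted probabilities for the held-out data
  pred <- predict(bst, agaricus.test$data)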

About Tong He:

Tong He (http://www.sfu.ca/~hetongh/) is a data scientist and the maintainer of the XGBoost R package.

Schedule:

• 6:00 PM Doors open; feel free to mingle
• 6:30 PM Presentation starts
• ~7:45 PM Off to a nearby restaurant for food, drinks, and breakout discussions

Getting There:

By transit, there are a number of high-frequency buses that will get you there (check Google Maps or the TransLink site for your particular route). For drivers, there is a fair bit of street parking (free and paid) in the area, especially after 6 PM.