
Gradient Boosted Regression Trees using Python's scikit-learn

This talk describes Gradient Boosted Regression Trees (GBRT), a powerful statistical learning technique with applications in a variety of areas, ranging from web page ranking to environmental niche modeling. GBRT is a key ingredient of many winning solutions in data-mining competitions such as the Netflix Prize, the GE Flight Quest, and the Heritage Health Prize.

Peter will give a brief introduction to the GBRT model and regression trees -- focusing on intuition rather than mathematical formulas. The majority of the talk will be dedicated to an in-depth discussion of how to apply GBRT in practice using scikit-learn. He will cover important topics such as regularization, model tuning, and model interpretation that should significantly improve your score on Kaggle.
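
For reference, here is a minimal illustrative sketch in Python of the kind of workflow the talk covers. This code is not taken from Peter's slides; the synthetic dataset and parameter values are placeholders chosen only to show the main regularization knobs.

    # Minimal GBRT sketch with scikit-learn (illustrative only, not from the talk).
    from sklearn.datasets import make_regression
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    # Synthetic data stands in for a real (e.g. Kaggle) dataset.
    X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Regularization knobs: a small learning_rate (shrinkage) combined with many
    # trees, shallow trees (max_depth), and subsample < 1.0 for stochastic boosting.
    est = GradientBoostingRegressor(
        n_estimators=1000,
        learning_rate=0.01,
        max_depth=3,
        subsample=0.8,
        random_state=0,
    )
    est.fit(X_train, y_train)

    print("Test MSE:", mean_squared_error(y_test, est.predict(X_test)))

    # Model interpretation: relative feature importances.
    print("Feature importances:", est.feature_importances_)

A typical next step when tuning is to pick n_estimators on a held-out set, for example by scoring the per-stage predictions from est.staged_predict.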

About the speaker:

Peter Prettenhofer is a data scientist / software engineer at DataRobot. He studied computer science at Graz University of Technology, Austria, and Bauhaus University Weimar, Germany, focusing on machine learning and natural language processing. He is a contributor to scikit-learn, where he co-authored a number of modules such as Gradient Boosted Regression Trees, Stochastic Gradient Descent, and Decision Trees.

Agenda:

6:00 - 6:30 Pizza and Beer
6:30 - 7:30 Talk
7:30 - 8:30 Beer and Networking

Pizza and Beer will be provided by DataRobot.


  • Jason

    I am sorry to hear that a few of you cannot make it. I will try to video the talk. If you click the Gradient Boosted Regression Trees link in the talk summary you will find the slides.

    1 · April 1, 2014

    • Sunanda Koduvayur P

      Thanks Jason! Really looking forward to the video.

      April 4, 2014

    • Ali U.

      Wasn't able to make it -- I was really excited about this talk, too. Were you able to record it?

      April 4, 2014

  • Rani N.

    Very cool. Thanks!

    April 3, 2014

  • Marcus

    Great, thanks!

    April 3, 2014

  • Matt B.

    Excellent presentation! Impressive work.

    April 3, 2014

  • Matt E

Great to know the trick of using a small learning rate. Thanks for sharing!

    April 2, 2014

  • Ngiap K.

    Thank you for the great presentation

    1 · April 2, 2014

  • Kent J.

    Excellent presentation with good depth.

    1 · April 2, 2014

  • Siddharth

GBMs have been on my bucket list for so long, and I was desperately waiting for this session. I hate to say this, but due to unforeseen circumstances I will be unable to attend this meetup. I would really appreciate it if the video and other content could be posted.

    April 1, 2014

  • Gary

Unfortunately, I am unable to attend a meeting that looks insightful on a very timely topic. If code or slides are available, I would be greatly interested in them. Thanks.

    April 1, 2014

  • Sunanda Koduvayur P

Really bummed about missing this! I would greatly appreciate it if a video of the talk could be posted. Thanks!

    1 · March 31, 2014
