Ensembling, Ensemble Learning & Committee Machines

Hosted By
Solomon K. and Nathaniel S.

Details

Intro:
The session is partially based on the chapter entitled "Ensembling, Ensemble Learning & Committee Machines" from the upcoming book "Deep Learning Interviews". In addition, practical aspects are discussed in the context of one or more recent Kaggle competitions.

Syllabus / Agenda (The sessions are in English):
The main goal of any Ensembling method is to enhance the predictive power of a single, comparatively limited model. As such, Ensembling has a significant impact on the performance of AI systems in general, and its effect has proven particularly dramatic in the field of neural networks.

Shlomo Kashani + Nathaniel Shimoni: A high-level overview of Ensembling.

  1. Why do we love ensembling so much, especially on Kaggle?
  2. A basic review of the most commonly used Ensembling approaches, including Monolithic and Heterogeneous Ensembling, Snapshot Ensembles, and their combinations.
  3. The simple MVC
  4. Ensembling several CNNs inside another CNN.
  5. Examples in Python / PyTorch (a minimal sketch follows this list).
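
As a taste of what the Python / PyTorch examples might look like, here is a minimal sketch (not the session's actual code) of soft-voting: averaging the softmax outputs of several small CNNs. The SmallCNN architecture, the number of members, and the input size are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """A deliberately small CNN; the architecture is an illustrative assumption."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

class AveragingEnsemble(nn.Module):
    """Averages the softmax outputs of independently trained member models."""
    def __init__(self, members):
        super().__init__()
        self.members = nn.ModuleList(members)

    def forward(self, x):
        probs = [m(x).softmax(dim=1) for m in self.members]
        return torch.stack(probs).mean(dim=0)

# Usage: three members initialised differently (in practice, trained separately).
ensemble = AveragingEnsemble([SmallCNN() for _ in range(3)])
batch = torch.randn(4, 3, 32, 32)            # dummy batch of 32x32 RGB images
predictions = ensemble(batch).argmax(dim=1)  # ensemble class predictions
```

In practice each member would be trained separately (e.g. with different seeds, folds, or architectures) before being combined.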

Or Katz: SOTA approaches for ensembling bounding boxes in object detection.
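
The SOTA methods (e.g. Weighted Boxes Fusion) are the speaker's material; as a minimal illustration of the underlying idea only, the sketch below pools the boxes from several detectors and applies non-maximum suppression across the union. All box coordinates and scores are made-up assumptions.

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes; boxes are [x1, y1, x2, y2]."""
    x1 = np.maximum(box[0], boxes[:, 0]); y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2]); y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = lambda b: (b[..., 2] - b[..., 0]) * (b[..., 3] - b[..., 1])
    return inter / (area(box) + area(boxes) - inter)

def ensemble_boxes_nms(boxes_per_model, scores_per_model, iou_thr=0.5):
    """Pool detections from all models, then keep the highest-scoring
    non-overlapping boxes (cross-model NMS)."""
    boxes = np.concatenate(boxes_per_model)
    scores = np.concatenate(scores_per_model)
    order = scores.argsort()[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        order = rest[iou(boxes[i], boxes[rest]) < iou_thr]
    return boxes[keep], scores[keep]

# Usage with two hypothetical detectors' outputs on one image:
b1 = np.array([[10, 10, 50, 50], [12, 11, 52, 49]], dtype=float)
b2 = np.array([[11, 9, 51, 51], [200, 200, 240, 260]], dtype=float)
s1, s2 = np.array([0.9, 0.6]), np.array([0.8, 0.7])
fused_boxes, fused_scores = ensemble_boxes_nms([b1, b2], [s1, s2])
```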

Yam Peleg: Stacking + Live coding.

Because it is impossible to know in advance which statistical learning algorithm will perform best on a prediction task, it is common to use stacking to combine individual learners into a more powerful single learner. Stacking is an ensemble algorithm that uses a meta-learning algorithm to learn how best to combine the predictions of two or more base machine learning algorithms.

The benefit of stacking is that it can harness the capabilities of a range of well-performing models on a classification or regression task and make predictions with better performance than any single model in the ensemble. In this live-coding tutorial, you will discover the stacked generalization ensemble (stacking) in Python.
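
As a minimal sketch of stacked generalization (assuming scikit-learn; the live-coding session's code may differ), the example below stacks a random forest and an SVM under a logistic-regression meta-learner on a synthetic dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic data purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Base learners whose out-of-fold predictions feed the meta-learner.
base_learners = [
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("svc", SVC(probability=True, random_state=0)),
]

# StackingClassifier trains the final_estimator on cross-validated
# predictions of the base learners (stacked generalization).
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_train, y_train)
print("Stacked ensemble accuracy:", stack.score(X_test, y_test))
```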
Bonus: "Extreme ensembling: Training two million models on a super-computer (for fun!)" - a glimpse at the hardware behind heavy brute-force search.

Tel Aviv Deep Learning Bootcamp