From SHAP to EBM: Explain your Gradient Boosting Models in Python
Details
The event will be held in hybrid mode: in person and online.
>>> IN PERSON
Venue: Edreams Odigeo Tech Hub, Via Gustavo Fara 26, 20124 Milano MI
FIRST FLOOR
ATTENTION! Entrance is at the back of the building; see the attached photo!
Date: Thursday 28 November 2024, 18:45
Doors open at 18:15.
Free event with limited capacity; registration is required.
>>> IMPORTANT! REGISTER FOR THIS MEETUP EVENT ONLY IF YOU PLAN TO ATTEND IN PERSON. If you have registered and for any reason are unable to attend, CANCEL your registration.
If you want to attend the event ONLINE, follow the instructions below.
>>> ONLINE
Subscribe to our YouTube channel and follow the live stream of the event at this address:
https://www.youtube.com/@kaggledaysmeetupmilano3935/streams
PROGRAM
18:15-18:45 Attendee registration
18:45-18:50 Introduction to our community
18:50-19:35 Talk
19:35-20:30 Q&A e Networking
SPEAKER
Emanuele Fabbiani
Engineer, researcher, and entrepreneur, Emanuele earned his PhD in AI by researching time series forecasting in the energy field. He was a guest researcher at EPFL Lausanne and is now the Head of AI at xtream, where he solves business problems with AI. He has published 8 papers in international journals, has presented at and organized tracks and workshops at 20+ international conferences, including AMLD Lausanne, ODSC London, WeAreDevelopers Berlin, PyData Berlin, PyData Paris, and PyCon Florence, and has lectured in Italy, Switzerland, and Poland.
ABSTRACT
XGBoost is considered a state-of-the-art model for regression, classification, and learning-to-rank problems on tabular data. Unfortunately, tree-based ensemble models are notoriously difficult to explain, limiting their application in critical fields. Techniques like SHapley Additive exPlanations (SHAP) and Explainable Boosting Machine (EBM) have become common methods for assessing how much each feature contributes to the model prediction.
This talk will introduce SHAP and EBM, explaining the theory behind their mechanisms in an accessible way and discussing the pros and cons of both techniques. We will also comment on Python snippets where SHAP and EBM are used to explain a gradient-boosting model.
Attendees will walk away with an understanding of how SHAP and EBM work, the limitations and merits of both techniques, and a tutorial on how to use these methods in Python, courtesy of the shap and interpret packages.
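For a flavor of what the live coding will cover, here is a minimal sketch (not taken from the talk) of explaining an XGBoost model with the shap package; the synthetic dataset and hyperparameters are illustrative placeholders. A companion EBM sketch follows the talk outline below.

# A minimal, illustrative sketch (not from the talk): explaining an XGBoost
# regressor with SHAP. The synthetic dataset and hyperparameters are
# placeholders, not material shown by the speaker.
import shap
import xgboost
from sklearn.datasets import make_regression

# Train a plain gradient-boosting model on synthetic data.
X, y = make_regression(n_samples=500, n_features=8, random_state=0)
model = xgboost.XGBRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes Shapley values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global view: how strongly each feature drives the predictions.
shap.summary_plot(shap_values, X)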
Talk outline:
- A brief reminder about gradient boosting and XGBoost (5 mins)
- The challenge of explainability (5 mins)
- EBM: theory and applications (10 mins)
- SHAP: theory and applications (10 mins)
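And here is the companion EBM sketch (again illustrative, not taken from the talk), using ExplainableBoostingClassifier from the interpret package on placeholder synthetic data:

# A minimal, illustrative sketch (not from the talk): fitting an Explainable
# Boosting Machine with the interpret package.
from interpret import show
from interpret.glassbox import ExplainableBoostingClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=500, n_features=8, random_state=0)

# Unlike SHAP, EBM is a glass-box model: the fitted per-feature shape
# functions are themselves the explanation.
ebm = ExplainableBoostingClassifier(random_state=0)
ebm.fit(X, y)

# Interactive dashboard with the global, per-feature explanations.
show(ebm.explain_global())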
--> To stay up to date on this event and future ones, follow our LinkedIn page (link)
