2025-10: Tabular Attention Models
Details
Your next tabular ML breakthrough might not come from better feature engineering, but from borrowing techniques from LLMs. Explore the practical reality of applying attention mechanisms to structured data beyond the research papers.
Where: Platform Calgary, East Annex
When: Wednesday, October 29, at 5:30pm
Supervised machine learning on tabular data has long relied on linear models and gradient-boosted trees. Recently, attention mechanisms and transformers have opened new possibilities for handling both time series and independent and identically distributed (IID) datasets. This talk explores how these techniques compare to established methods, highlighting their strengths, limitations, and practical use cases. Attendees will leave with a clearer understanding of when attention-based models can add value to tabular data problems and how to approach them in practice.
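For readers unfamiliar with how attention applies to a table row, here is a minimal sketch (not the speaker's code; all dimensions and weights are illustrative): each column's value is embedded as a vector, and scaled dot-product self-attention lets every feature reweight itself against the others, which is the core idea transformers bring to tabular data.

```python
import numpy as np

# Minimal sketch of self-attention over one tabular row.
# Assumptions (illustrative only): 4 features, each already
# embedded into an 8-dim vector; projections are random stand-ins
# for learned weights.
rng = np.random.default_rng(0)

n_features, d = 4, 8
X = rng.normal(size=(n_features, d))        # one row: 4 feature embeddings

# Learned query/key/value projections (random here for illustration)
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Feature-to-feature affinities, softmax-normalized per feature
scores = Q @ K.T / np.sqrt(d)               # shape (4, 4)
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)

out = weights @ V                           # each feature attends to the others
print(out.shape)                            # (4, 8)
```

In contrast to a gradient-boosted tree, which learns fixed splits over raw columns, the attention weights here are computed per row, so feature interactions can vary from one example to the next.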
Schedule:
5:30 - Food and Networking
6:00 - Presentation and Discussion
7:30 - Wrap up
Speaker Bio:
Kai Lukowiak has worked in the energy and technology sectors for nearly a decade, using Python to analyse data. He recently founded Industry Transformer, a company focused on applying deep learning to tabular data.


