Prof. Michael Small presents The Shape of Learning
Details
With increasingly complex and massive data sets, and increasingly powerful and intricate machine learning schemes, it is imperative to seek understanding of what these algorithms are actually doing. While the remit of “explainable AI” is to justify the output of a model, a separate and more challenging problem is to better understand the reasoning of the algorithms themselves.
Professor Michael Small's group at UWA has been using tools from the mathematical field of homology and algorithms from nonlinear physics to provide powerful new insights into machine learning paradigms. In this talk we'll explore topological data analysis, which seeks to quantify the shape of data, and reservoir computing, which harnesses the power of statistical physics to avoid the computational burden associated with recurrent neural networks. We will demystify how these techniques help map the intricate landscapes of machine learning algorithms and their nuanced responses to data.
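To give a flavour of the second idea, the sketch below is a minimal, generic echo state network (a common form of reservoir computing), not code from the speaker's group. The task, network size, and all parameters are illustrative assumptions; the point is only that the recurrent weights stay fixed and random, and just a linear readout is trained.

```python
# Minimal echo state network sketch (illustrative only; not the speaker's code).
# The reservoir weights are fixed and random; only a linear readout is trained,
# which is how reservoir computing sidesteps the cost of training recurrent
# weights by backpropagation through time.
import numpy as np

rng = np.random.default_rng(0)

# Toy one-step-ahead prediction task on a sine wave (hypothetical example data).
t = np.linspace(0, 60, 3000)
u = np.sin(t)                      # input sequence
y = np.roll(u, -1)                 # target: next value of the input

n_reservoir, leak, spectral_radius = 300, 0.3, 0.9   # assumed hyperparameters
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, 1))
W = rng.normal(size=(n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # scale recurrent weights

# Drive the fixed reservoir with the input and record its states.
states = np.zeros((len(u), n_reservoir))
x = np.zeros(n_reservoir)
for k, u_k in enumerate(u):
    x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * u_k + W @ x)
    states[k] = x

# Train only the linear readout with ridge regression, after a warm-up period.
warmup, ridge = 100, 1e-6
X, Y = states[warmup:-1], y[warmup:-1]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y)

pred = X @ W_out
print("one-step prediction RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```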
Biography: Michael is the CSIRO-UWA Chair of Complex Systems, Interim Director of the UWA Data Institute, and a Professor of Applied Mathematics at UWA. His research group applies the mathematical theories of chaos and complexity to data-driven problems. His group's current applications include disease modelling, predicting propensity for self-harm and suicidal ideation, traffic modelling, geological discovery under cover, and predictive maintenance. Prof. Small is Chief Investigator of the ARC Research Hub for Transforming Energy Infrastructure Through Digital Engineering and of the ARC Training Centre for Transforming Maintenance through Data Science. He is one of the most highly cited mathematicians in Australia and is recognised internationally as Deputy Editor-in-Chief of the journal Chaos. In 2022 he was awarded the V. Afraimovich Award of the Nonlinear Science Society for contributions to complex systems and nonlinear dynamics.