
Hilbert Space Kernel Methods for Machine Learning: Background and Foundations


Details

QU Winter School Speaker Series

Join Daniel Duffy and Jean-Marc Mercier in a discussion on Hilbert Space Kernel Methods for Machine Learning: Background and Foundations!

Register here (please register with this link for Zoom information):
https://quspeakerseries24.splashthat.com/

A conversation with quants, thinkers, and innovators, all challenged to innovate in turbulent times!

Join QuantUniversity for a complimentary Winter speaker series where you will hear from quants, innovators, startups, and fintech experts on topics in quant investing, machine learning, optimization, fintech, AI, and more.

In the first part of this talk, Daniel will give an overview of RKHS (Reproducing Kernel Hilbert Space) methods and some of their applications to statistics and machine learning. These methods have several attractive properties, such as solid mathematical foundations, computational efficiency, and versatility, when compared to earlier machine learning methods (for example, artificial neural networks (ANNs)). We can draw on the full power of (applied) functional analysis to give sharper a priori error estimates for classification and regression problems, and we gain access to approaches driven by partial differential equations. We discuss how RKHS methods subsume and improve on traditional machine learning methods, and we discuss their advantages for two-sample problems for distributions, support vector estimation, and regression estimation.
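To make the RKHS idea concrete, here is a minimal sketch of kernel ridge regression with a Gaussian kernel, one of the standard RKHS methods the talk covers. By the representer theorem, the RKHS minimizer has the form f(x) = Σᵢ αᵢ k(xᵢ, x), where α solves a regularized linear system. This is a generic illustration (all names and parameters below are the sketch's own, not from the talk):

```python
import numpy as np

def gaussian_kernel(X, Y, scale=1.0):
    # Pairwise Gaussian kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 scale^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * scale**2))

def fit_kernel_ridge(X, y, reg=1e-8, scale=1.0):
    # Representer theorem: minimizer is f(x) = sum_i alpha_i k(x_i, x),
    # with alpha solving (K + reg * I) alpha = y.
    K = gaussian_kernel(X, X, scale)
    return np.linalg.solve(K + reg * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, scale=1.0):
    # Evaluate f at new points via the kernel expansion.
    return gaussian_kernel(X_new, X_train, scale) @ alpha

# Fit a smooth 1-D function and check the fit on the training points.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 1))
y = np.sin(3 * X[:, 0])
alpha = fit_kernel_ridge(X, y, reg=1e-8, scale=0.5)
y_hat = predict(X, alpha, X, scale=0.5)
print(float(np.max(np.abs(y_hat - y))))  # max training error; should be very small
```

Because the solution lives in a single linear system, the a priori error analysis mentioned above applies directly to the coefficient vector α.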

Jean-Marc will then present and discuss a Python library called codpy (curse of dimensionality for Python), an application-oriented library supporting Support Vector Machines (SVMs) and implementing RKHS methods, providing tools for machine learning, statistical learning, and numerical simulation. Over the last five years this library has served the internal algorithmic needs of his company, as the main tool and ingredient of proof-of-concept projects for institutional clients. He will also present a benchmark of this library against a more traditional neural network approach for two important, sometimes critical, classes of applications. The first is classification, illustrated with the benchmark MNIST pattern recognition problem; the second is statistical learning, for which he will compare both approaches on methods computing conditional expectations.
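The "computing conditional expectations" application can be sketched with a classical kernel estimator. The Nadaraya-Watson estimator approximates E[Y | X = x] as a kernel-weighted average of observed responses. This is a generic textbook illustration of the idea, not codpy's API:

```python
import numpy as np

def nadaraya_watson(X, y, x_query, scale=0.05):
    # Kernel-weighted average: E[Y | X = x] ~ sum_i w_i(x) y_i / sum_i w_i(x),
    # with Gaussian weights w_i(x) = exp(-(x - x_i)^2 / (2 scale^2)).
    w = np.exp(-((x_query[:, None] - X[None, :]) ** 2) / (2 * scale**2))
    return (w @ y) / w.sum(axis=1)

# Synthetic data where E[Y | X = x] = x^2 exactly.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 2000)
Y = X**2 + 0.05 * rng.normal(size=X.size)
est = nadaraya_watson(X, Y, np.array([0.5]))
print(float(est[0]))  # expect a value near 0.25
```

A neural network would instead learn the regression function by minimizing squared error; the kernel estimator needs no training loop, which is one axis of the benchmark comparison described above.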
