Speaker: Jeremy Watt
Title: Sparse and low-rank recovery problems in machine learning: an introduction to applications and models
Abstract: Due to their wide applicability, sparse and low-rank recovery models have quickly become some of the most important tools for today’s researchers in machine learning, statistics, optimization, and bioinformatics, as well as signal, image, and video processing. In this introductory talk I’ll focus on motivating sparse/low-rank models from the natural desire to tweak a) standard linear least squares regression and b) Principal Component Analysis (both of which I’ll briefly review first) in order to perform a variety of machine learning tasks on big data sets.
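To give a concrete taste of the "tweak" in a): adding an l1 penalty to the least squares objective (the lasso) drives most of the learned weights exactly to zero. The following is a minimal sketch of that effect, assuming NumPy and scikit-learn; it is an illustration of the idea, not the live code from the talk.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)

# Synthetic regression problem whose true weight vector is sparse:
# 100 samples, 50 features, only 5 nonzero weights.
n_samples, n_features, n_nonzero = 100, 50, 5
X = rng.standard_normal((n_samples, n_features))
w_true = np.zeros(n_features)
w_true[:n_nonzero] = 5 * rng.standard_normal(n_nonzero)
y = X @ w_true + 0.1 * rng.standard_normal(n_samples)

# Plain least squares: minimizes ||y - Xw||^2; in general every
# entry of the learned weight vector is nonzero.
ols = LinearRegression().fit(X, y)

# The lasso "tweak": least squares loss plus alpha * ||w||_1
# (scikit-learn scales the loss by 1/(2n)), which zeros out most weights.
lasso = Lasso(alpha=0.1).fit(X, y)

print("nonzero weights, least squares:", np.sum(np.abs(ols.coef_) > 1e-8))
print("nonzero weights, lasso:        ", np.sum(np.abs(lasso.coef_) > 1e-8))
```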
I’ll also discuss (with pictures and live code!) an array of amazing applications of these models, including: lasso problems in biostatistics, face recognition, digital compression, Robust Principal Component Analysis, Collaborative Filtering (e.g. the Netflix Prize), compressive sensing, as well as a number of fascinating recovery problems in image processing.
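On the low-rank side, collaborative filtering can be cast as completing a partially observed ratings matrix under a low-rank assumption. Below is a minimal SoftImpute-style sketch of this idea (alternating between fixing the observed entries and soft-thresholding singular values); the setup and the threshold value are hypothetical choices for illustration, not material from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random rank-2 "ratings" matrix and hide most of its entries.
m, n, r = 50, 40, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.4           # observe roughly 40% of entries

# Alternate between agreeing with the observed entries and shrinking
# singular values, which pulls the estimate toward a low-rank matrix.
tau = 5.0                                 # shrinkage threshold (hand-tuned here)
X = np.zeros((m, n))
for _ in range(200):
    X[mask] = M[mask]                     # enforce the observed entries
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt   # soft-threshold singular values

# Measure recovery on the entries we never observed.
err = np.linalg.norm((X - M)[~mask]) / np.linalg.norm(M[~mask])
print(f"relative error on hidden entries: {err:.3f}")
```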
Bio: I’m a PhD student at Northwestern working in the rapidly growing field of sparse and low-rank models and algorithms, with applications to machine learning and signal processing. You can find out more about this exciting field of research, including explanatory notes and presentations on machine learning topics, on my blog here. I’ll also be co-teaching a course on the subject of this talk during the upcoming winter quarter; all are welcome, and a tentative syllabus may be found here.