
Talk: Sparse and Low-Rank Recovery Problems in Machine Learning

Speaker: Jeremy Watt

Title: Sparse and low-rank recovery problems in machine learning: an introduction to applications and models

Abstract: Due to their wide applicability, sparse and low-rank recovery models have quickly become some of the most important tools for today’s researchers in machine learning, statistics, optimization, bioinformatics, as well as signal, image, and video processing. In this introductory talk I’ll focus on motivating sparse/low-rank models from the natural desire to tweak a) standard linear least squares regression and b) Principal Component Analysis (both of which I’ll review briefly first) in order to perform a variety of machine learning tasks on big data sets.
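To make the "tweak to least squares" concrete (this example is not from the talk itself): adding an l1 penalty to the least squares objective gives the lasso, which drives most coefficients exactly to zero. A minimal sketch, assuming scikit-learn and synthetic data:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d, k = 50, 100, 5  # fewer samples than features; only k features are active
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:k] = rng.standard_normal(k)
b = A @ x_true

# Plain least squares is underdetermined here (n < d).
# The l1 penalty in the lasso objective, ||Ax - b||^2 + alpha * ||x||_1,
# selects a sparse solution instead.
model = Lasso(alpha=0.01).fit(A, b)
n_nonzero = int(np.sum(np.abs(model.coef_) > 1e-6))
```

With noiseless data and a small penalty, `n_nonzero` comes out far below `d`, illustrating the sparsity the talk refers to.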

I’ll also discuss (with pictures and live code!) an array of amazing applications of these models, including: lasso problems in biostatistics, face recognition, digital compression, Robust Principal Component Analysis, collaborative filtering (e.g. the Netflix Prize), compressive sensing, as well as a number of fascinating recovery problems in image processing.

Bio: I’m a PhD student at Northwestern working in the rapidly growing field of sparse and low-rank models and algorithms, with applications to machine learning and signal processing. You can find out more about this exciting field of research - including explanatory notes and presentations on machine learning topics - on my blog here. I’ll also be co-teaching a course on the subject of this talk during the upcoming winter quarter here at NU Evanston – for those interested (all are welcome) a tentative syllabus may be found here.


  • Nermeen A.

    Hello everybody,
    can anybody help me? I have a question.

    I'm working on an image classification system using SIFT features and a support vector machine. I want to make it a web application using a Java applet. How can I use the SVM results to classify new images? Is there a formula for the SVM hyperplane that I can use, or anything else that can be used to classify? I calculated SIFT features for all the images, and I want a web application where a user can browse an image from the predefined images (whose SIFT features are already calculated) and classify it to see the results. PS: I did the training phase in MATLAB; I just need the final classification decision for new images.

    December 30, 2013

    • Reza B.

      In MATLAB you can use the built-in functions "svmtrain" and "svmclassify":
      S = svmtrain(training_data, training_labels);
      This is the training phase. Now for the testing phase, you can use the structure S to classify a new test image, say I, as follows:
      class = svmclassify(S, I);
      For more details, see: http://www.mathworks....

      January 5, 2014
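On the web-application part of the question: for a linear SVM the final decision is just the sign of the hyperplane formula w·x + b, so the weights w and bias b learned offline can be exported and evaluated in any backend. A minimal sketch in Python with scikit-learn (the random vectors below are hypothetical stand-ins for precomputed SIFT-based features, not real data):

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
# Hypothetical stand-ins for precomputed SIFT-based feature vectors,
# 20 examples per class, 10 dimensions each.
X = np.vstack([rng.normal(-2, 1, (20, 10)), rng.normal(2, 1, (20, 10))])
y = np.array([0] * 20 + [1] * 20)

clf = LinearSVC().fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]

# The hyperplane formula: a new image is classified by the sign of w.x + b.
x_new = rng.normal(2, 1, 10)
score = w @ x_new + b
label = int(score > 0)
```

The same two arrays (`w`, `b`) could be serialized and re-evaluated in Java with a single dot product, which avoids shipping MATLAB to the web application.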
