This week Dale will lead a discussion on dimensionality reduction with PCA and related techniques. We will review the geometric and data-compression motivations for PCA and explore methods for computing these latent components. There will be an introduction to the mathematical principles underlying different decomposition techniques, and some examples will be shown during the meetup.
Useful links for this topic:
Videos from Andrew Ng's Coursera class. Look for the videos under Section XIV Dimensionality Reduction (week 8): https://class.coursera.org/ml-003/lecture/index
Of course, Wikipedia also has a very complete description of this at http://en.wikipedia.org/wiki/Principal_component_analysis
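As a small taste of the computation we'll cover, here is a minimal PCA sketch in NumPy. The data values are made up purely for illustration; the approach (center the data, take the SVD, project onto the top components) is the standard one discussed in the resources above.

```python
import numpy as np

# Toy data: 6 samples, 3 features (hypothetical values, for illustration only)
X = np.array([[2.5, 2.4, 0.5],
              [0.5, 0.7, 1.9],
              [2.2, 2.9, 0.4],
              [1.9, 2.2, 0.8],
              [3.1, 3.0, 0.2],
              [2.3, 2.7, 0.6]])

# Center each feature (PCA assumes zero-mean data)
Xc = X - X.mean(axis=0)

# SVD of the centered data: rows of Vt are the principal directions
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the first k principal components (data compression)
k = 2
Z = Xc @ Vt[:k].T  # reduced representation, shape (6, 2)

# Fraction of total variance captured by the kept components
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(Z.shape, explained)
```

The same projection can also be obtained from the eigenvectors of the covariance matrix, one of the alternative derivations we'll compare at the meetup.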
We also want to thank John for his overview on Recommendation Systems and Andrew for his overview on Collaborative Filtering during last week’s meetup.
For those new to this group, the purpose of this meetup is to learn the various aspects of machine learning from each other. Our focus is on the intuitions, theory, and implementations behind machine learning algorithms. The format is a highly interactive discussion session in which a pre-assigned member reviews and explains a chosen algorithm or topic, with other members expected to draw on their respective knowledge and experience to contribute to the explanation. We may also discuss specific machine learning projects after the main topic, should time permit.
The backgrounds of attendees vary from those who have had some exposure to machine learning, whether through Andrew Ng’s Coursera class or other learning channels, to those who use machine learning in their day-to-day work. However, even if you have no exposure to machine learning but are interested in it, you are welcome to attend the meetup. Of course, you might get more out of it if you work through the videos in Andrew Ng’s free Coursera class or an equivalent.