Modern Applied Machine Learning for NLP: Large Scale Techniques for Shallow and Deep Learning (Part II)
In this two-part talk, Dan Pressel from Digital Roots will discuss modern Machine Learning techniques and methods that have become popular for handling large-scale machine learning problems. Topics include online learning, overlapped processing, feature hashing, and optimization with Stochastic Gradient Descent variants, along with software implementation details and patterns for Linear Classification and Deep Neural Networks (DNNs) for NLP on a variety of platforms.
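To give a flavor of one technique from the list, here is a minimal sketch of the feature-hashing ("hashing trick") idea: each token is mapped to a bucket in a fixed-size vector by a hash function, so no vocabulary dictionary is needed. The function name and dimension below are illustrative, not from the talk.

```python
import hashlib

def hashed_features(tokens, dim=16):
    """Map a list of tokens to a fixed-length count vector by hashing."""
    vec = [0.0] * dim
    for tok in tokens:
        # Use a stable hash so results are reproducible across runs.
        h = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16)
        vec[h % dim] += 1.0  # bump the bucket this token hashes into
    return vec

vec = hashed_features("the cat sat on the mat".split())
```

Collisions (two tokens sharing a bucket) are the price paid for the fixed memory footprint; in practice the dimension is made large enough that they rarely hurt accuracy.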
Part II of this talk covers:
• Refitting our ML understanding for deep learning
• More on Continuous/Dense Representations
• Optimization for DNNs
• Neural Bag of Words models
• Sentence modeling with CNNs for classification
• Language modeling with RNNs
• Tagging tasks with CNNs and RNNs
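As a taste of the Neural Bag of Words topic above, this sketch averages a sentence's word embeddings and feeds the result through a linear layer plus softmax for classification. The vocabulary, dimensions, and random weights are made-up placeholders, not the talk's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = {"good": 0, "bad": 1, "movie": 2, "plot": 3}
embed_dim, n_classes = 8, 2

E = rng.normal(size=(len(vocab), embed_dim))   # embedding table (normally learned)
W = rng.normal(size=(embed_dim, n_classes))    # classifier weights
b = np.zeros(n_classes)                        # classifier bias

def nbow_predict(tokens):
    """Average embeddings of known tokens, then apply linear layer + softmax."""
    ids = [vocab[t] for t in tokens if t in vocab]
    avg = E[ids].mean(axis=0)          # bag of words: word order is ignored
    logits = avg @ W + b
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()               # class probabilities

probs = nbow_predict("good movie plot".split())
```

Because the averaging step discards word order, NBoW is a fast baseline; the CNN and RNN models covered next recover order and local structure.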
As in the first talk, we will get into implementation details along the way, so some experience with machine learning and programming is advised.
Spark will provide food and drink!