Word Embeddings for Natural Language Processing with Python


Details
Word embeddings are a family of Natural Language Processing (NLP) techniques in which words are mapped to vectors in a low-dimensional space.
Interest in word embeddings has been on the rise in the past few years, because these techniques have driven important improvements in many NLP applications such as text classification, sentiment analysis, and machine translation.
We're lucky to have Marco Bonzanini (author and organiser of the PyData meetup) describe the intuitions behind this family of algorithms, with particular detail on word2vec and doc2vec.
We'll also explore some of the Python tools that allow us to implement modern NLP applications and we'll conclude with some practical considerations.
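To make the core intuition concrete, here is a minimal sketch of what "words as vectors" means in practice: words that appear in similar contexts end up with nearby vectors, so semantic relatedness can be measured with cosine similarity. The tiny 3-dimensional vectors below are hand-picked for illustration only; real embeddings are learned from a corpus by an algorithm such as word2vec (e.g. via a library like gensim) and typically have 100+ dimensions.

```python
import math

# Toy 3-dimensional embeddings, hand-picked for illustration.
# Real word2vec vectors are learned from corpus co-occurrence statistics.
EMBEDDINGS = {
    "king":   [0.90, 0.80, 0.10],
    "queen":  [0.85, 0.82, 0.15],
    "apple":  [0.10, 0.20, 0.90],
    "orange": [0.12, 0.25, 0.85],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def most_similar(word):
    """Return the vocabulary word whose vector is closest to `word`'s."""
    return max(
        (w for w in EMBEDDINGS if w != word),
        key=lambda w: cosine(EMBEDDINGS[word], EMBEDDINGS[w]),
    )

print(most_similar("king"))   # prints "queen"
```

The same nearest-neighbour query is what powers features like "find related words" in real embedding models; doc2vec extends the idea from words to whole documents.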
Please remember to also sign up with Skills Matter:
https://skillsmatter.com/meetups/10114-word-embeddings-for-natural-language-processing-with-python
