
The Power of Neural Tensor Networks

Hosted By
Josh T.

Details

Abstract: Applications of Tensor Network Theory to Deep Learning

How can we apply deep learning to natural language processing?

Tensor network theory has seen a wide variety of applications in deep learning. First implemented in 2012 by Richard Socher at Stanford, neural tensor networks have shown strong performance on natural language tasks that hinge on hierarchical structure.

In this lecture, we’ll explore the basis of neural tensor networks and their recent surge in popularity for natural language processing. I’ll give an introduction to tensor network theory, discuss its effectiveness in machine learning, and cover what is arguably one of the most widely used applications of tensor network theory: recursive neural tensor networks for natural language processing.

Ultimately, this lecture provides a foundation for approaching and implementing neural tensor networks in your own projects.
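To give a flavor of what such an implementation involves, here is a minimal NumPy sketch of the bilinear neural tensor layer that this family of networks is built around (two input vectors combined through a slice-wise tensor product plus a standard linear term). The function name, shapes, and random weights are illustrative assumptions, not material from the talk.

```python
import numpy as np

def neural_tensor_layer(e1, e2, W, V, b):
    """One neural tensor layer in the style of Socher et al.

    e1, e2 : (d,)      input vectors (e.g. word/phrase embeddings)
    W      : (k, d, d) slices of the bilinear tensor
    V      : (k, 2d)   weights for the ordinary linear term
    b      : (k,)      bias
    returns: (k,)      combined representation
    """
    # e1^T W[slice] e2 for each of the k tensor slices
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)
    # standard feed-forward term on the concatenated inputs
    linear = V @ np.concatenate([e1, e2]) + b
    return np.tanh(bilinear + linear)

# Toy usage with random weights (illustration only)
d, k = 4, 3
rng = np.random.default_rng(0)
e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
W = rng.standard_normal((k, d, d))
V = rng.standard_normal((k, 2 * d))
b = rng.standard_normal(k)
print(neural_tensor_layer(e1, e2, W, V, b))
```

In a recursive neural tensor network, a layer like this is applied repeatedly up a parse tree, combining child representations into a parent representation at each node.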

Bio:

Patrick D. Smith is a data scientist specializing in deep learning who currently leads a boutique data consulting practice. Previously, Patrick worked as a data scientist for Booz Allen Hamilton, a data science instructor for General Assembly, and a quantitative analyst in financial services. His interests in deep learning include tensor network theory and network optimization. He holds a degree in Economics from The George Washington University and is currently pursuing graduate studies in Artificial Intelligence at Stanford University.

Data Education DC
GWU, Funger Hall, Room 103
2201 G St. NW · Washington, DC