Capsule Networks – next generation learning architecture

Hosted By
Sondre P.

Details

To quote Geoffrey Hinton: "The pooling operation used in convolutional neural networks is a big mistake and the fact that it works so well is a disaster."

Pooling is a necessity in most large convolutional nets, but it explicitly breaks hierarchical relationships in the data, or at least makes learning the underlying representations very inefficient. At the same time, almost all interesting problems have clear hierarchies or geometric relationships. One approach to efficiently learning hierarchical correspondences is Capsule Networks, where neurons are extended to output a tensor instead of a scalar. This has several benefits, for example the ability of each (extended) neuron to predict geometric transforms. Theory, use cases, training and extensions will be discussed.
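To give a concrete flavour of the idea (this sketch is not from the talk itself): in a capsule layer each unit emits a vector whose orientation encodes pose and whose length encodes the probability that an entity is present. The "squash" nonlinearity and the PrimaryCapsules layer below follow the formulation in Sabour, Frosst & Hinton's 2017 paper "Dynamic Routing Between Capsules"; the hyperparameters and shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn


def squash(s, dim=-1, eps=1e-8):
    """Keep the direction of a capsule vector, map its norm into [0, 1)."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / torch.sqrt(sq_norm + eps)


class PrimaryCapsules(nn.Module):
    """Convolution whose output channels are grouped into 8-D capsule vectors."""

    def __init__(self, in_channels=256, capsule_dim=8, num_maps=32):
        super().__init__()
        self.capsule_dim = capsule_dim
        self.conv = nn.Conv2d(in_channels, num_maps * capsule_dim,
                              kernel_size=9, stride=2)

    def forward(self, x):
        u = self.conv(x)                             # (B, maps*dim, H, W)
        u = u.view(x.size(0), -1, self.capsule_dim)  # (B, num_capsules, dim)
        return squash(u)                             # vector-valued activations


if __name__ == "__main__":
    # Feature maps as they might come out of an initial conv layer (illustrative shapes).
    features = torch.randn(2, 256, 20, 20)
    caps = PrimaryCapsules()(features)
    print(caps.shape)               # (2, 1152, 8): 1152 capsules, each an 8-D pose vector
    print(caps.norm(dim=-1).max())  # lengths stay below 1 after squashing
```

In a full capsule network, a routing procedure (e.g. dynamic routing by agreement) then decides how strongly each lower-level capsule's prediction feeds each higher-level capsule, which is what replaces pooling.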

The lecture will be held by Dr. Asbjørn Berge, Sintef.

XAI: Explaining what goes on inside DNN/AI