Capsule Networks – next generation learning architecture


Details
To quote Geoffrey Hinton: "The pooling operation used in convolutional neural networks is a big mistake and the fact that it works so well is a disaster."
Pooling is a necessity in most large convolutional nets, but it explicitly breaks hierarchical relationships in the data, or at least makes learning the underlying representations very inefficient. At the same time, almost all interesting problems have clear hierarchical or geometric relationships. One approach to learning hierarchical correspondences efficiently is Capsule Networks, where neurons are extended to output a tensor instead of a scalar. This has several benefits, for example the ability of each (extended) neuron to predict geometric transforms. Theory, use cases, training and extensions will be discussed.
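To make the idea concrete, below is a minimal NumPy sketch of the core capsule mechanism as described in Sabour et al. (2017): each capsule outputs a vector whose orientation encodes pose, and each lower-level capsule predicts a higher-level capsule's pose through a learned transform matrix. All names here (squash, the toy dimensions) are illustrative assumptions, not part of any particular library.

import numpy as np

def squash(v, axis=-1, eps=1e-8):
    """Capsule nonlinearity: shrinks vector length into [0, 1) while
    preserving orientation, so length can act as an existence probability."""
    norm_sq = np.sum(v ** 2, axis=axis, keepdims=True)
    norm = np.sqrt(norm_sq + eps)
    return (norm_sq / (1.0 + norm_sq)) * (v / norm)

# Toy sizes: 3 lower-level capsules of dimension 4 predicting
# 2 higher-level capsules of dimension 8.
rng = np.random.default_rng(0)
u = rng.normal(size=(3, 4))        # lower-level capsule output vectors
W = rng.normal(size=(3, 2, 8, 4))  # learned transform matrices (untrained here)

# Each lower capsule predicts each parent's pose via its own transform
# matrix -- this is where capsules model geometric relationships.
u_hat = np.einsum('ijkl,il->ijk', W, u)  # predictions, shape (3, 2, 8)

# A single naive aggregation step: average the predictions per parent and
# squash. Dynamic routing would instead iteratively reweight u_hat by how
# well each prediction agrees with the emerging parent output.
v = squash(u_hat.mean(axis=0))           # parent capsule outputs, shape (2, 8)
print(v.shape, np.linalg.norm(v, axis=-1))

The uniform averaging stands in for the routing-by-agreement loop of the original paper; the point of the sketch is only that the unit of computation is a vector with a learned transform per parent, rather than a scalar activation.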
The lecture will be given by Dr. Asbjørn Berge, Sintef.