Ross Gayler is a data scientist who has been around long enough to have been through multiple hype cycles for neural nets and AI. He has a long-running side project in AI/cognitive science, looking into neurally implementable computational approaches that could be used as a basis for intelligence.
This talk is about computing with discrete, compositional data structures (like graphs) in analog computers (like neural networks). I describe Vector Symbolic Architectures, a family of mathematical techniques for analog computation in hyperdimensional vector spaces that map naturally onto neural network implementations. VSAs support computation on discrete, compositional data structures and provide a form of virtualisation that breaks the nexus between the items to be represented and the hardware that supports the representation. This means that computations on evolving data structures do not require physical rewiring of the implementing hardware.
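To make the idea concrete, here is a minimal sketch of one common VSA flavour (a Multiply-Add-Permute style scheme with bipolar vectors), not the specific formulation from the talk. All names are illustrative; it assumes only numpy. Binding (elementwise multiplication) composes role-filler pairs, bundling (addition) superposes them into one fixed-width vector, and unbinding recovers a noisy copy of a filler that can be cleaned up by nearest-neighbour comparison. No rewiring is needed to represent a new structure, only new vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hyperdimensional: random vectors of this size are quasi-orthogonal

def rand_vec():
    # A random bipolar (+1/-1) vector serves as an atomic symbol.
    return rng.choice([-1.0, 1.0], size=D)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Atomic symbols: roles and fillers (illustrative names)
colour, shape = rand_vec(), rand_vec()
red, circle = rand_vec(), rand_vec()

# Binding = elementwise multiplication; bundling = elementwise addition.
# A "red circle" record is a superposition of role-filler bindings,
# still a single fixed-width vector regardless of structure.
record = colour * red + shape * circle

# Unbinding: bipolar vectors are self-inverse under multiplication,
# so multiplying by a role recovers a noisy copy of its filler.
noisy = record * colour

# The noisy result is much closer to `red` than to any other symbol.
assert cosine(noisy, red) > 0.5
assert abs(cosine(noisy, circle)) < 0.1
```

The same binding and bundling operations can encode nested structures (edges of a graph, key-value records, sequences via permutation), which is the sense in which VSAs support discrete, compositional data in an analog substrate.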