
When Claude Shannon derived his equation for the average efficient code length of a probability distribution, he was reportedly advised by fellow scientist and mathematician John von Neumann to "just call it entropy, since no one knows what that is anyway."

Jesting aside, the advice fit because Shannon's equation for "information entropy" bears an astonishing resemblance to Boltzmann's equation for physical entropy.
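As a quick illustrative sketch (not taken from the talk itself), Shannon's quantity for a distribution p is H(p) = -Σ pᵢ log₂ pᵢ, the lower bound in bits per symbol for any lossless code of that source:

```python
import math

def shannon_entropy(probs):
    """Average code length lower bound: H(p) = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain (1 bit per flip);
# a biased coin is more predictable, so it compresses better.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

The function name here is just for illustration; the formula itself is Shannon's standard definition.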

In this presentation, I'll explore not only these connections but a whole host of problems, ideas, and software related to this mysterious quantity, entropy, which I've spent decades attempting to understand. I'll also introduce and discuss Kolmogorov complexity, another very important measure of information and uncertainty.

The slides for the presentation are here!

This will again be hosted at Alphabet City Beer Co.:

https://www.abcbeer.co/

96 Ave C (between 6th and 7th Streets), New York, NY 10009

Don't be shy, bring your friends!

Best,

Charles
