
Details

Link to article: https://arxiv.org/pdf/2601.03220
Title: From Entropy to Epiplexity: Rethinking Information for Computationally Bounded Intelligence
Content: This paper identifies three paradoxes in classical information theory: information cannot increase through deterministic transformations, information is order-independent, and likelihood modeling amounts to mere distribution matching. The authors argue that these paradoxes arise because Shannon information and Kolmogorov complexity assume unbounded computational capacity. To resolve them, they introduce "epiplexity," a measure of what computationally bounded observers can actually learn from data. Under this measure, computation can create information and the ordering of data matters, which provides a theoretical foundation for data selection that tracks downstream performance and generalization.
Slack: ml-ka.slack.com, channel: #pdg. Please join us; if you cannot join, please message us here or at mlpaperdiscussiongroupka@gmail.com.

In the Paper Discussion Group (PDG) we discuss recent and fundamental papers in the area of machine learning on a weekly basis. If you are interested, please read the paper beforehand and join us for the discussion. If you have not fully understood the paper, you can still participate – everyone is welcome! You can join the discussion or simply listen in. The discussion is in German or English depending on the participants.

Related Topics

Artificial Intelligence
Deep Learning
Machine Learning
Natural Language Processing
Neural Networks
