We will discuss the exquisite introduction to information theory given by the book An Introduction to Information Theory: Symbols, Signals and Noise by John R. Pierce. The book features a detailed description of the nuts and bolts of the mathematics of information, along with a thorough discussion of entropy. It also has chapters discussing the role of the subject in physics, cybernetics, psychology, and even art and music.
A few of the chapters are a bit challenging, but overall the book is an easy read. Excluding the index, the book is just under 300 pages, so it will take some time to read in its entirety. Our discussion will focus on the first nine chapters.
The first edition (1961) of "An Introduction to Information Theory" is in the public domain and can be read for free in many formats. I read Dover's second edition (1980) of Pierce's book.
Possible Topics for Discussion:
• The nature of scientific theories
• The nature of mathematical models
• The importance of proof in mathematics
• The mathematical model of communication theory
• The nature of encoding information
• Entropy: a measure of the information conveyed from a source to a recipient
• Shannon's fundamental theorem for the noiseless channel (channel capacity)
• Language, meaning, understanding and information theory
• The noisy channel: error detection and correction
• The use of multidimensional space to characterize channel capacity
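To make the entropy topic above a little more concrete: here is a minimal Python sketch (my own illustration, not taken from Pierce's book) of Shannon's entropy formula, H = -Σ pᵢ log₂ pᵢ, which measures the average information, in bits, conveyed per symbol by a source with the given symbol probabilities:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin conveys 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss conveys less information.
print(entropy([0.9, 0.1]))   # ~0.469 bits
```

The biased-coin case hints at why compression works: the less surprising a source's output, the fewer bits per symbol are needed to encode it, a theme Pierce develops in the chapters on encoding and channel capacity.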
If there is some aspect of Pierce's book or information theory in general which you would like us to discuss, please post a comment about it below.