[HN Gopher] A Mathematical Theory of Communication [pdf]
___________________________________________________________________
A Mathematical Theory of Communication [pdf]
Author : luu
Score : 146 points
Date : 2024-04-30 23:05 UTC (2 days ago)
(HTM) web link (people.math.harvard.edu)
(TXT) w3m dump (people.math.harvard.edu)
| whereismyacc wrote:
| my holy book
| ziofill wrote:
| I use this paper whenever I teach information theory. If you are
| mathematically inclined, I'd recommend reading the proofs of his
| two main theorems; they're illuminating.
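|
| (For reference, the two theorems in modern notation; this is a
| paraphrase, not Shannon's original statements. Writing the
| source entropy as H = -\sum_i p_i \log_2 p_i :
|
|     Source coding: the source can be losslessly encoded at any
|     rate above H bits per symbol, and at no rate below it.
|
|     Channel coding: with channel capacity C = \max_{p(x)} I(X;Y),
|     communication with arbitrarily small error probability is
|     possible at every rate R < C and impossible above it.)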
| the_panopticon wrote:
| Another great read from Shannon
| https://archive.org/details/bstj28-4-656
| mehulashah wrote:
| When you read this and think about the world he was in -- it's
| even more remarkable. How did he come up with it?
| 082349872349872 wrote:
| From playing 20 Questions and attempting to formalise it?
|
| EDIT: actually the cryptography connection is more likely:
| Leibniz was XVII; who was it that was already using binary
| alternatives for steganography a few centuries earlier?
|
| EDIT2: did entropy in p-chem come before or after Shannon?
|
| EDIT3: well before; S = k_B ln Ω was 1877.
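|
| (The formal link, for what it's worth: with \Omega equally
| likely microstates, Shannon's H = -\sum_i p_i \log p_i reduces
| to H = \log \Omega, i.e. Boltzmann's S up to the constant k_B.)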
| duped wrote:
| This may be apocryphal, but it probably had something to do with
| dropping shells on Nazis - he was developing fire control
| systems for the US Navy around the time he developed the theory,
| and only published several years after the war.
|
| Allegedly he also derived Mason's Gain Formula around the same
| time but that was classified until Mason published it.
| shalabhc wrote:
| While well known for this paper and "information theory",
| Shannon's master's thesis* is worth checking out as well. It
| demonstrated an equivalence between relay switching circuits and
| Boolean algebra, and was one of the key ideas that enabled
| digital computers.
|
| * https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_a...
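|
| A minimal sketch of that correspondence (hypothetical Python,
| not from the thesis): switches in series behave like AND,
| switches in parallel like OR.
|
|     from itertools import product
|
|     # Series connection: current flows only if both switches
|     # are closed (Boolean AND).
|     def series(a: bool, b: bool) -> bool:
|         return a and b
|
|     # Parallel connection: current flows if either switch is
|     # closed (Boolean OR).
|     def parallel(a: bool, b: bool) -> bool:
|         return a or b
|
|     # Example circuit: closed iff (a AND b) OR ((NOT a) AND c),
|     # i.e. a relay multiplexer that selects between b and c.
|     def mux(a: bool, b: bool, c: bool) -> bool:
|         return parallel(series(a, b), series(not a, c))
|
|     # Check the circuit against the intended Boolean function.
|     for a, b, c in product((False, True), repeat=3):
|         assert mux(a, b, c) == (b if a else c)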
| B1FF_PSUVM wrote:
| The funny thing is that, at the time, digital logic circuits
| were made with relays. For most of the XX century you could
| hear relays clacking away at street junctions, inside metal
| boxes controlling traffic lights.
|
| Then you got bipolar junction transistors (BJTs), and most
| digital logic, such as ECL and TTL, was based on a different
| paradigm for a few decades.
|
| Then came the MOS revolution, allowing for large scale
| integration. And it worked like relays used to, but Shannon's
| work was mostly forgotten by then.
| mturmon wrote:
| > Then you got bipolar junction transistors (BJTs), and most
| digital logic, such as ECL and TTL, was based on a different
| paradigm for a few decades.
|
| I think the emphasis is misplaced here. It is true that a
| single BJT, considered as a three-terminal device, does not
| operate in the same "gated" way that a relay or a CMOS gate
| does.
|
| But the BJT components still were integrated into chips, or
| assembled into standard design blocks that implemented
| recognizable Boolean operations, and synthesis of desired
| logical functions would use tools like Karnaugh maps that
| were (as I understand it) outgrowths of Shannon's approach.
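|
| As a toy example of the kind of reduction a Karnaugh map finds
| graphically (hypothetical Python; a brute-force truth-table
| check rather than an actual K-map):
|
|     from itertools import product
|
|     # (a AND b) OR (a AND NOT b) simplifies to just a.
|     original = lambda a, b: (a and b) or (a and not b)
|     simplified = lambda a, b: a
|
|     # Verify agreement on every row of the truth table.
|     for a, b in product((False, True), repeat=2):
|         assert original(a, b) == simplified(a, b)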
| egl2021 wrote:
| This is my candidate for the most influential master's thesis
| ever.
| dbcurtis wrote:
| > The fundamental problem of communication is that of
| reproducing at one point either exactly or approximately a
| message selected at another point. Frequently the messages
| have meaning...
|
| This is my candidate for the sickest burn in a mathematical
| journal paper...
| ShaneCurran wrote:
| Not many know about it, but this paper (written in 1948) stemmed
| from a lesser-known paper Shannon wrote in 1945 called "A
| Mathematical Theory of Cryptography"[0].
|
| [0]: https://evervault.com/papers/shannon
| SatvikBeri wrote:
| Among other things, this paper is surprisingly accessible. You
| can give it to a beginner without much math background and
| they'll be able to understand it. I actually find it better than
| most modern books on information theory.
| kouru225 wrote:
| Always upvote Shannon
| dilawar wrote:
| I find it incredible how "simple" his theories were and what an
| enormous impact they had. Is there anyone else who developed
| such seemingly "simple" theories?
| ImageXav wrote:
| If anyone is on the fence about reading this, or worried about
| their ability to comprehend the content, I would tell you to go
| ahead and give it a chance. Shannon's writing is remarkably lucid
| and transparent. The jargon is minimal, and his exposition is
| fantastic.
|
| As many other commenters have mentioned, it is impressive that
| such an approachable paper would lay the foundations for a whole
| field. I actually find that many subsequent textbooks seem to
| obfuscate the simplicity of the idea of entropy.
|
| Two examples from the paper really stuck with me. In one, he
| discusses the importance of spaces for encoding language,
| something which I had never really considered before. In the
| second, he discusses how it is the redundancy of language that
| allows for crosswords, and that a less redundant language would
| make it harder to design these (unless we started making them
| 3D!). It made me think more deeply about communication as a
| whole.
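|
| That redundancy is easy to see numerically; a sketch
| (hypothetical Python, with an arbitrary sample string):
|
|     import math
|     from collections import Counter
|
|     def entropy_bits_per_char(text: str) -> float:
|         # Zeroth-order entropy: -sum p log2 p over characters.
|         n = len(text)
|         counts = Counter(text)
|         return -sum(c / n * math.log2(c / n)
|                     for c in counts.values())
|
|     sample = "the quick brown fox jumps over the lazy dog "
|     h = entropy_bits_per_char(sample)
|     h_max = math.log2(27)  # 26 letters + space, all equally likely
|     print(f"empirical {h:.2f} vs maximum {h_max:.2f} bits/char")
|
| The gap between the two (larger still once letter-to-letter
| correlations are counted) is the redundancy that makes
| crosswords possible.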
| jp42 wrote:
| I am one of the few who is on the fence. This comment motivates
| me to give this paper a try. Thanks ImageXav!
| jessriedel wrote:
| Agreed. This is one of the all time great papers in that it
| both launched an entire field (information theory) and remains
| very accessible and pedagogical. A true gem.
| Anon84 wrote:
| He also presents the first (at least that I could find) instance
| of an auto-regressive (Markovian) language model as a clarifying
| example in the first 10 pages :)
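|
| For the curious, a minimal sketch of such a model (hypothetical
| Python; `some_book.txt` stands in for any training text, and
| Shannon built his approximations by hand from books):
|
|     import random
|     from collections import defaultdict
|
|     def build_model(text: str, order: int = 2) -> dict:
|         # Map each length-`order` context to the characters
|         # observed to follow it.
|         model = defaultdict(list)
|         for i in range(len(text) - order):
|             model[text[i:i + order]].append(text[i + order])
|         return model
|
|     def generate(model: dict, seed: str, order: int = 2,
|                  length: int = 80) -> str:
|         out = seed
|         for _ in range(length):
|             followers = model.get(out[-order:])
|             if not followers:
|                 break
|             out += random.choice(followers)
|         return out
|
|     corpus = open("some_book.txt").read()  # any text will do
|     print(generate(build_model(corpus), corpus[:2]))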
| contingencies wrote:
| > Two examples from the paper really stuck with me. In one, he
| discusses the importance of spaces for encoding language,
| something which I had never really considered before.
|
| As a westerner who has studied quite a few writing systems, this
| is kind of hard to interpret.
|
| Verbally, however, the timing of pauses is important in all
| languages I've learned. This would be a more coherent argument
| to place at the pan-lingual level than one related to written
| representation, which is pretty arbitrary (many languages have
| migrated scripts over the years; see for example the dual-script
| Devanagari/Arabic Hindi/Urdu divide, many other languages
| migrating to Arabic or Phags-pa, Vietnamese moving from Chinese
| characters to French diacritics, etc.).
|
| > In the second, he discusses how it is the redundancy of
| language that allows for crosswords, and that a less redundant
| language would make it harder to design these (unless we
| started making them 3D!). It made me think more deeply about
| communication as a whole.
|
| Yeah, good luck making a Chinese crossword. Not sure
| "redundancy" is the right term, however. Perhaps "frequent
| [even tediously repetitive?] glyph reuse".
| loph wrote:
| Shannon did a lot more interesting things than just this paper.
|
| If you become more interested in Claude Shannon, I recommend the
| biography "A Mind at Play".
|
| https://en.wikipedia.org/wiki/A_Mind_at_Play
|
| A very interesting person.
| pid-1 wrote:
| As an undergrad I struggled to understand why log was used to
| measure information. Could not find a reason in any textbook.
|
| Took a deep breath and decided to download and read this paper.
| Surprise, surprise: it's super approachable and the reasoning for
| using log is explained on the first page.
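|
| The gist of that first-page argument: independent choices
| multiply probabilities, while information ought to add, so the
| measure must be logarithmic. A sketch (hypothetical Python):
|
|     import math
|
|     def info_bits(p: float) -> float:
|         # Self-information of an event with probability p.
|         return -math.log2(p)
|
|     # Two independent fair coin flips: probabilities multiply,
|     # information adds (1 bit + 1 bit = 2 bits).
|     assert math.isclose(info_bits(0.5 * 0.5),
|                         info_bits(0.5) + info_bits(0.5))
|
|     # Equivalently, doubling the number of equally likely
|     # messages adds exactly one bit.
|     for n in (2, 4, 8, 16):
|         print(n, "messages:", info_bits(1 / n), "bits")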
___________________________________________________________________
(page generated 2024-05-03 23:00 UTC)