Claude Shannon's Information Theory

I recently read a biography of Claude Shannon, the first mind of Information Theory.

Before opening the book, I knew little about Shannon.

His is one of the two names in the Shannon-Weaver model of communication. He is the author of “The Mathematical Theory of Communication”. He defined the basic unit for measuring information, the bit, also called the shannon in his honor.

But there is so much more to him.

First, Information Theory was more than a theory.

It represented a fundamental change in how communications systems, and everything related to them, work.

These days we would call it a “digital transformation” because Shannon devised a way to improve something analog by making it digital.

But he introduced the first digital transformation … ever … in any context.

Biographers Jimmy Soni and Rob Goodman open with an anecdote about Shannon as a young boy in rural Michigan. He and a friend figured out how to send dot-dash telegraph messages to each other a half mile away through a fence wire that ran between their houses. The anecdote foreshadows everything that follows.

Skipping forward, after earning a bachelor's degree in mathematics at the University of Michigan, Shannon went to MIT to work with Vannevar Bush on the differential analyzer, and later spent a fellowship year at the Institute for Advanced Study in Princeton. The differential analyzer was an early mechanical computer, painstakingly built so that its gears and shafts were physical analogs of the problems it was trying to solve.

During this period at MIT, Shannon wrote his famous master's thesis, in which, rather than literally building a mechanical computer, he found a new way to render its parts and processes on paper as a symbolic system: Boolean algebra applied to relay and switching circuits.
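
To make that concrete, here is a tiny sketch of the thesis's core move in modern Python rather than relays (my own illustration, not something taken from the book or the thesis): switches wired in series behave like a Boolean AND, switches wired in parallel behave like a Boolean OR, so an entire circuit can be written down, and simplified, as a symbolic expression.

# Sketch of the idea in Shannon's master's thesis: switching circuits
# can be described with Boolean algebra. Series switches act like AND;
# parallel switches act like OR.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either one is closed (OR)."""
    return a or b

def circuit(x: bool, y: bool, z: bool) -> bool:
    """A small circuit: x in series with (y parallel z) = x AND (y OR z)."""
    return series(x, parallel(y, z))

# Tabulating the circuit's behavior recovers the truth table of the
# Boolean expression, which is exactly the correspondence the thesis exploits.
for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            print(int(x), int(y), int(z), "->", int(circuit(x, y, z)))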

After MIT, Shannon moved to New York City, where he worked at the then-thriving Bell Laboratories. At Bell Labs he was employed as a mathematician, but he enjoyed working on all types of projects, including building a mechanical, maze-beating mouse called Theseus.

While at Bell Labs, Shannon’s work on cryptography led to his seminal paper “A Mathematical Theory of Communication,” published in the Bell System Technical Journal when he was only 32 years old. Everyone who read it (which wasn’t many people, because it was a technical paper written in a mix of prose and symbolic math) understood the widespread implications of packaging information as digital bits rather than as analog signals.
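
To see why bits matter so much, here is a toy simulation (my own sketch, with made-up noise numbers, not anything from the paper): an analog signal picks up a little noise at every relay along the way and the noise accumulates, while a digital signal can be re-decided back to clean 0s and 1s at each stage.

# Toy comparison of analog vs digital relaying. The channel model here
# (small Gaussian noise at each of 20 relay stages) is an illustrative
# assumption, not something specified in Shannon's paper.
import random

random.seed(0)

def noisy(value, sigma=0.1):
    """Pass a value through a crude noisy channel."""
    return value + random.gauss(0.0, sigma)

bits = [1, 0, 1, 1, 0, 1, 0, 0]
stages = 20

# Analog chain: each stage forwards exactly the noisy value it received,
# so the noise accumulates from stage to stage.
analog = [float(b) for b in bits]
for _ in range(stages):
    analog = [noisy(v) for v in analog]

# Digital chain: each stage decides 0 or 1 before forwarding,
# so the signal is regenerated cleanly at every stage.
digital = [float(b) for b in bits]
for _ in range(stages):
    digital = [1.0 if noisy(v) > 0.5 else 0.0 for v in digital]

print("sent:   ", bits)
print("analog: ", [round(v, 2) for v in analog])   # drifted away from 0s and 1s
print("digital:", [int(v) for v in digital])       # still the original bits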

There are too many specific breakthroughs that owe their origin to this idea to list here, but it is safe to say that there would be no internet without bits.

Shannon’s biographers jump back and forth between his achievements and the cast of characters that passes through his life along the way.

Betty Shannon, his wife and a mathematician in her own right, who did all of the wiring underneath the learning robot Theseus, figures consistently and prominently.

John von Neumann, the polymath who suggested using the term “entropy” for the measure of how predictable, or unpredictable, a source of information is, shows up right on time.
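
For the curious, the quantity von Neumann helped name has a simple form: the entropy of a source is H = -Σ p log2(p), summed over the probabilities of its possible symbols, and it is measured in bits. A quick sketch (my own, not from the book):

# Shannon entropy: the average unpredictability of a source, in bits.
from math import log2

def entropy(probs):
    """H = -sum(p * log2(p)) over the nonzero probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # a fair coin: 1.0 bit per flip, maximally unpredictable
print(entropy([0.9, 0.1]))   # a biased coin: about 0.47 bits, much more predictable
print(entropy([1.0]))        # a certain outcome: 0.0 bits, no information at all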

Alan Turing, the computer scientist, met Shannon for tea at Bell Labs and, despite wartime secrecy protocols, became a long-distance but true friend and peer.

You can learn more here:

https://princetonlibrary.bibliocommons.com/item/show/1387353057

https://en.wikipedia.org/wiki/Claude_Shannon
