An Insightful Life
Claude Shannon is one of my intellectual heroes.
His MIT master’s thesis, submitted in 1937, laid the foundation for digital circuit design. (My MIT master’s thesis, submitted some 70 years later, has so far proven somewhat less influential.)
His insight was simple. The wires, relays, and switches that made up the complex circuits he encountered at AT&T could be understood as the terms and operators of logic statements expressed in the Boolean algebra he had encountered as a math major at the University of Michigan.
Though simple, this insight had huge impact. It meant that circuits could be designed and optimized in the abstract and precise language of mathematics, and then transformed back to soldered wires and finicky magnetic coils only at the last step — enabling staggering leaps in circuit complexity.
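To make the idea concrete, here is a small sketch (my own illustration, not Shannon's notation): two switches in series behave like a logical AND, two parallel branches like an OR, and Boolean identities let you prove a simpler circuit is equivalent before touching any hardware.

```python
from itertools import product

# A hypothetical relay circuit: switch A in series with B, wired in
# parallel with A in series with (NOT B). Series = AND, parallel = OR.
def circuit(a: bool, b: bool) -> bool:
    return (a and b) or (a and not b)

# Boolean algebra: (A AND B) OR (A AND NOT B) simplifies to just A,
# so switch B can be eliminated from the physical circuit entirely.
def simplified(a: bool, b: bool) -> bool:
    return a

# Verify the two designs agree on every input combination.
assert all(
    circuit(a, b) == simplified(a, b)
    for a, b in product([False, True], repeat=2)
)
```

The point of the exercise is exactly the one in the paragraph above: the optimization happens entirely in the algebra, and only the final, simpler expression needs to be built out of actual relays.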
But he wasn’t done. A decade later, inspired in part by his wartime research efforts, Shannon developed information theory: a mathematical framework that formalizes both techniques and fundamental limits for reliably transmitting information over noisy channels.
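The central quantity of that framework is entropy, which measures the average information per symbol a source produces and sets a hard floor on how compactly it can be encoded. A minimal sketch of the standard formula (the example sources are mine, for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)) over the symbol
    probabilities. This is the average information per symbol, and
    the fundamental limit on lossless compression of the source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

entropy([0.5, 0.5])  # a fair coin: exactly 1 bit per flip
entropy([0.9, 0.1])  # a biased coin: ~0.47 bits per flip
```

The biased coin's flips carry less than half a bit each, which is why its outcomes can, in the limit, be transmitted with fewer than half as many binary digits as flips.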
Put another way, Shannon’s master’s thesis laid the foundation for digital computers, while his information theory paper laid the foundation for digital communication.
Not a bad legacy.