
When Did That Happen?

The Most Famous Master’s Thesis in History

by Dan Wohlbruck

Dan continues his series on events in computer history with a look at the world-changing insight of Claude Shannon.

Although he was responsible for one of the twentieth century’s breakthrough achievements in digital networking, Claude Shannon is almost alone among tech heroes in being without a biographer. We do know this: he was born on April 30, 1916, in Petoskey, Michigan, which makes his story apt for the April issue of PragPub. And we know the story of how he came to write the most famous master’s thesis in history….

From the corporate archives of Bell Labs, where he later worked, we know that Shannon graduated from the University of Michigan in 1936 with a bachelor’s degree in mathematics and electrical engineering. He went on to MIT later that year to study under Vannevar Bush and to work with MIT’s Differential Analyzer.

The Differential Analyzer

Vannevar Bush himself deserves several pages in any computer history, having founded Raytheon, anticipated the Web with his prescient Memex concept, directed scientific efforts during World War II for the White House, including the Manhattan Project, and built an early analog computer. That computer, the Differential Analyzer, is what concerns us here.

Bush was at the time an associate professor at MIT, working on the problem of transients in electrical power networks. Because evaluating transients required extensive calculation, he became interested in analog computational devices, and by 1931 he and his students at MIT had completed his Differential Analyzer. The Analyzer, a mechanical analog computer that could solve ordinary differential equations up to the sixth order, was so large that it occupied a full room. Before it could attempt the solution of a particular differential equation, the machine’s circuits had to be redesigned and rewired. Despite these difficulties, the Analyzer was a tool much in demand by students and professors at MIT.

About this time, Claude Shannon arrived at MIT and, like college students of every generation, he needed money. He noticed a job posting on a bulletin board on campus announcing that someone was needed to work with visiting scientists and wire the Differential Analyzer. Shannon applied for and was given the job.

It was while helping a visiting researcher wire the Analyzer that Shannon had the idea he would hone into the most famous master’s thesis in history. What follows describes that idea and how it led to today’s digital networks.

Shannon helped scientists set up their problems by rearranging the Analyzer so that the machine’s movements would be synchronized with the mathematical equations.

Dr. Charles Vest, a former president of MIT, recalled Shannon telling him how one night it just dawned on him that the circuits he was building were like the Boolean logic he had studied at Michigan. Shannon saw that switches could be combined in circuits in such a manner as to carry out symbolic logic operations. He honed the idea, this marriage of electric circuits with symbolic logic, into a master’s thesis titled “A Symbolic Analysis of Relay and Switching Circuits.” At the urging of Dr. Bush, it was published in 1938, and in 1940 it was honored with the Alfred Noble Prize, an award of the American engineering societies. That thesis has been called the most important, and the most famous, master’s thesis ever written.

So what was so special about this thesis?

Shannon’s Thesis

Shannon introduces his thesis thus:

“The method of attack on these problems may be described briefly as follows: any circuit is represented by a set of equations, the terms of the equations corresponding to the various relays and switches in the circuit. A calculus is developed for manipulating these equations by simple mathematical processes, most of which are similar to ordinary algebraic algorithms. This calculus is shown to be exactly analogous to the calculus of propositions used in the symbolic study of logic.”

At the time, to bother with equations just to create a circuit design would have been seen as innovative to the point of eccentricity. Before Shannon’s treatise, circuit design was an art, not a science. Shannon’s part-time job of synchronizing the electrical and mechanical parts of the Analyzer taught him that there wasn’t much design in the process, just a lot of trial and error. First, he would wire a circuit, energize it, and discover that it didn’t work as intended. Then he would follow the wires, testing each relay, to find where he had gone wrong. One imagines that one night after a particularly tedious circuit repair, Shannon was driven to find a way to make circuit design a science.

Shannon begins his argument with the following postulate:

“[A]t any given time the circuit between any two terminals must be either open (infinite impedance) or closed (zero impedance). Let us associate a symbol Xab or simply X, with the terminals a and b. This variable, a function of time, will be called the hinderance [sic] of the two-terminal circuit a-b. The symbol 0 (zero) will be used to represent the hinderance of a closed circuit, and the symbol 1 (unity) to represent the hinderance of an open circuit. Thus when the circuit a-b is open Xab equals 1 and when closed Xab equals 0.”

Shannon went on to define the plus sign to mean the series connection of two-terminal circuits, whose hinderances are added together. He also defined the product of two hinderances to mean the hinderance of the circuit formed by connecting the two circuits in parallel. After these two definitions, Shannon said, “This choice of symbols makes the manipulation of hinderances very similar to ordinary numerical algebra.”
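Shannon’s two definitions can be sketched in modern code. The sketch below is mine, not Shannon’s; it models a hinderance as a Python 0/1 value. Note that because 0 means “closed,” a series connection behaves like a logical OR of hinderances and a parallel connection like a logical AND, the mirror image of today’s usual convention in which 1 means “on.”

```python
# A minimal sketch of Shannon's hinderance algebra.
# Hinderance: 0 = closed circuit (current flows), 1 = open circuit.

def series(x, y):
    """Shannon's 'sum': two circuits wired in series.
    The path is open (hinderance 1) if either circuit is open."""
    return x | y  # behaves like 0 + 0 = 0, 0 + 1 = 1, 1 + 1 = 1

def parallel(x, y):
    """Shannon's 'product': two circuits wired in parallel.
    The path is open only if both branches are open."""
    return x & y

# The algebra matches circuit intuition: a switch wired in series
# or in parallel with itself behaves like the switch alone.
for x in (0, 1):
    assert series(x, x) == x    # X + X = X
    assert parallel(x, x) == x  # X . X = X
```

Identities like the ones checked in the loop are exactly what let Shannon simplify a circuit on paper, eliminating redundant relays before anything was wired.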

With his definitions, postulates, and theorems, Shannon proceeded to use Boolean algebra to design real electrical circuits. In short, he moved the design process from art to science.

At the conclusion of his thesis, Shannon gave examples of several circuits he had designed with his algebra. One of the examples was a circuit that added two numbers. Shannon called the example an electric adder to the base two and he explained that although any numbering system could be used, the circuit would be greatly simplified when each digit was either a 0 or a 1.
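Shannon built his adder from relays, but the same base-two addition can be sketched with the Boolean operations his calculus provides. The sketch below is an illustration of the idea, not the circuit from the thesis: each digit is a 0 or a 1, a half adder produces a sum bit and a carry bit, and full adders are chained to add numbers of any length.

```python
# A sketch of an "electric adder to the base two" using Boolean operations.
# Not Shannon's actual relay circuit, just the same logic in Python.

def half_adder(a, b):
    """Add two bits: returns (sum bit, carry bit)."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add_base_two(x_bits, y_bits):
    """Add two equal-length bit lists, least significant bit first."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # final carry becomes the top bit
    return out

# 3 + 1 = 4, with bits written least significant first:
# add_base_two([1, 1], [1, 0]) -> [0, 0, 1]
```

The simplification Shannon pointed out is visible here: with only 0s and 1s, each digit position reduces to a handful of switching operations, which is exactly why binary won.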

And with just 0s and 1s, the digital circuit was born.


Not surprisingly, Shannon was awarded a master’s degree for his work, which he followed with a PhD. He continued to build on his deep insight into the logic of computing for the rest of his career. He worked with the Army in World War II, and in 1946 he prepared a classified report titled “Communication Theory of Secrecy Systems.” His greatest work came in 1948 when, with little fanfare, he published his seminal paper on information theory.

But that’s another story altogether.

Dr. Claude Elwood Shannon, the American mathematician and computer scientist whose theories laid the groundwork for the electronic communications networks that now lace the earth, died in February 2001 at his home in Medford, Mass. He was 84. But it all started in April 1916—and that’s when it happened.

Dan Wohlbruck has over 30 years of experience with computers, with over 25 years of business and project management experience in the life and health insurance industry. He has written articles for a variety of trade magazines and websites. He is currently hard at work on a book on the history of data processing.

Send the author your feedback or discuss the article in the magazine forum.