Claude Shannon

Information theory thinker

Image supplied rights-cleared by the Chartered Management Institute, 2015.

Claude Shannon (1916-2001) had considerable talents and interests spanning electrical circuitry, mathematics, cryptology and code breaking, and his early work in these areas evolved into the concept of information theory, a discipline Shannon introduced in his seminal 1948 paper A mathematical theory of communication. In it, he set out a new and innovative theory of communication. This pioneering publication laid the foundations for information theory and earned Shannon the title of its founding father.


Shannon understood signal processing and its relationship to data storage, compression and transmission. Crucially, he treated information as a measurable quantity, independent of its meaning. He showed that all communication, as diverse as radio waves, text, pictures and telephone signals, can be encoded in binary digits. Before his work, the different types of communication had been transmitted over entirely separate media and treated as unrelated problems. This unified approach to communication was groundbreaking and set Shannon apart, making him one of the most significant early pioneers of the digital age we live in today.
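Shannon's unifying idea can be illustrated in a few lines of modern code (a sketch for illustration, not Shannon's own formulation): any text message reduces to a stream of binary digits, just as sound, images and signals do.

```python
# A minimal sketch of Shannon's insight: any message, whatever its
# medium, can be reduced to a common currency of binary digits (bits).
def to_bits(message: str) -> str:
    """Encode a text message as a string of 0s and 1s (8 bits per byte)."""
    return "".join(f"{byte:08b}" for byte in message.encode("utf-8"))

bits = to_bits("Hi")
print(bits)       # '0100100001101001' -- 'H' then 'i'
print(len(bits))  # 16 bits
```

The same reduction applies to digitised audio or images: once everything is bits, one theory of transmission covers them all.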

Life and career

Born in Michigan, USA in 1916, Shannon first showed his passion and talent for mechanics and electronics through his childhood inventions. He continued to invent games and mechanical devices into adulthood, indulging his love of games as well as his innate creativity. One such invention was an electromechanical maze-solving mouse, an early foray into artificial intelligence; another was a device for calculating casino odds. Whether frippery or fun, Shannon's 'playful' inventions left their mark.

As a young man, he cemented his interests through academic study, graduating from the University of Michigan in 1936 with degrees in mathematics and electrical engineering. Shannon then pursued graduate study at the Massachusetts Institute of Technology (MIT), where he worked with early analogue computers as an assistant to Vannevar Bush, a pioneer in the field.

His interest in circuit systems and mathematical relationships developed further during his studies, when he was exposed to the work of George Boole, of Boolean logic fame. The tenacious young Shannon worked out how Boole's logic could be applied to electrical switches, notably the relay switches used to route telephone calls. Shannon was not the first to notice a link between logic and switching, but it was his thesis on the subject that made the approach widely known, and it effectively formed the basis of digital circuit design.
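The correspondence Shannon described can be sketched in a few lines of modern code (an illustration of the idea, not his notation): switches wired in series behave like Boolean AND, and switches wired in parallel like Boolean OR.

```python
# A sketch of Shannon's thesis insight: relay circuits obey Boolean
# algebra. Series wiring is AND, parallel wiring is OR.
def series(a: bool, b: bool) -> bool:
    """Current flows only if both switches are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Current flows if either switch is closed (OR)."""
    return a or b

# A hypothetical routing condition: the call goes through if switch s1
# is closed AND either s2 or s3 is closed.
def call_routes(s1: bool, s2: bool, s3: bool) -> bool:
    return series(s1, parallel(s2, s3))

print(call_routes(True, False, True))   # True
print(call_routes(True, False, False))  # False
```

Analysing a circuit thus reduces to simplifying a Boolean expression, which is exactly what made the approach so powerful for circuit design.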

After the success and recognition of his early work, Shannon turned his considerable talents to cryptology, culminating in work on code breaking and the design of secure telecommunications. This led Shannon to prove that the one-time pad, an encryption technique, is unbreakable. From this he concluded that any cipher must share the essential properties of the one-time pad (a truly random key, as long as the message, used only once) if it is to remain secure. Although numerous names have been credited with the development and use of the one-time pad since Frank Miller's first description in 1882, Shannon was the first to demonstrate theoretically that the system is completely secure.
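A minimal sketch of the one-time pad in modern code (the function names are illustrative, not Shannon's):

```python
import secrets

# One-time pad sketch: XOR each message byte with a truly random key
# byte. If the key is as long as the message, fully random, and never
# reused, the ciphertext reveals nothing about the plaintext -- the
# "perfect secrecy" Shannon proved.
def otp_encrypt(message: bytes, key: bytes) -> bytes:
    assert len(key) == len(message), "key must be as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

otp_decrypt = otp_encrypt  # XOR is its own inverse

plaintext = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(plaintext))  # random, single-use key
ciphertext = otp_encrypt(plaintext, key)
assert otp_decrypt(ciphertext, key) == plaintext
```

The practical weakness Shannon's result exposes is key management: the security evaporates the moment a key is reused or is shorter than the message.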

Many of Shannon’s early achievements were made in national defence work during the Second World War, while he was employed at Bell Laboratories. It was at Bell Labs that Shannon produced A mathematical theory of communication.

Shannon joined the MIT faculty in 1956, where he remained until 1978.


Shannon was influenced by many of his predecessors and contemporaries whose ideas he embraced and admired. Alongside George Boole and Vannevar Bush, Shannon was also inspired by Harry Nyquist (pioneering work on telegraph speeds); Ralph Hartley (researcher on repeaters, carrier and voice transmission); and Alan Turing (eminent British cryptologist and mathematician).

Inspired by the work of such men, Shannon expanded upon existing theories and discoveries to become the first person to treat communication as a statistical process, understanding as he did that information is measurable. This insight underpins information theory. Through their own work, fellow Bell Laboratories employees Nyquist and Hartley undoubtedly helped Shannon lay the foundations of what was to become information theory. Although Shannon was not always the originator of an idea, he had the knowledge and talent to take the seeds of ideas sown by others, probing deeper and researching more fully until a developed theory could be proven and substantiated mathematically.

Key theories

Shannon’s early insights into communication theory and cryptology during the Second World War deepened his commitment to this field of research and led to the development of groundbreaking theories. His mathematical leanings are evident throughout his research: his ideas and logic are substantiated with rigorous mathematical equations and formulae.

Shannon’s work on code breaking and communication theory led him to ask how best to encode the information a sender wants to transmit so that the intended message arrives and is understood. Recognising that uncertainty is inherent in communication and can lead to miscommunication, Shannon developed the notion of information entropy as a statistical measure of that uncertainty: how much information, on average, is contained in a received message.
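Entropy has a precise formula, H = -Σ p·log2(p), measured in bits. A small sketch (a modern illustration, not Shannon's own code) shows how it quantifies uncertainty:

```python
import math
from collections import Counter

# Shannon entropy: the average number of bits of information per
# symbol, given each symbol's probability. More uncertainty means
# more information per received symbol.
def entropy(probabilities) -> float:
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less information per toss.
print(entropy([0.9, 0.1]))   # ~0.47

# Entropy of the empirical symbol distribution of a message:
def message_entropy(text: str) -> float:
    counts = Counter(text)
    total = len(text)
    return entropy(c / total for c in counts.values())
```

A message whose symbols are highly predictable has low entropy, which is precisely why it can be compressed.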

Several further results derive from Shannon’s information theory, such as channel capacity and the noisy-channel coding theorem. His source coding theorem establishes the limit to which data can be compressed without loss: on average, no fewer bits per symbol than the source’s entropy. The Shannon-Hartley theorem, meanwhile, gives the maximum rate at which information can be transmitted reliably over a channel of given bandwidth in the presence of noise.
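The Shannon-Hartley limit can be evaluated directly, as this short sketch shows (the telephone-line figures are illustrative values, not from the original text):

```python
import math

# Shannon-Hartley theorem: a channel of bandwidth B hertz with
# signal-to-noise power ratio S/N can carry at most
#   C = B * log2(1 + S/N)
# bits per second with arbitrarily low error.
def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz telephone line with an SNR of 30 dB
# (a power ratio of 1000) tops out near 30 kbit/s.
snr = 10 ** (30 / 10)               # 30 dB -> power ratio of 1000
print(channel_capacity(3000, snr))  # ~29,902 bits per second
```

No amount of clever engineering can beat this bound; it can only approach it, which is what modern modem and coding design does.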

Communication process model

Warren Weaver, renowned scientist and pioneer of machine translation, helped popularise The mathematical theory of communication and together, Shannon and Weaver developed the first model of the communication process.

The model was originally developed to describe mechanical messaging systems for the Bell Telephone Company. Shannon and Weaver’s model demonstrates what happens when we communicate, illustrating the exchange as a series of steps. In crude terms, the stages are these:

  1. The communicator encodes the information they wish to impart into appropriate words and passes it to a transmitting device - the words can equally be spoken or written.
  2. The encoded message is sent through a communication channel (voice, email etc.) toward the receiver.
  3. Along the way, the message may be joined by other sounds and distractions (noise) that can alter its meaning.
  4. The receiver decodes the message, reconstructing its meaning and filtering out noise as far as possible.
  5. The message is received and understood.
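The steps above can be sketched as a toy simulation in modern code (all names are illustrative): a sender encodes a message into bits, the channel randomly corrupts some of them, and the receiver decodes what arrives.

```python
import random

# Toy Shannon-Weaver pipeline: encode -> noisy channel -> decode.
def encode(message: str) -> list[int]:
    """Sender: turn the message into a stream of bits."""
    return [int(b) for byte in message.encode() for b in f"{byte:08b}"]

def noisy_channel(bits: list[int], flip_prob: float) -> list[int]:
    """Channel: each bit is flipped (corrupted) with probability flip_prob."""
    return [bit ^ 1 if random.random() < flip_prob else bit for bit in bits]

def decode(bits: list[int]) -> str:
    """Receiver: reassemble bytes from the bit stream."""
    chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return bytes(int("".join(map(str, c)), 2) for c in chunks).decode(errors="replace")

sent = "HELLO"
received = decode(noisy_channel(encode(sent), flip_prob=0.02))
# With noise present, the received message may differ from the one sent.
```

Running this with a non-zero flip probability occasionally garbles characters, which is exactly the failure mode the model is designed to expose.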

Shannon and Weaver’s model effectively demonstrates what happens when we communicate with others. It highlights the hazards a message encounters between transmission and receipt, which can distort content and lead to miscommunication. They recognised that a message can be compromised by interfering ‘noise’, leading to communication failure, and that transmitting the same message through a combination of channels, such as verbal and digital, increases the potential for miscommunication.

By approaching the communication process as a series of steps, Shannon and Weaver made it possible to pinpoint where in the process communication may fail. They showed unequivocally that a problem at any stage (encoding, channel, noise or decoding) can result in the failure or breakdown of effective information transmission and retrieval.

In perspective

Shannon has received countless awards and accolades including numerous honorary doctorates, the National Medal of Science which was awarded to him by President Johnson in 1966, and a posthumous entry into the National Inventors’ Hall of Fame in 2004.

Outside the sphere of technology and information experts, Shannon’s truly groundbreaking work remains largely unknown to the general public. Yet his contribution and impact are immense. Through his work, Shannon effectively introduced the information and computing community to the limitations, and possibilities, of data storage, compression and transmission, paving the way for such inventions as CDs, DVDs, MP3s and JPEGs, among others.

His work transformed telecommunications, driving the move from analogue transmission to digital. His findings also paved the way for understanding the digital communications and content messaging that are so relevant today. Shannon’s insight that information content can be measured underlies the way digital transmission rates are expressed today, in bits per second (bps).

Such is the impact of Shannon’s work on today’s technical world that it can easily be argued he was instrumental in laying the very foundations of the digital revolution of the twenty-first century. The application of his work also extends to other disciplines, such as physics, statistics, economics and psychology.

As for Shannon himself, he was uncomfortable in the spotlight. As information theory became the buzzword of the day, he gently retreated from teaching and academic life, seemingly surprised by the impact and overwhelming response his work and ideas generated.

Further reading

Key works by Claude Shannon


With Weaver, W. The mathematical theory of communication. Urbana, Ill.: University of Illinois Press, 1963


A symbolic analysis of relay and switching circuits [MS thesis]. Massachusetts Institute of Technology, 1937

An algebra for theoretical genetics [PhD thesis]. Massachusetts Institute of Technology, 1940

Journal articles

A mathematical theory of communication. Bell System Technical Journal, 27 (3) July 1948, pp.379-423

Communication theory of secrecy systems. Bell System Technical Journal, 28 (4) October 1949, pp.656-715
