Claude Shannon's Contributions to Computer Science, Telecommunications, and Information Theory
Dr. Charles C. Tappert, Professor of Computer Science, Seidenberg School of CSIS
2016 Research Day

Life Overview
1916 (April 30): born in Petoskey, Michigan
1936: BS EE and BS Math, University of Michigan
1937: MS EE, MIT; thesis applied Boole's logic to circuit design (considered the best MS thesis of the century)

1940: PhD, MIT, Department of Mathematics
1940: National Research Fellow, Institute for Advanced Study, Princeton, NJ (Hermann Weyl, von Neumann, Einstein, Gödel)
1941-1956: Bell Labs, Murray Hill, New Jersey
During World War II worked on fire-control systems and cryptography
1943: met and shared ideas with Alan Turing at Bell Labs
Proved the cryptographic one-time pad unbreakable

Invented information theory: key paper in 1948
1956-1978: MIT Research Lab of Electronics
2001 (February 24): died at age 84 (almost 85)

Shannon's Source Coding Theorem: The Communication Channel
The source generates a message s, a sequence of symbols
s is encoded into channel input codewords: x = g(s)
The channel maps input x to output y, which is decoded back into an estimate of s
(A minimal sketch of this pipeline follows below.)
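To make the notation concrete, here is a minimal Python sketch of the noiseless discrete channel case: an encoder g maps source symbols to codewords (x = g(s)), the channel delivers them unchanged, and the decoder inverts g. The three-symbol code table is an illustrative assumption, not from the slides.

```python
# Encoder g: source symbols -> channel codewords (illustrative code table)
g = {"a": "0", "b": "10", "c": "11"}
g_inverse = {code: sym for sym, code in g.items()}

def encode(message: str) -> str:
    """x = g(s): concatenate the codeword for each source symbol."""
    return "".join(g[s] for s in message)

def noiseless_channel(x: str) -> str:
    """A noiseless channel delivers its input unchanged: y = x."""
    return x

def decode(y: str) -> str:
    """Recover s from y; works because the code table is prefix-free."""
    out, buf = [], ""
    for bit in y:
        buf += bit
        if buf in g_inverse:
            out.append(g_inverse[buf])
            buf = ""
    return "".join(out)

s = "abac"
assert decode(noiseless_channel(encode(s))) == s
```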

Channel Capacity
Channel capacity is the maximum information that can be communicated from channel input to output.

Shannon's Source Coding Theorem on Channel Capacity
Shannon's 1948 paper "A Mathematical Theory of Communication" studied the theoretical limits on the transmission of information over a channel from source to receiver, where the source is encoded before transmission.

He considered two types of channels, continuous and discrete, and he dealt with added noise. Only the discrete channel without noise is described in this brief presentation.

Preliminaries to Shannon's Theorem
Example: a fair coin (50% heads, 50% tails) and a biased coin (90% heads, 10% tails).

Shannon Information = log2(1/p(x)): a measure of the surprise of one outcome
90-10 coin, p(tail) = 0.1 (surprise!): Shannon information = 3.32 bits
50-50 coin, p(head) = p(tail) = 0.5: Shannon information = 1.00 bit
90-10 coin, p(head) = 0.9: Shannon information = 0.15 bits
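These values are easy to verify; a quick Python check (the function name is mine):

```python
from math import log2

def shannon_information(p: float) -> float:
    """Surprise of one outcome with probability p, in bits."""
    return log2(1 / p)

print(shannon_information(0.1))   # 3.32 bits (tail of the 90-10 coin)
print(shannon_information(0.5))   # 1.00 bit  (either face of the fair coin)
print(shannon_information(0.9))   # 0.15 bits (head of the 90-10 coin)
```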

Shannon Entropy H(X) is Average Shannon Information
For a probability distribution p(X) = p(x1), ..., p(xm):
H(X) = Σ_{i=1..m} p(xi) log2(1/p(xi))
Examples:
50-50 coin: H(X) = 0.5*log2(1/0.5) + 0.5*log2(1/0.5) = 1.00 bit
90-10 coin: H(X) = 0.9*log2(1/0.9) + 0.1*log2(1/0.1) = 0.47 bits
H(X) obeys the properties a measure of information should possess: continuity, symmetry, maximum value, and additivity.

[Figure: Shannon entropy H(X) versus coin bias (probability of a head); H peaks at 1.00 bit for the 50-50 coin and drops to 0.47 bits for the 90-10 coin.]
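The same examples, plus the curve in the figure, in a short Python sketch (the entropy helper is mine):

```python
from math import log2

def entropy(probs) -> float:
    """Average Shannon information H(X) = sum of p*log2(1/p), in bits."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.00 bit  (50-50 coin)
print(entropy([0.9, 0.1]))   # 0.47 bits (90-10 coin)

# Trace H versus coin bias: the curve peaks at a bias of 0.5.
for bias in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p(head)={bias:.1f}  H={entropy([bias, 1 - bias]):.2f} bits")
```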

Shannon's Source Coding Theorem: The Communication Channel (recap)
Recall the channel model: the source generates a message s; s is encoded into channel input codewords x = g(s); the channel maps input x to output y, which is decoded back into an estimate of s.

Shannon's Source Coding Theorem
Shannon's optimal channel capacity equals the maximum entropy over the possible probability distributions:
C = max over p(X) of H(X) bits/sec
To guarantee that each binary digit carries maximum information, the optimal distribution of code values should correspond to the independently and identically distributed case. But natural signals convey information in dilute form.

E.g., pixels close to each other in images tend to have similar values, and successive characters in English text are related to each other. Thus, the theorem is really about the coding of messages before transmission through the channel, but it does not state how such recoding can be achieved. It inspired the creation of coding methods: Huffman coding, etc. (a sketch follows below).
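The theorem does not say how to construct such codes; Huffman's algorithm is the classic answer. A minimal Python sketch, with a toy alphabet and probabilities that are illustrative, not from the slides:

```python
import heapq

def huffman_code(probs: dict) -> dict:
    """Build a prefix-free code: repeatedly merge the two least probable nodes."""
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)  # tie-breaker so heapq never compares the dicts
    while len(heap) > 1:
        p0, _, left = heapq.heappop(heap)
        p1, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

# Shorter codewords go to the more probable symbols.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
print(code)  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
avg = sum(probs[s] * len(c) for s, c in code.items())
print(avg)   # 1.75 bits/symbol, which equals the entropy H(X) here
```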

Shannon's Source Coding Theorem
So far we have dealt mostly with variables having a uniform distribution, but the importance of the theorem only becomes apparent for non-uniform distributions. Consider the numeric outcome of throwing a pair of 6-sided dice: with 11 symbols (the sums 2 through 12), a simple fixed-length 4-bit binary code costs 4.00 bits/symbol, more than the entropy of the dice-sum distribution (computed in the sketch below).
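The gap is easy to quantify; a minimal Python sketch (assuming fair dice) computes the entropy of the two-dice sum for comparison with the 4.00 bits/symbol of the fixed-length code:

```python
from math import log2
from collections import Counter
from itertools import product

# Distribution of the sum of two fair 6-sided dice (11 symbols: 2..12).
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
probs = [c / 36 for c in counts.values()]

H = sum(p * log2(1 / p) for p in probs)
print(f"H = {H:.2f} bits/symbol")   # about 3.27, below the 4.00 of a 4-bit code
```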

Shannon's Information Entropy vs. Boltzmann-Gibbs Thermodynamic Entropy
Shannon's information entropy measures information:
H(X) = Σ_{i=1..m} p(xi) log2(1/p(xi))
Boltzmann-Gibbs thermodynamic entropy measures the number of possible states of a physical system (e.g., a jar of gas molecules). (Note the syntactic similarity to Shannon's equation.)

Shannon's Major Contributions
Father of the electronic communications age
Applied Boolean algebra to electrical systems

Inventor of information theory
Foundation of the digital revolution

Shannon's Hobbies and Inventions
Hobbies:
Juggling, unicycling, and chess
Card counting at blackjack in Las Vegas
Portfolio investment management (made him wealthy)
Inventions:

Signal flow graphs
Roman numeral computer called THROBAC
Juggling machines and a flame-throwing trumpet
First wearable computer, built to calculate roulette odds
Ultimate useless machine
Device to solve the Rubik's Cube puzzle
Magnetic maze-solving mouse called Theseus
Shannon's maxim, a reformulation of Kerckhoffs's principle
Computer chess program using the minimax procedure (see the sketch after this list)
Minivac 601 Digital Computer Kit
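Shannon's chess program searched the game tree with the minimax procedure. Below is a bare-bones Python sketch of that recursion; the position interface (is_terminal, evaluate, children) is a hypothetical stand-in, not Shannon's implementation:

```python
def minimax(position, depth: int, maximizing: bool) -> float:
    """Return the minimax value of a game position.

    `position` is assumed to expose is_terminal(), evaluate(), and children();
    these names are hypothetical stand-ins for a real game implementation.
    """
    if depth == 0 or position.is_terminal():
        return position.evaluate()
    values = (minimax(child, depth - 1, not maximizing)
              for child in position.children())
    # The maximizing player picks the best child value; the opponent the worst.
    return max(values) if maximizing else min(values)
```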

Shannon's Maze-Solving Mouse
An early example of machine learning

Shannon's Minivac 601 Digital Computer Kit

Shannon's Ultimate Useless Machine

Shannon Quotes
"We know the past but cannot control it. We control the future but cannot know it."
"I am very seldom interested in applications. I am more interested in the elegance of a problem. Is it a good problem, an interesting problem?"
"I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines."
"My greatest concern was what to call it. I thought of calling it information, but the word was overly used, so I decided to call it uncertainty. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

Shannon Videos
Information Theory Society: https://www.youtube.com/watch?v=pHSRHi17RKM
Father of the information age: https://www.youtube.com/watch?v=z2Whj_nL-x8
Tech icons: Claude Shannon: https://www.youtube.com/watch?v=z7bVw7lMtUg

The man who turned paper into pixels: https://vimeo.com/98345492
Shannon demonstrates machine learning: https://www.youtube.com/watch?v=vPKkXibQXGA
Juggling machines: https://www.youtube.com/watch?v=sBHGzRxfeJY

Many Shannon Centenary Celebrations
IEEE Information Theory Society: http://www.itsoc.org/resources/Shannon-Centenary
Bell Labs:

http://spectrum.ieee.org/tech-talk/telecom/internet/bell-labs-looks-at-claude-shannon-legacy-future-of-information-age
U.S. Postal Stamp Proposal: http://www.itsoc.org/news-events/recent-news/shannons-centenary-us-postal-stamp

Primary References
Information Theory: A Tutorial Introduction, James V. Stone, Sebtel Press, 2015

The Logician and the Engineer, Paul J. Nahin, Princeton Univ. Press, 2013
Wikipedia, Claude Shannon: https://en.wikipedia.org/wiki/Claude_Shannon
