
A Mathematical Theory of Communication

From Wikipedia, the free encyclopedia

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.[1][2][3][4] It was renamed The Mathematical Theory of Communication in the 1949 book of the same name,[5] a small but significant title change after realizing the generality of this work. It has tens of thousands of citations which is rare for a scientific article and gave rise to the field of information theory. Scientific American referred to the paper as the "Magna Carta of the Information Age".[6]

YouTube Encyclopedic

  • A mathematical theory of communication | Computer Science | Khan Academy
  • Information Theory part 11: Claude Shannon: A Mathematical Theory of Communication
  • Mathematical Theories of Communication: Old and New

Transcription

Voiceover: Shannon had just finished developing his theories related to cryptography and was therefore well aware that human communication is a mix of randomness and statistical dependencies: letters in our messages obviously depend, to some extent, on the letters that precede them. In 1948 he published a groundbreaking paper, "A Mathematical Theory of Communication". In it, he uses Markov models as the basis for how we can think about communication.

He starts with a toy example. Imagine you encounter a bunch of text written in an alphabet of A, B, and C. Perhaps you know nothing about this language, though you notice that As seem to clump together while Bs and Cs do not. He then shows that you could design a machine to generate similar-looking text using a Markov chain. He starts off with a zeroth-order approximation, which means we just independently select each symbol A, B, or C at random and form a sequence. However, notice that this sequence doesn't look like the original. He then shows that you could do a bit better with a first-order approximation, where the letters are still chosen independently, but according to the probability of each letter in the original sequence. This is slightly better, as As are now more likely, but it still doesn't capture much structure.

The next step is key. A second-order approximation takes into account each pair of letters which can occur. In this case we need three states: the first state represents all pairs which begin with A, the second all pairs that begin with B, and the third all pairs that begin with C. Notice now that the A cup has many AA pairs, which makes sense, since the conditional probability of an A after an A is higher in our original message. We can generate a sequence using this second-order model easily, as follows: we start anywhere and pick a tile, write down the first letter as our output, and move to the cup defined by the second letter. Then we pick a new tile and repeat this process indefinitely. Notice that this sequence is starting to look very similar to the original message, because this model is capturing the conditional dependencies between letters. If we wanted to do even better, we could move to a third-order approximation, which takes into account groups of three letters, or "trigrams". In this case we would need nine states.

Next, Shannon applies this exact same logic to actual English text, using statistics that were known for letters, pairs, trigrams, and so on. He shows the same progression from zeroth-order random letters to first-order, second-order and third-order sequences. He then tries the same thing using words instead of letters, and he writes that "the resemblance to ordinary English text increases quite noticeably" at each step. Indeed, these machines were producing meaningless text, yet it contained approximately the same statistical structure you would see in actual English. Shannon then proceeds to define a quantitative measure of information, as he realizes that the amount of information in some message must be tied up in the design of the machine which could be used to generate similar-looking sequences. This brings us to his concept of entropy.
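The progression described in the transcript can be sketched in a few lines of Python. This is only a minimal illustration, not code from Shannon's paper or from the video: the sample message and the 30-symbol output length are made-up choices, and the "cups and tiles" become an ordinary table of transition counts.

```python
import random
from collections import Counter, defaultdict

message = "AABACAABBAAACAABAABCAABAAB"  # hypothetical sample text over the alphabet A, B, C

# Zeroth-order approximation: each symbol chosen uniformly at random, independent of context.
symbols = sorted(set(message))
zeroth = "".join(random.choice(symbols) for _ in range(30))

# First-order approximation: symbols still independent, but drawn with their observed frequencies.
freq = Counter(message)
first = "".join(random.choices(list(freq), weights=list(freq.values()), k=30))

# Second-order approximation: a Markov chain, where the next symbol depends on the current one.
# transitions[x][y] counts how often y follows x in the original message (the tiles in cup x).
transitions = defaultdict(Counter)
for a, b in zip(message, message[1:]):
    transitions[a][b] += 1

state = message[0]
second = state
for _ in range(29):
    counts = transitions[state]
    state = random.choices(list(counts), weights=list(counts.values()))[0]
    second += state

print("zeroth :", zeroth)
print("first  :", first)
print("second :", second)
```

Running this a few times shows the second-order output clumping As roughly the way the original message does, which is exactly the effect the transcript attributes to conditional dependencies between letters.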

Publication

The article was the founding work of the field of information theory. It was later published in 1949 as a book of the same name, The Mathematical Theory of Communication (ISBN 0-252-72546-8), which was reissued as a paperback in 1963 (ISBN 0-252-72548-4). The book contains an additional article by Warren Weaver that provides an overview of the theory for a more general audience.

Contents

Shannon's diagram of a general communications system, showing the process by which a message sent becomes the message received (possibly corrupted by noise)

This work is known for introducing the concepts of channel capacity as well as the noisy channel coding theorem.
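As a brief illustration of what these two concepts quantify (the formulas below are the standard textbook definitions, not reproduced verbatim from the article), the capacity of a channel is the largest mutual information between input and output over all input distributions; for the binary symmetric channel with crossover probability p it has the familiar closed form:

```latex
C = \max_{p(x)} I(X;Y) = \max_{p(x)} \bigl[\, H(Y) - H(Y \mid X) \,\bigr],
\qquad
C_{\text{BSC}} = 1 - H_b(p), \qquad
H_b(p) = -p \log_2 p - (1-p)\log_2(1-p).
```

In these terms, the noisy-channel coding theorem says that reliable communication (arbitrarily small error probability) is possible at any rate below C and impossible at any rate above it.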

Shannon's article laid out the basic elements of communication (a toy end-to-end sketch follows the list):

  • An information source that produces a message
  • A transmitter that operates on the message to create a signal which can be sent through a channel
  • A channel, which is the medium over which the signal, carrying the information that composes the message, is sent
  • A receiver, which transforms the signal back into the message intended for delivery
  • A destination, which can be a person or a machine, for whom or which the message is intended
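The five elements can be read as a pipeline, which the following hypothetical Python sketch makes concrete; the message, the ASCII encoding, and the 1% bit-flip noise rate are illustrative choices and are not taken from the paper.

```python
import random

def information_source():
    """Information source: produces the message to be sent."""
    return "HELLO"

def transmitter(message):
    """Transmitter: operates on the message to produce a signal (here, a list of bits)."""
    return [int(b) for ch in message.encode("ascii") for b in format(ch, "08b")]

def channel(signal, noise=0.01):
    """Channel: the medium; here each bit is flipped with a small probability."""
    return [bit ^ 1 if random.random() < noise else bit for bit in signal]

def receiver(signal):
    """Receiver: transforms the (possibly corrupted) signal back into a message."""
    bytes_ = [signal[i:i + 8] for i in range(0, len(signal), 8)]
    return "".join(chr(int("".join(map(str, byte)), 2)) for byte in bytes_)

def destination(message):
    """Destination: the person or machine for whom the message is intended."""
    print("received:", message)

destination(receiver(channel(transmitter(information_source()))))
```

With zero noise the destination prints the original message; raising the noise rate shows the corruption that a receiver without any error-correcting code cannot undo, which is the problem the noisy-channel coding theorem addresses.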

It also developed the concepts of information entropy and redundancy, and introduced the term bit (which Shannon credited to John Tukey) as a unit of information. The paper likewise proposed the Shannon–Fano coding technique, developed in conjunction with Robert Fano.
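To make the entropy and coding ideas concrete, here is a small Python sketch: it computes the entropy of a hypothetical symbol distribution and builds a prefix code using a common greedy variant of the Shannon–Fano split (sort symbols by probability, then recursively divide them into two groups of roughly equal total probability). It is an illustration under those assumptions, not the procedure as laid out in the paper.

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def shannon_fano(symbols):
    """symbols: list of (symbol, probability) pairs sorted by descending probability.
    Returns a dict mapping each symbol to its binary code string."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    acc, split = 0.0, 1
    for i, (_, p) in enumerate(symbols[:-1], start=1):
        acc += p
        if acc >= total / 2:          # first prefix reaching half the probability mass
            split = i
            break
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:split]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[split:]).items()})
    return codes

text = "AABABCAABAA"                   # hypothetical sample message
counts = Counter(text)
probs = [(s, c / len(text)) for s, c in counts.most_common()]
code = shannon_fano(probs)
avg_len = sum(p * len(code[s]) for s, p in probs)
print(f"entropy = {entropy(p for _, p in probs):.3f} bits/symbol")
print(f"average code length = {avg_len:.3f} bits/symbol")
print(code)
```

Comparing the average code length with the entropy shows the usual relationship: the code cannot beat the entropy, and it approaches it as the distribution becomes more skewed or as symbols are coded in blocks.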

References

  1. ^ Shannon, Claude Elwood (July 1948). "A Mathematical Theory of Communication" (PDF). Bell System Technical Journal. 27 (3): 379–423. doi:10.1002/j.1538-7305.1948.tb01338.x. hdl:11858/00-001M-0000-002C-4314-2. Archived from the original (PDF) on 1998-07-15. The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey.
  2. ^ Shannon, Claude Elwood (October 1948). "A Mathematical Theory of Communication". Bell System Technical Journal. 27 (4): 623–656. doi:10.1002/j.1538-7305.1948.tb00917.x. hdl:11858/00-001M-0000-002C-4314-2.
  3. ^ Ash, Robert B. (1966). Information Theory: Tracts in Pure & Applied Mathematics. New York: John Wiley & Sons Inc. ISBN 0-470-03445-9.
  4. ^ Yeung, Raymond W. (2008). "The Science of Information". Information Theory and Network Coding. Springer. pp. 1–4. doi:10.1007/978-0-387-79234-7_1. ISBN 978-0-387-79233-0.
  5. ^ Shannon, Claude Elwood; Weaver, Warren (1949). The Mathematical Theory of Communication (PDF). University of Illinois Press. ISBN 0-252-72548-4. Archived from the original (PDF) on 1998-07-15.
  6. ^ Goodman, Rob; Soni, Jimmy (2018). "Genius in Training". Alumni Association of the University of Michigan. Retrieved 2023-10-31.
