A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error-correcting codes and related subjects.
1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σpᵢ log pᵢ for the entropy of a single gas particle
1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the whole system
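The entropy formula in the two entries above reduces to a few lines of code. A minimal sketch (the function name and examples are illustrative, not from the source):

```python
import math

def entropy(probs, base=2):
    """H = -sum(p * log p) over a probability distribution.
    With base=2 the result is in bits (shannons)."""
    return sum(-p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin carries 1.0 bit of entropy
print(entropy([1.0]))       # a certain outcome carries 0.0 bits
```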
1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system
1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics
1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning)
1929 – Leó Szilárd analyses Maxwell's Demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work
1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process
1944 – Claude Shannon's theory of information is substantially complete
1947 – Richard W. Hamming invents Hamming codes for error detection and correction (to protect patent rights, the result is not published until 1950)
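Hamming's idea can be sketched with the (7,4) code: four data bits gain three parity bits at the power-of-two positions, and recomputing the parity checks yields a syndrome that points directly at a single flipped bit. The bit layout and function names below are an illustrative sketch, not Hamming's original notation:

```python
def hamming74_encode(data):
    """Encode 4 data bits as a 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(codeword):
    """Recompute the parity checks; the syndrome, read as a binary number,
    is the 1-based position of a single flipped bit (0 = no error)."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1
    return c
```

Flipping any single bit of a codeword and running `hamming74_correct` recovers the original codeword.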
1948 – Claude E. Shannon publishes A Mathematical Theory of Communication
1949 – Claude E. Shannon publishes Communication in the Presence of Noise – Nyquist–Shannon sampling theorem and Shannon–Hartley law
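The Shannon–Hartley law bounds channel capacity as C = B log₂(1 + S/N). A quick illustrative calculation (the 3 kHz / 30 dB telephone-channel figures are assumptions for the example, not from the source):

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# 3 kHz of bandwidth at 30 dB SNR (S/N = 1000):
print(shannon_hartley_capacity(3000, 1000))  # ≈ 29900 bit/s
```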
1949 – Claude E. Shannon's Communication Theory of Secrecy Systems is declassified
1949 – Robert M. Fano publishes Transmission of Information (M.I.T. Press, Cambridge, Massachusetts) – Shannon–Fano coding
1949 – Leon G. Kraft discovers Kraft's inequality, which shows the limits of prefix codes
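Kraft's inequality states that codeword lengths l₁, …, lₙ are achievable by some prefix code over an alphabet of size r if and only if Σ r^(−lᵢ) ≤ 1. Checking it is direct (the function name is illustrative):

```python
def kraft_sum(lengths, alphabet_size=2):
    """Sum of r^(-l) over the codeword lengths; a prefix code with
    these lengths exists iff the sum is <= 1."""
    return sum(alphabet_size ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> a complete binary prefix code exists
print(kraft_sum([1, 1, 2]))     # 1.25 -> no such prefix code is possible
```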
1949 – Marcel J. E. Golay introduces Golay codes for forward error correction
1951 – Solomon Kullback and Richard Leibler introduce the Kullback–Leibler divergence
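The Kullback–Leibler divergence D(P‖Q) = Σ pᵢ log(pᵢ/qᵢ) is a one-liner; the distributions in the example are illustrative:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# How far a fair coin is from a 3/4-biased one (asymmetric, >= 0):
print(kl_divergence([0.5, 0.5], [0.75, 0.25]))  # ≈ 0.2075 bits
```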
1951 – David A. Huffman invents Huffman encoding, a method of finding optimal prefix codes for lossless data compression
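The greedy idea behind Huffman coding fits in a few lines: repeatedly merge the two least frequent subtrees, prefixing their codewords with 0 and 1. This is an illustrative sketch, not Huffman's original presentation:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build an optimal prefix code for the symbols in `text`."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_code("abracadabra")
```

For "abracadabra" the most frequent symbol `a` receives a 1-bit codeword, rarer symbols receive longer ones, and the resulting code is prefix-free.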
1953 – August Albert Sardinas and George W. Patterson devise the Sardinas–Patterson algorithm, a procedure to decide whether a given variable-length code is uniquely decodable
1954 – Irving S. Reed and David E. Muller propose Reed–Muller codes
1955 – Peter Elias introduces convolutional codes
1957 – Eugene Prange first discusses cyclic codes
1959 – Alexis Hocquenghem, and independently the next year Raj Chandra Bose and Dwijendra Kumar Ray-Chaudhuri, discover BCH codes
1960 – Irving S. Reed and Gustave Solomon propose Reed–Solomon codes
1962 – Robert G. Gallager proposes low-density parity-check codes; they are unused for 30 years due to technical limitations
1965 – Dave Forney discusses concatenated codes
1966 – Fumitada Itakura (Nagoya University) and Shuzo Saito (Nippon Telegraph and Telephone) develop linear predictive coding (LPC), a form of speech coding[1]
1967 – Andrew Viterbi reveals the Viterbi algorithm, making decoding of convolutional codes practicable
1968 – Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year
1968 – Chris Wallace and David M. Boulton publish the first of many papers on Minimum Message Length (MML) statistical and inductive inference
1970 – Valerii Denisovich Goppa introduces Goppa codes
1972 – Jørn Justesen proposes Justesen codes, an improvement of Reed–Solomon codes
1972 – Nasir Ahmed proposes the discrete cosine transform (DCT), which he develops with T. Natarajan and K. R. Rao in 1973;[2] the DCT later becomes the most widely used lossy compression algorithm, the basis for multimedia formats such as JPEG, MPEG and MP3
1973 – David Slepian and Jack Wolf discover and prove the Slepian–Wolf coding limits for distributed source coding[3]
1976 – Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to raising analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s
1976 – Richard Pasco and Jorma J. Rissanen develop effective arithmetic coding techniques
1977 – Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77)
1982 – Valerii Denisovich Goppa introduces algebraic geometry codes
1989 – Phil Katz publishes the .zip format, including DEFLATE (LZ77 + Huffman coding); it later becomes the most widely used archive container
1993 – Claude Berrou , Alain Glavieux and Punya Thitimajshima introduce Turbo codes
1994 – Michael Burrows and David Wheeler publish the Burrows–Wheeler transform, later to find use in bzip2
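The transform itself is short to state: sort all rotations of the input (terminated by a sentinel assumed absent from the text) and read off the last column, which groups similar contexts together and makes the data far more compressible. A naive sketch:

```python
def bwt(s, eos="$"):
    """Burrows-Wheeler transform: last column of the sorted rotation
    matrix of s + eos. Assumes `eos` does not occur in `s`."""
    s = s + eos
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(row[-1] for row in rotations)

print(bwt("banana"))  # "annb$aa" — the three a's end up adjacent
```

Production implementations (as in bzip2) avoid materialising all rotations by using suffix-array techniques.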
1995 – Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem
2003 – David J. C. MacKay shows the connection between information theory, inference and machine learning in his book Information Theory, Inference, and Learning Algorithms
2006 – Jarosław Duda introduces asymmetric numeral systems (ANS) entropy coding; since 2014 it has become a popular replacement for Huffman and arithmetic coding in compressors such as Facebook's Zstandard, Apple's LZFSE, CRAM and JPEG XL
2008 – Erdal Arıkan introduces polar codes, the first practical construction of codes that achieves capacity for a wide array of channels
References
This page was last edited on 4 February 2024, at 12:39