List of unsolved problems in information theory

This article lists notable unsolved problems in information theory, divided into channel coding and source coding. There are also related unsolved problems[1] in philosophy.

YouTube Encyclopedic

  • Information Theory part 9: What is a bit?
  • Information Theory - Measure of Information [Lecture 1] | Coding Theory
  • Stanford Seminar - Information Theory of Deep Learning, Naftali Tishby

Transcription

Consider the following: Alice and Bob have figured out how to transmit messages between their treehouses. At first, they used flames at night and shutters during the day. Then they used a wire, which they plucked in different ways. Eventually, they electrified this wire to send electrical pulses, and were now at work on an experimental wireless method. The problem was that, in order to pay for their equipment, they needed money. So they decided to offer their service, for a fee, to others. On the first day, Alice had three new customers who wanted to transmit messages to their friends over at Bob's treehouse. The first customer wanted to send a list of 10 coin flips. The second customer wanted to send a 6-letter word. And the third customer wanted to send a poker hand. The question now is, "How much should she charge?"

Well, the price of a message should depend on how long it takes Alice to transmit it. But how could she measure the length of different types of messages using a common unit? To find out, let's play a game. Imagine you are Bob now. You know Alice wants to send you these messages, but all you can do is get the answer to yes-or-no questions you've arranged. Alice will answer by sending a sequence of zeros and ones using some method of variation. Recall that all of their methods of transmission involve the exchange of differences. So, a 1 could be represented by an open flame, or an open shutter, or an electrical pulse. No matter how they are manifested, we can simply call them 'binary digits', because a binary digit can have only one of two values: 0 or 1. So, let's say 0 represents a 'no' and 1 represents a 'yes'. Your challenge, now, is to always ask the minimum number of questions in order to determine the exact message.

First, let's consider the coin flips. For each symbol, the sender, Alice, can be thought of as selecting one of two different symbols: 'heads' or 'tails'. How many questions do you need to ask to determine which she selected? One question, such as "Is it heads?", will suffice. For 10 flips, what is the minimum number of questions? Well, 10 flips times one question per flip equals 10 questions, or 10 binary digits, to transmit this message.

Next, let's consider the letters. For each symbol, the sender, Alice, can be thought of as selecting 1 of 26 different symbols. Let's start with the simplest message, which is 1 letter. How many questions are needed? We could ask, "Is it A? Is it B? Is it C?" and so on, but that is not the minimum number of questions. The best you can do is ask questions which eliminate half of the possibilities. For example, the middle of the alphabet is between M and N, so we could first ask, "Is it less than N?" If we receive a 1 (yes), we cut out half of the possibilities, leaving 13. Since we can't split a letter in half, we divide the possible symbols into sets of 6 and 7 and ask, "Is it less than G?" We receive a 1 (yes), and now we are left with 6 possible letters. We can split them in half and ask, "Is it less than D?" We receive a 0 (no), leaving us with three possible letters. Now we can pick a side and ask, "Is it D?" We receive a 0 (no). Finally, we are left with two possibilities. We ask, "Is it E?" We receive a no, and after five questions we have correctly identified the symbol: F. Realize that we will never need to ask more than five questions, so the number of questions will be at least 4 and at most 5.
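
The question game above is exactly a binary search over the alphabet. Below is a minimal sketch in Python, assuming uppercase letters and the halving strategy from the transcript; the function name questions_needed is illustrative rather than from any source.

    import string

    def questions_needed(target, symbols=string.ascii_uppercase):
        # Count the yes/no questions Bob must ask to pin down `target`,
        # halving the remaining candidates with each answer.
        candidates = list(symbols)
        questions = 0
        while len(candidates) > 1:
            mid = len(candidates) // 2          # split as evenly as possible
            questions += 1                      # one question = one binary digit
            if target in candidates[:mid]:      # "Is it in the lower half?" -> 1 (yes)
                candidates = candidates[:mid]
            else:                               # -> 0 (no)
                candidates = candidates[mid:]
        return questions

    print(questions_needed("F"))                                     # 5, as in the walkthrough
    print(min(questions_needed(c) for c in string.ascii_uppercase))  # 4
    print(max(questions_needed(c) for c in string.ascii_uppercase))  # 5
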
And in general, 2 to the power of the number of questions equals the number of possible messages, which we previously defined as the 'message space'. So how can we calculate the exact average, or expected, number of questions, given a message space of 26? We ask the reverse question: 2 to the power of what equals 26? To answer this type of question, we naturally use a logarithmic function, base 2, because log base 2 of 26 is the exponent 2 needs to be raised to in order to give us 26, which is approximately 4.7. So, on average, approximately 4.7 questions will be needed per letter, at minimum. And since she wants to transmit a word with 6 letters, Bob can expect to ask, at minimum, 28.2 questions, which means Alice will need to send at most 29 binary digits.

Finally, let's apply this formula to a new message: the poker hand. For each symbol, the sender, Alice, can be thought of as selecting 1 of 52 different symbols. In this case, the number of questions is the same as the number of times we need to split the deck and ask Alice which pile it is in, until we are left with one card, which we will find is usually 6 splits, or questions, and sometimes 5. But we can save time and just use our equation: log base 2 of 52 is approximately 5.7, since 2 to the power of 5.7 is approximately 52. So the minimum number of questions, on average, is 5.7 per card. A poker hand contains five cards, so transmitting a poker hand requires 28.5 questions, on average.

We are done. We now have our unit. It's based on the minimum number of questions to define the message, or the 'height' of the decision tree. And since Alice transmits this information as binary digits, we can shorten this term and call our unit the 'bit', instead of 'binary digit'. So the 10 coin flips require 10 bits, the 6-letter word requires 28.2 bits, and the poker hand requires 28.5 bits. Alice then decides to charge one penny per bit and begins collecting her fees.

Now, this idea emerged in the 1920s. It was one of the more abstract problems that communication engineers were thinking about. Ralph Hartley was a prolific electronics researcher who built on the ideas of Harry Nyquist; both worked at Bell Labs after World War I. In 1928, Hartley published an important paper titled "Transmission of Information", and in it he defines the word 'information' using the symbol h, as h = n × log(s), where h is our information, n is the number of symbols (whether they're notes, letters, numbers, etc.) and s is the number of different symbols available at each selection. This can also be written as h = log(s^n). And Hartley writes, "What we have done, then, is to take as our practical measure of information the logarithm of the number of possible symbol sequences." So, information is the logarithm of the message space.

However, realize that throughout this lesson we have been assuming that the symbol selection is random, a convenient simplification. We know that in reality most communication, such as speech, isn't always random; it's a subtle mix of predictability and surprise. We do not roll dice when we write letters. And it is this predictability which can result in significant savings in the length of transmission, because when we can predict things in advance, we shouldn't need to ask as many yes-or-no questions to define the message. But how could we formally model this subtle difference? This question brings us to the key insight in our story.
Can you think of what it might be?
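
Hartley's measure from the transcript, h = n × log2(s), can be checked against all three messages directly. Here is a minimal sketch in Python; the function name hartley_bits is an illustrative choice.

    from math import log2

    def hartley_bits(n, s):
        # Hartley's h = n * log2(s): a message of n symbols, each chosen
        # from an alphabet of s equally likely possibilities.
        return n * log2(s)

    print(hartley_bits(10, 2))    # 10 coin flips  -> 10.0 bits
    print(hartley_bits(6, 26))    # 6-letter word  -> ~28.2 bits
    print(hartley_bits(5, 52))    # poker hand     -> ~28.5 bits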

Channel coding

  • Capacity of a network: The capacity of a general wireless network is not known. There are some specific cases for which the capacity is known, such as the AWGN channel and the fading channel (the AWGN case is made concrete in the sketch after this list).[2]
  • Capacity of the broadcast channel: The capacity of the broadcast channel, or the case in which a single transmitter is sending information to many receivers, is unknown in general, though it is known for several specific cases.[3][4]
  • Capacity of the interference channel (two-user): The capacity of the interference channel, in the case where there are two transmitter-receiver pairs that interfere with each other, is unknown in general. The capacity is known in special cases: the strong-interference regime and the injective-deterministic channel. It is known approximately, or to within a range, for the injective semi-deterministic channel and for additive white Gaussian noise with a per-block power constraint.
  • Capacity of the two-way channel: The capacity of the two-way channel (a channel in which information is sent in both directions simultaneously) is unknown.[5][6]
  • Capacity of Aloha: The ALOHAnet used a very simple access scheme for which the capacity is still unknown, though it is known in a few special cases.[7]
  • Capacity of the queue channel: Under a FIFO policy, it is unknown for general service-time distributions whether the feedback capacity of the queue channel is strictly greater than its capacity without feedback, though the two quantities are known to be equal when the service-time distribution is memoryless.[8]
  • Quantum capacity: The capacity of a quantum channel is in general not known.[9]
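
For contrast with the open problems above, the solved AWGN case mentioned in the first item can be made concrete: its capacity is given by the Shannon-Hartley theorem, C = W × log2(1 + S/N). Below is a minimal sketch in Python; the bandwidth and signal-to-noise values are illustrative assumptions.

    from math import log2

    def awgn_capacity(bandwidth_hz, snr):
        # Shannon-Hartley theorem: capacity in bits per second of an
        # additive white Gaussian noise channel, with `snr` as a linear
        # (not dB) signal-to-noise ratio.
        return bandwidth_hz * log2(1 + snr)

    # Example: a 3 kHz telephone-grade channel at 30 dB SNR (snr = 1000).
    print(awgn_capacity(3000, 1000))  # ~29,902 bits per second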

Source coding

  • Lossy distributed source coding: The best way to compress correlated information sources using encoders that do not communicate with each other, preserving each source to within its distortion metric, is not known.

References

  1. ^ Adriaans, Pieter. "Open Problems in the Study of Information and Computation". Retrieved 21 June 2013.
  2. ^ Cover, Thomas (1991-08-26). Elements of Information Theory. Wiley-Interscience. ISBN 978-0471062592.
  3. ^ Cover, Thomas (Oct 1998). "Comments on Broadcast Channels". IEEE Trans Inf Theory. 44 (6): 2524. doi:10.1109/18.720547. S2CID 8985406.
  4. ^ Sridharan, Arvind. "Broadcast Channels" (PDF). Notre Dame. Archived from the original (PDF) on 29 August 2017. Retrieved 6 July 2014.
  5. ^ Shannon, Claude (1961). "Two-way communication channels". Proc. Fourth Berkeley Symp. on Mathematical Statistics and Probability. 1: 611.
  6. ^ Meeuwissen, Erik (16 Aug 1998). "The Origin of Two-Way Channels". Proc. ISIT. I: 185.
  7. ^ Médard, Muriel (March 2004). "Capacity of Time-Slotted ALOHA Packetized Multiple-Access Systems Over the AWGN Channel" (PDF). IEEE Transactions on Wireless Communications. 3 (2): 486–499. doi:10.1109/TWC.2003.821175. S2CID 791018. Archived from the original (PDF) on 18 December 2011. Retrieved 11 July 2014.
  8. ^ Anantharam, Venkat; Verdú, Sergio (1996). "Bits through queues". IEEE Trans Inf Theory. 42 (1): 4–18. doi:10.1109/18.481773.
  9. ^ Shor, Peter (2000). "Quantum Information Theory: Results and Open Problems" (PDF). In Alon N.; Bourgain J.; Connes A.; Gromov M.; Milman V. (eds.). Visions in Mathematics, GAFA 2000 Special Volume: Part II. Modern Birkhäuser Classics. Birkhäuser Basel. pp. 816–838. doi:10.1007/978-3-0346-0425-3_9. ISBN 978-3-0346-0425-3.
