Bivariate von Mises distribution


Samples from the cosine variant of the bivariate von Mises distribution. The green points are sampled from a distribution with high concentration and no correlation, the blue points from a distribution with high concentration and negative correlation, and the red points from a distribution with low concentration and no correlation.

In probability theory and statistics, the bivariate von Mises distribution is a probability distribution describing values on a torus. It may be thought of as an analogue on the torus of the bivariate normal distribution. The distribution belongs to the field of directional statistics. The general bivariate von Mises distribution was first proposed by Kanti Mardia in 1975.[1][2] One of its variants is today used in the field of bioinformatics to formulate a probabilistic model of protein structure in atomic detail,[3][4] such as backbone-dependent rotamer libraries.

YouTube Encyclopedic

  • A Mixed Distribution Example
  • Conducting a Shapiro-Wilk Normality Test in SPSS

Transcription

In this video, we'll look at an example in which we compute the expectation and cumulative distribution function of a mixed random variable. The problem is as follows. Al arrives at a taxi and bus stand at a given time, say time t equals 0. He finds a taxi waiting for him with probability 2/3, in which case he takes it. Otherwise, he takes the next arriving taxi or bus. The time at which the next taxi arrives is uniformly distributed between 0 and 10 minutes. The next bus leaves in exactly 5 minutes. So the question is: if X is Al's waiting time, what is the CDF and expectation of X?

One convenient way to view this problem is as a tree structure. I've drawn it for you here, in which the events of interest are B1, B2, and B3: B1 being that Al catches the waiting taxi, B2 being that Al catches the next taxi, which arrives between 0 and 5 minutes, and B3 being that Al catches the bus at time t plus 5. Notice that these three events are disjoint; Al catching the waiting taxi means he can't catch the bus or the next arriving taxi. Together they also cover the entire set of outcomes. So, in fact, B1, B2, and B3 form a partition.

So let's look at the relevant probabilities. Whether or not B1 happens depends on whether or not a taxi is waiting for Al. If the taxi is waiting for him, which happens with probability 2/3, B1 happens. Otherwise, with probability 1/3, we see whether or not a taxi is going to arrive between 0 and 5 minutes. If it arrives, that happens with what probability? Well, we know that the next taxi arrives uniformly between 0 and 10 minutes, so half the mass is between 0 and 5 and the other half is between 5 and 10. And so these two branches are going to be 1/2 and 1/2.

Now let's look at what X looks like. If B1 happens, Al isn't waiting at all, so X is going to be equal to 0. If B3 happens, which is the other easy case, Al is going to wait for exactly 5 minutes. And if B2 happens, X is going to be some value between 0 and 5. We can actually draw the conditional density, so let's do that here. The original next-taxi arrival time was uniformly distributed between 0 and 10. But now we're told two pieces of information: B2 happens, which means that there is no taxi waiting, and the next taxi arrives between 0 and 5 minutes. The fact that there was no taxi waiting has no bearing on that density. But the fact that the next taxi arrives between 0 and 5 does make a difference, because the density is then definitely 0 in any region outside 0 and 5. Now, how does it look between 0 and 5? It's simply going to be a scaled version of the original density between 0 and 5. You can verify this by looking at the actual formula for conditioning a random variable on an event. Here, the density is going to be 1/5 in order for it to integrate to 1.

And now we can jump right into figuring out the expectation. Notice that X is actually a mixed random variable. What does that mean? Well, X takes on values according to either a discrete probability law or a continuous one. If B1 happens, for example, X is going to be exactly equal to 0 with probability 1, which is a discrete probability law. On the other hand, if B2 happens, then the value of X depends on the density, so X is going to be a continuous random variable there.
So how do you define an expectation in this case? Well, you can define it so that it satisfies the total expectation theorem, which means that the expectation of X is the probability of B1 times the expectation given B1, plus the probability of B2 times the expectation given B2, plus the probability of B3 times the expectation given B3. So this will satisfy the total expectation theorem. The probability of B1 is exactly 2/3; it's simply the probability of a taxi waiting for Al. The expected value of X given B1: well, when B1 happens, X is exactly equal to 0, so the expected value is also 0. The probability of B2 happening is the probability of a taxi not being there times the probability of a taxi arriving between 0 and 5, which is 1/3 times 1/2. And the expected value of X given B2 is the expected value of this density, which is the midpoint between 0 and 5, so it's 5/2. The probability of B3 is 1/3 times 1/2 as well. Finally, the expected value of X given B3: when B3 happens, X is exactly equal to 5, so the expected value is also 5. We're left with 5/12 plus 5/6, which is 15/12. And we can fill that in here, so that we can clear the board to do the other part.

Now we want to compute the CDF of X. What is the CDF? The CDF of X is the probability that the random variable X is less than or equal to some little x. It's a constant [INAUDIBLE]. Before we jump right in, let's try to understand the form of the CDF, and let's consider some interesting cases. You know that the random variable X, the waiting time, is going to be somewhere between 0 and 5, right? So let's consider what happens if little x is less than 0. That's asking: what's the probability of the random variable X being less than some number that's less than 0? Waiting time can't be negative, so this probability is 0. Now, what if little x is greater than or equal to 0 and strictly less than 5? In that case, either X can fall between 0 and 5, in the case of B2, or X can be exactly equal to 0. It's not immediately clear, so let's fill that in later. What about if little x is greater than or equal to 5? That's the probability that the random variable X is less than some number that's greater than or equal to 5. The waiting time X is definitely going to be less than or equal to 5, so this probability is 1.

So now this middle case. How do we do it? Let's try to use a similar approach to the one we used for the expected value, and use the total probability theorem with the partition B1, B2, and B3. So let's assume that little x is between 0 and 5, including 0. What's the probability of B1? It's the probability that Al catches the waiting taxi, which happens with probability 2/3. What's the probability that the random variable X is less than or equal to little x when B1 happens? Well, if B1 happens, then the random variable X is exactly equal to 0, so it's definitely going to be less than or equal to any such value of x, including 0. So that probability is 1. What's the probability that B2 happens now?
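For reference, the expectation computed in the transcript can be written compactly using the total expectation theorem stated above:

$$\mathbb{E}[X] = P(B_1)\,\mathbb{E}[X \mid B_1] + P(B_2)\,\mathbb{E}[X \mid B_2] + P(B_3)\,\mathbb{E}[X \mid B_3] = \tfrac{2}{3}\cdot 0 + \tfrac{1}{3}\cdot\tfrac{1}{2}\cdot\tfrac{5}{2} + \tfrac{1}{3}\cdot\tfrac{1}{2}\cdot 5 = \tfrac{5}{12} + \tfrac{5}{6} = \tfrac{15}{12} = \tfrac{5}{4}.$$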
The probability that B2 happens is 1/3 times 1/2, as before, times the probability that the random variable X is less than or equal to little x when B2 happens. Well, if B2 happens, this is your density, and this is our condition, so little x is going to be somewhere in between these points. We'd like to compute the probability that the random variable X is less than or equal to little x, so we want this area. That area has a height of 1/5 and a width of x, so the area is going to be 1/5 times x. And finally, the probability that B3 happens is again 1/3 times 1/2, times the probability that the random variable X is less than or equal to little x given B3. Well, when B3 happens, the random variable X is exactly 5. But little x, we're assuming in this case, is between 0 and 5 and strictly less than 5. So if the random variable X is 5 and little x is strictly less than 5, there's no way this event can happen, and so that probability is 0.

So we're now left with 2/3 plus (1/30)x, and we can fill this in: the CDF on this interval is 2/3 plus (1/30)x. So now we've finished the problem. We computed the expected value here and the CDF here, and this was a great illustration of how you would do so for a mixed random variable.
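As a quick numerical sanity check of the transcript's answers, here is a small Python sketch (the function name and sample size are illustrative) that simulates Al's waiting time directly from the story and compares the empirical mean and CDF value against E[X] = 5/4 and F(x) = 2/3 + x/30:

```python
import random

def simulate_waiting_time():
    """One draw of Al's waiting time X, following the tree in the transcript."""
    if random.random() < 2 / 3:            # a taxi is already waiting (event B1)
        return 0.0
    next_taxi = random.uniform(0, 10)       # otherwise the next taxi arrives Uniform(0, 10)
    return min(next_taxi, 5.0)              # Al takes whichever comes first: that taxi or the bus at t = 5

samples = [simulate_waiting_time() for _ in range(200_000)]
print(sum(samples) / len(samples))                  # should be close to 15/12 = 1.25
x = 3.0
print(sum(s <= x for s in samples) / len(samples))  # should be close to 2/3 + 3/30 ≈ 0.767
```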

Definition

The bivariate von Mises distribution is a probability distribution defined on the torus, $S^1 \times S^1$, in $\mathbb{R}^3$. The probability density function of the general bivariate von Mises distribution for the angles $\phi, \psi \in [0, 2\pi)$ is given by[1]

$$f(\phi, \psi) \propto \exp\left[\kappa_1 \cos(\phi - \mu) + \kappa_2 \cos(\psi - \nu) + (\cos(\phi - \mu),\ \sin(\phi - \mu))\,\mathbf{A}\,(\cos(\psi - \nu),\ \sin(\psi - \nu))^{\mathsf{T}}\right],$$

where $\mu$ and $\nu$ are the means for $\phi$ and $\psi$, $\kappa_1$ and $\kappa_2$ their concentration, and the matrix $\mathbf{A} \in \mathbb{M}(2,2)$ is related to their correlation.
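As an illustration of the general form above, here is a minimal Python sketch of the unnormalized density; the function and parameter names are my own, and the normalization constant is omitted.

```python
import numpy as np

def general_bvm_unnormalized(phi, psi, mu, nu, kappa1, kappa2, A):
    """Unnormalized general bivariate von Mises density at the angle pair (phi, psi).

    A is the 2x2 matrix coupling the two angles; with A = 0 the density
    factorizes into two independent von Mises densities.
    """
    u = np.array([np.cos(phi - mu), np.sin(phi - mu)])
    v = np.array([np.cos(psi - nu), np.sin(psi - nu)])
    return np.exp(kappa1 * np.cos(phi - mu) + kappa2 * np.cos(psi - nu) + u @ A @ v)

# Example: moderate concentration, coupling only through the sin-sin entry of A
A = np.array([[0.0, 0.0],
              [0.0, 1.5]])
print(general_bvm_unnormalized(0.3, -0.2, 0.0, 0.0, 2.0, 2.0, A))
```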

Two commonly used variants of the bivariate von Mises distribution are the sine and cosine variant.

The cosine variant of the bivariate von Mises distribution[3] has the probability density function

$$f(\phi, \psi) = Z_c(\kappa_1, \kappa_2, \kappa_3)\,\exp\left[\kappa_1 \cos(\phi - \mu) + \kappa_2 \cos(\psi - \nu) - \kappa_3 \cos(\phi - \mu - \psi + \nu)\right],$$

where $\mu$ and $\nu$ are the means for $\phi$ and $\psi$, $\kappa_1$ and $\kappa_2$ their concentration, and $\kappa_3$ is related to their correlation. $Z_c$ is the normalization constant. This distribution with $\kappa_3 = 0$ has been used for kernel density estimates of the distribution of the protein dihedral angles $\phi$ and $\psi$.[4]
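To make the $\kappa_3 = 0$ case concrete: with $\kappa_3 = 0$ the cosine-variant density factorizes into a product of two independent von Mises densities, and a kernel density estimate places one such product kernel at each observed angle pair. The Python sketch below illustrates this idea; the function names and the single shared bandwidth parameter are illustrative assumptions, not the exact procedure of [4].

```python
import numpy as np
from scipy.special import i0  # modified Bessel function I_0, used in the von Mises normalization

def product_von_mises_kernel(phi, psi, mu, nu, kappa1, kappa2):
    """Cosine-variant density with kappa3 = 0: a product of two von Mises densities."""
    norm = (2 * np.pi * i0(kappa1)) * (2 * np.pi * i0(kappa2))
    return np.exp(kappa1 * np.cos(phi - mu) + kappa2 * np.cos(psi - nu)) / norm

def dihedral_kde(phi, psi, data_phi, data_psi, kappa=50.0):
    """Kernel density estimate at (phi, psi) from observed dihedral-angle pairs."""
    kernels = product_von_mises_kernel(phi, psi, np.asarray(data_phi), np.asarray(data_psi), kappa, kappa)
    return float(np.mean(kernels))
```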

The sine variant has the probability density function[5]

$$f(\phi, \psi) = Z_s(\kappa_1, \kappa_2, \kappa_3)\,\exp\left[\kappa_1 \cos(\phi - \mu) + \kappa_2 \cos(\psi - \nu) + \kappa_3 \sin(\phi - \mu) \sin(\psi - \nu)\right],$$

where the parameters have the same interpretation.
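The figure at the top of the article shows samples drawn from the cosine variant; for illustration, here is a minimal Python sketch that draws samples from the sine variant with a naive rejection sampler (proposing uniformly on the torus). The function names and the crude acceptance bound are my own choices, and the sampler becomes inefficient for large concentrations.

```python
import numpy as np

rng = np.random.default_rng(0)

def sine_bvm_unnormalized(phi, psi, mu, nu, kappa1, kappa2, kappa3):
    """Unnormalized sine-variant density at (phi, psi)."""
    return np.exp(kappa1 * np.cos(phi - mu) + kappa2 * np.cos(psi - nu)
                  + kappa3 * np.sin(phi - mu) * np.sin(psi - nu))

def sample_sine_bvm(n, mu, nu, kappa1, kappa2, kappa3):
    """Naive rejection sampler: propose uniformly on the torus, accept against an upper bound."""
    bound = np.exp(kappa1 + kappa2 + abs(kappa3))  # crude bound on the unnormalized density
    samples = []
    while len(samples) < n:
        phi, psi = rng.uniform(-np.pi, np.pi, size=2)
        if rng.uniform() * bound < sine_bvm_unnormalized(phi, psi, mu, nu, kappa1, kappa2, kappa3):
            samples.append((phi, psi))
    return np.array(samples)

print(sample_sine_bvm(5, 0.0, 0.0, 2.0, 2.0, 1.0))
```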

References

  1. ^ a b Mardia, Kanti (1975). "Statistics of directional data". J. R. Stat. Soc. B. 37 (3): 349–393. JSTOR 2984782.
  2. ^ Mardia, K. V.; Frellsen, J. (2012). "Statistics of Bivariate von Mises Distributions". Bayesian Methods in Structural Bioinformatics. Statistics for Biology and Health. pp. 159. doi:10.1007/978-3-642-27225-7_6. ISBN 978-3-642-27224-0.
  3. ^ a b Boomsma, W.; Mardia, K. V.; Taylor, C. C.; Ferkinghoff-Borg, J.; Krogh, A.; Hamelryck, T. (2008). "A generative, probabilistic model of local protein structure". Proceedings of the National Academy of Sciences. 105 (26): 8932–7. Bibcode:2008PNAS..105.8932B. doi:10.1073/pnas.0801715105. PMC 2440424. PMID 18579771.
  4. ^ a b Shapovalov, M. V.; Dunbrack, R. L. (2011). "A smoothed backbone-dependent rotamer library for proteins derived from adaptive kernel density estimates and regressions". Structure. 19 (6): 844–858. doi:10.1016/j.str.2011.03.019. PMC 3118414. PMID 21645855.
  5. ^ Singh, H. (2002). "Probabilistic model for two dependent circular variables". Biometrika. 89 (3): 719–723. doi:10.1093/biomet/89.3.719.