Discrete phase-type distribution

From Wikipedia, the free encyclopedia

The discrete phase-type distribution is a probability distribution that results from a system of one or more inter-related geometric distributions occurring in sequence, or phases. The sequence in which each of the phases occurs may itself be a stochastic process. The distribution can be represented by a random variable describing the time until absorption of an absorbing Markov chain with one absorbing state. Each of the states of the Markov chain represents one of the phases.

It has a continuous-time equivalent in the phase-type distribution.
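The absorption-time description lends itself to direct simulation. The following is a minimal sketch (not from the article), using a made-up three-state chain in which states 0 and 1 are transient phases and state 2 is absorbing:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical chain: phases 0 and 1 are transient, state 2 absorbs.
    P = np.array([[0.4, 0.5, 0.1],
                  [0.0, 0.7, 0.3],
                  [0.0, 0.0, 1.0]])

    def absorption_time(start=0, absorbing=2):
        # Walk the chain until the absorbing state is first reached.
        state, steps = start, 0
        while state != absorbing:
            state = rng.choice(len(P), p=P[state])
            steps += 1
        return steps

    samples = [absorption_time() for _ in range(100_000)]
    print(np.mean(samples))  # empirical mean of the phase-type variable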

YouTube Encyclopedic

  • Overview of Some Discrete Probability Distributions (Binomial,Geometric,Hypergeometric,Poisson,NegB)
  • Thinking about shapes of distributions | Data and statistics | 6th grade | Khan Academy
  • Fluent tutorial | DPM Model - Particle tracking
  • Mod-03 Lec-10 Morphological Characterization: Particle size distributions
  • Analyzing Tracer Data for a Tank

Transcription

Let's look at a quick overview of some discrete probability distributions and their relationships. I intend this video to be used as a recap after having been introduced to these distributions, but it could possibly be used as an introductory overview. I don't do any calculations in this video, nor do I discuss how to calculate the probabilities. I simply discuss how these different distributions arise, and the relationships between them.

The Bernoulli distribution is the distribution of the number of successes on a single Bernoulli trial. In a Bernoulli trial we get either a success or a failure. It's like an answer to a yes-or-no question. A Bernoulli random variable can take on only the values 0 and 1. For example, we can use the Bernoulli distribution to answer questions like: if a single coin is tossed once, what is the probability it comes up heads? Or, if a single adult American is randomly selected, what is the probability they are a heart surgeon?

Some other important distributions are built on the notion of independent Bernoulli trials, where we have a series of trials, and each one results in a success or a failure. An important one is the binomial distribution, which is the distribution of the number of successes in n independent Bernoulli trials. For example, with the binomial distribution we can answer a question like: if a coin is tossed 20 times, what is the probability heads comes up exactly 14 times? And since the binomial distribution is the distribution of the number of successes in n independent Bernoulli trials, the Bernoulli distribution is a special case of the binomial distribution with n = 1, a single trial.

Continuing on with the theme of independent Bernoulli trials, the geometric distribution is the distribution of the number of trials needed to get the first success. For example, with the geometric distribution we can answer a question like: if a coin is repeatedly tossed, what is the probability the first time heads appears is on the 8th toss? The negative binomial distribution is a generalization of the geometric distribution: it is the distribution of the number of trials needed to get a certain number of successes in repeated independent Bernoulli trials. So the negative binomial distribution can help us answer questions like: if a coin is repeatedly tossed, what is the probability the third time heads appears is on the ninth trial?

The way the binomial distribution and the negative binomial distribution arise can sound similar, and they can sometimes be confused. They differ in what the random variable is. In the binomial distribution, the number of trials is fixed, and the number of successes is the random variable. For instance, we're tossing a coin a fixed number of times, and the number of heads that comes up is a random variable. In the negative binomial distribution, the number of successes is fixed, and the number of trials required to get that number of successes is the random variable. For instance, we might be tossing a coin until we get heads 4 times, and the number of tosses required to get heads 4 times is the random variable.
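As an aside not in the transcript, these special-case relationships are easy to check numerically with SciPy's standard distribution objects (the parameter values here are arbitrary illustrations):

    from scipy import stats

    p = 0.5  # arbitrary success probability

    # The Bernoulli distribution is a binomial with n = 1.
    print(stats.bernoulli(p).pmf(1), stats.binom(1, p).pmf(1))

    # The geometric (trials until the first success) is a negative
    # binomial with r = 1. SciPy's nbinom counts failures before the
    # r-th success, so its argument is shifted by r.
    r, t = 1, 8
    print(stats.geom(p).pmf(t), stats.nbinom(r, p).pmf(t - r))

    # "Third head on the ninth toss": r = 3 successes, trial t = 9.
    print(stats.nbinom(3, p).pmf(9 - 3))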
Now I'll talk about two distributions that are related to the binomial, but aren't based on independent Bernoulli trials. The hypergeometric distribution is similar to the binomial distribution in that we're interested in the number of successes in n trials, but it's different because the trials are not independent. The hypergeometric distribution is the distribution of the number of successes when we are drawing without replacement from a source that contains a certain number of successes and a certain number of failures. For example, we can use the hypergeometric distribution to answer a question like: if 5 cards are drawn without replacement from a well-shuffled deck, what is the probability exactly 3 hearts are drawn? It's different from the binomial because the probability of success, the probability of getting a heart, would change from card to card, depending on what happened before. However, if the cards are drawn with replacement, meaning the card was put back in and reshuffled before the next card was drawn, then the trials would be independent and we would use the binomial distribution instead. If we are sampling only a small fraction of objects without replacement from a large population, then the trials are still not independent, but that dependency has only a small effect, and the binomial distribution closely approximates the hypergeometric distribution. So there are times when a problem is by its nature a hypergeometric problem, but we use the binomial distribution as an approximation. This can make our life a little easier sometimes.

Another distribution related to the binomial is the Poisson distribution, but this one's a little harder to explain. The Poisson distribution is the distribution of the number of events in a given time, or length, or area, or volume, etc., if those events are occurring randomly and independently. There's a bit more to it than that, and I go into this in much greater detail in my Poisson videos, but that's the gist of it. So we might use the Poisson distribution to answer a question like: what is the probability there will be exactly 4 car accidents on a certain university campus in a given week?

There is a relationship between the Poisson distribution and the binomial distribution: the Poisson distribution closely approximates the binomial distribution if n, the number of trials in the binomial, is large and p, the probability of success, is very small. So suppose we have a question like: what is the probability that in a random sample of 100,000 births there is at least one case of progeria? Progeria is an extremely rare disease that causes premature aging, and it occurs in about 1 in every eight million births. This is truly a binomial problem, but it is a binomial problem with a very large n, 100,000, and a very small probability of success, 1 in eight million or so, because progeria is such a rare disease. And so this could be very well approximated by the Poisson distribution. I look at all of the concepts discussed in this video in greater detail in the videos for the specific distributions.
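Both approximation claims can likewise be checked numerically. This sketch (not part of the transcript) uses made-up population numbers in the spirit of the card and progeria examples:

    from scipy import stats

    # Binomial approximation to the hypergeometric: draw n = 5 from a
    # large population in which a quarter of the items are "successes".
    # SciPy's hypergeom takes (population size, successes, draws).
    M, K, n = 52_000, 13_000, 5
    print(stats.hypergeom(M, K, n).pmf(3))
    print(stats.binom(n, K / M).pmf(3))   # nearly identical

    # Poisson approximation to the binomial: large n, tiny p.
    n, p = 100_000, 1 / 8_000_000
    print(stats.binom(n, p).pmf(0))       # P(no progeria case)
    print(stats.poisson(n * p).pmf(0))    # approximately the same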


Definition

A terminating Markov chain is a Markov chain where all states are transient, except one which is absorbing. Reordering the states, the transition probability matrix of a terminating Markov chain with $m$ transient states is

$$P = \begin{bmatrix} T & \mathbf{T}^{0} \\ \mathbf{0} & 1 \end{bmatrix},$$

where $T$ is an $m \times m$ matrix and $\mathbf{T}^{0} + T\mathbf{1} = \mathbf{1}$, with $\mathbf{1}$ denoting the column vector of ones. The transition matrix is characterized entirely by its upper-left block $T$.
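As a concrete sketch of this block structure (the entries of $T$ here are arbitrary illustrations, not from the article), one can assemble $P$ in NumPy and confirm it is stochastic:

    import numpy as np

    # Hypothetical sub-transition matrix T over m = 2 transient phases.
    T = np.array([[0.4, 0.5],
                  [0.0, 0.7]])
    m = T.shape[0]

    # Exit probabilities into the absorbing state: T0 = 1 - T·1.
    T0 = np.ones(m) - T @ np.ones(m)

    # Assemble the full transition matrix of the terminating chain.
    P = np.zeros((m + 1, m + 1))
    P[:m, :m] = T
    P[:m, m] = T0
    P[m, m] = 1.0

    assert np.allclose(P.sum(axis=1), 1.0)  # every row is a distribution
    print(P)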

Definition. A distribution on $\{0, 1, 2, \ldots\}$ is a discrete phase-type distribution if it is the distribution of the first passage time to the absorbing state of a terminating Markov chain with finitely many states.

Characterization

Fix a terminating Markov chain. Denote by $T$ the upper-left block of its transition matrix and by $\boldsymbol{\tau}$ the initial distribution over the transient states. The distribution of the first time to the absorbing state is denoted $\mathrm{PH}_{d}(\boldsymbol{\tau}, T)$ or $\mathrm{DPH}(\boldsymbol{\tau}, T)$.

Its cumulative distribution function is

$$F(k) = 1 - \boldsymbol{\tau} T^{k} \mathbf{1}$$

for $k = 1, 2, \ldots$, and its probability mass function is

$$f(k) = \boldsymbol{\tau} T^{k-1} \mathbf{T}^{0}$$

for $k = 1, 2, \ldots$. It is assumed the probability of the process starting in the absorbing state is zero. The factorial moments of the distribution are given by

$$E\bigl[K(K-1)\cdots(K-n+1)\bigr] = n!\,\boldsymbol{\tau}\,(I - T)^{-n}\,T^{\,n-1}\,\mathbf{1},$$

where $I$ is the identity matrix of the appropriate dimension.
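A minimal numerical sketch of these formulas, using an arbitrary two-phase pair $(\boldsymbol{\tau}, T)$ not taken from the article:

    import numpy as np

    # Hypothetical example: all initial mass on phase 0.
    tau = np.array([1.0, 0.0])
    T = np.array([[0.4, 0.5],
                  [0.0, 0.7]])
    one = np.ones(2)
    T0 = one - T @ one

    def pmf(k):
        # f(k) = tau T^(k-1) T0 for k = 1, 2, ...
        return tau @ np.linalg.matrix_power(T, k - 1) @ T0

    def cdf(k):
        # F(k) = 1 - tau T^k 1
        return 1.0 - tau @ np.linalg.matrix_power(T, k) @ one

    # The pmf should accumulate to the cdf.
    print(sum(pmf(k) for k in range(1, 6)), cdf(5))

    # Mean from the first factorial moment: tau (I - T)^(-1) 1.
    I = np.eye(2)
    print(tau @ np.linalg.inv(I - T) @ one)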

Special cases

Just as the continuous-time phase-type distribution is a generalisation of the exponential distribution, the discrete-time phase-type distribution is a generalisation of the geometric distribution. For example:

  • Degenerate distribution, with point mass at zero – zero phases.
  • Geometric distribution – one phase.
  • Negative binomial distribution – two or more identical phases in sequence.
  • Mixed geometric distribution – two or more non-identical phases, each of which may occur in a mutually exclusive, or parallel, manner.
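As an unofficial check of the negative binomial case (with arbitrary numbers): a chain of $r$ identical phases, each of which advances with probability $p$ and repeats with probability $1-p$, has absorption time distributed as the trial number of the $r$-th success. SciPy's nbinom counts failures before the $r$-th success, so its argument is shifted by $r$:

    import numpy as np
    from scipy import stats

    p, r = 0.3, 3  # arbitrary success probability and number of phases

    # r identical geometric phases in sequence: stay with prob 1-p,
    # move to the next phase (or absorb from the last) with prob p.
    T = np.diag(np.full(r, 1 - p)) + np.diag(np.full(r - 1, p), k=1)
    T0 = np.ones(r) - T @ np.ones(r)
    tau = np.eye(r)[0]  # start in the first phase

    def dph_pmf(k):
        return tau @ np.linalg.matrix_power(T, k - 1) @ T0

    for t in range(r, r + 5):
        print(t, dph_pmf(t), stats.nbinom(r, p).pmf(t - r))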

See also

  • Phase-type distribution
