Probability distribution

In mathematics, a probability distribution assigns to every interval of the real numbers a probability, so that the probability axioms are satisfied. In technical terms, a probability distribution is a probability measure whose domain is the Borel algebra on the reals.

A probability distribution is a special case of the more general notion of a probability measure, which is a function that assigns probabilities satisfying the Kolmogorov axioms to the measurable sets of a measurable space.

Every random variable gives rise to a probability distribution, and this distribution contains most of the important information about the variable. If X is a random variable, the corresponding probability distribution assigns to the interval [a, b] the probability Pr[a ≤ X ≤ b], i.e. the probability that the variable X will take a value in the interval [a, b].

The probability distribution of the variable X can be uniquely described by its cumulative distribution function F(x), which is defined by

    F(x) = Pr[X ≤ x]

for any x in R.
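
As a minimal illustration in Python (using SciPy and, as an arbitrary example, a standard normal random variable), the cumulative distribution function can be evaluated directly:

    from scipy.stats import norm

    # F(x) = Pr[X <= x] for a standard normal random variable X
    F = norm.cdf

    print(F(0.0))              # 0.5: half of the probability lies below the mean
    print(F(1.96) - F(-1.96))  # about 0.95: Pr[-1.96 <= X <= 1.96]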

A distribution is called discrete if its cumulative distribution function increases only by a sequence of finite jumps, which means that it corresponds to a discrete random variable X: a variable which can attain values only from a certain finite or countable set. A distribution is called continuous if its cumulative distribution function is continuous, which means that it corresponds to a random variable X for which Pr[X = x] = 0 for all x in R.
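
The distinction can be seen numerically; the following sketch (a Poisson variable and a normal variable, chosen purely for illustration) shows that a discrete cumulative distribution function rises in jumps, so individual points carry positive probability, while for a continuous distribution the probability of any single point shrinks to zero:

    from scipy.stats import poisson, norm

    # Discrete: the CDF of a Poisson(3) variable jumps at each integer,
    # so a single point has positive probability.
    print(poisson.cdf(2, mu=3) - poisson.cdf(1, mu=3))  # Pr[X = 2] > 0
    print(poisson.pmf(2, mu=3))                         # the same value

    # Continuous: for a normal variable, shrinking the interval around a point
    # drives the probability towards 0, so Pr[X = x] = 0.
    for h in (0.1, 0.001, 0.00001):
        print(norm.cdf(1.0 + h) - norm.cdf(1.0 - h))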

The so-called absolutely continuous distributions can be expressed by a probability density function: a non-negative Lebesgue integrable function f defined on the reals such that

    Pr[a ≤ X ≤ b] = ∫_a^b f(x) dx

for all a and b. That discrete distributions do not admit such a density is unsurprising, but there are continuous distributions, such as the Cantor distribution (whose cumulative distribution function is the devil's staircase), that also do not admit a density.
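
The defining property of a density can be checked numerically; in the sketch below (a standard normal density and arbitrary limits, for illustration only), integrating f over [a, b] reproduces the cumulative distribution function difference F(b) − F(a):

    from scipy.stats import norm
    from scipy.integrate import quad

    a, b = -1.0, 2.0

    # Integrate the standard normal density f over [a, b] ...
    area, _ = quad(norm.pdf, a, b)

    # ... and compare with Pr[a <= X <= b] = F(b) - F(a).
    print(area, norm.cdf(b) - norm.cdf(a))  # the two values agree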

  • The support of a distribution is the smallest closed set whose complement has probability zero.
  • The probability distribution of the sum of two independent random variables is the convolution of their individual distributions (a numerical sketch follows this list).
  • The probability distribution of the difference of two independent random variables is the cross-correlation of their distributions.
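
A minimal numerical sketch of the convolution property, assuming two independent fair six-sided dice as the example: the probability mass function of their sum is the discrete convolution of the two individual mass functions.

    import numpy as np

    # Probability mass function of one fair six-sided die on the values 1..6
    die = np.full(6, 1 / 6)

    # PMF of the sum of two independent dice: the convolution of the two PMFs.
    # Index 0 of the result corresponds to the sum 2, index 10 to the sum 12.
    sum_pmf = np.convolve(die, die)

    for total, p in zip(range(2, 13), sum_pmf):
        print(total, round(p, 4))  # e.g. Pr[sum = 7] = 6/36, about 0.1667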

List of important probability distributions

Several probability distributions are so important in theory or applications that they have been given specific names:

Discrete distributions

With finite support

  • The Bernoulli distribution, which takes value 1 with probability p and value 0 with probability q = 1 − p.
    • The Rademacher distribution, which takes value 1 with probability 1/2 and value −1 with probability 1/2.
  • The binomial distribution, which describes the number of successes in a series of independent Yes/No experiments.
  • The degenerate distribution at x0, where X is certain to take the value x0. This does not look random, but it satisfies the definition of a random variable. This is useful because it puts deterministic variables and random variables in the same formalism.
  • The discrete uniform distribution, where all elements of a finite set are equally likely. This is the idealized distribution of a balanced coin, an unbiased die, a casino roulette wheel or a well-shuffled deck. Also, one can use measurements of quantum states to generate uniform random variables. All these are "physical" or "mechanical" devices, subject to design flaws or perturbations, so the uniform distribution is only an approximation of their behaviour. In digital computers, pseudo-random number generators are used to produce a statistically random discrete uniform distribution (see the sampling sketch after this list).
  • The hypergeometric distribution, which describes the number of successes in the first m of a series of n independent Yes/No experiments, if the total number of successes is known.
  • Zipf's law or the Zipf distribution. A discrete power-law distribution, the most famous example of which is the description of the frequency of words in the English language.
  • The Zipf-Mandelbrot law is a discrete power-law distribution which is a generalization of the Zipf distribution.
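
As mentioned in the item on the discrete uniform distribution above, pseudo-random number generators can only approximate it; the following sketch (using Python's standard random module and 60,000 simulated die rolls, an arbitrary choice) checks that the empirical frequencies come out close to 1/6:

    import random
    from collections import Counter

    # Simulate rolls of a fair six-sided die with a pseudo-random generator.
    rolls = [random.randint(1, 6) for _ in range(60_000)]

    counts = Counter(rolls)
    for face in range(1, 7):
        # Each empirical frequency should be close to the uniform value 1/6.
        print(face, counts[face] / len(rolls))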

With infinite support

  • The Poisson distribution.
  • The Skellam distribution.

Continuous distributions

Supported on a bounded interval

  • The Beta distribution on [0,1], of which the uniform distribution is a special case, and which is useful in estimating success probabilities.
  • The continuous uniform distribution.

Supported on semi-infinite intervals, usually [0,∞)

  • The chi-square distribution.
  • The exponential distribution.
  • The gamma distribution.
  • The Pareto distribution.

Supported on the whole real line

  • The Cauchy distribution.
  • The Laplace distribution.
  • The Lévy distribution.
  • The normal distribution.

Joint distributions

For any set of independent random variables, the probability density function of their joint distribution is the product of their individual density functions.
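
A small numerical illustration of this product rule, using two independent standard normal variables as an arbitrary example: the joint density evaluated at a point equals the product of the two marginal densities.

    from scipy.stats import norm, multivariate_normal

    x, y = 0.3, -1.2

    # Joint density of two independent standard normals (identity covariance) ...
    joint = multivariate_normal(mean=[0, 0], cov=[[1, 0], [0, 1]]).pdf([x, y])

    # ... equals the product of the individual (marginal) densities.
    product = norm.pdf(x) * norm.pdf(y)

    print(joint, product)  # the two values coincide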

Two or more random variables on the same sample space

Matrix-valued distributions

Miscellaneous distributions
