Exponential distribution

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by MarkSweep (talk | contribs) at 20:34, 7 November 2004 (Probability density function: explicit alternative). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In probability theory and statistics, the exponential distribution is a continuous probability distribution.

Specification of the exponential distribution

Probability density function

[Figure: Probability density functions of the exponential distribution for λ = 0.5, 1.0, and 1.5.]

The probability density function (pdf) of the Exponential(λ) distribution is

f(x; λ) = λ e^(−λx)

for x ≥ 0 (and f(x; λ) = 0 for x < 0), where λ > 0 is a parameter of the distribution, often called the rate parameter.

Alternatively, the exponential distribution can be parameterized by a scale parameter μ = 1/λ, as follows:

f(x; μ) = (1/μ) e^(−x/μ)

for x ≥ 0, where μ > 0.
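The two parameterizations describe the same density whenever μ = 1/λ. A minimal sketch in Python (the function names are illustrative, not part of the article):

```python
import math

def exp_pdf_rate(x, lam):
    """f(x; lambda) = lambda * exp(-lambda * x) for x >= 0, else 0."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def exp_pdf_scale(x, mu):
    """The same density written with the scale parameter mu = 1/lambda."""
    return math.exp(-x / mu) / mu if x >= 0 else 0.0

# The two parameterizations agree when mu = 1/lambda:
print(exp_pdf_rate(2.0, 0.5), exp_pdf_scale(2.0, 2.0))
```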

Cumulative distribution function

The cumulative distribution function is given by

F(x; λ) = 1 − e^(−λx)

for x ≥ 0 (and F(x; λ) = 0 for x < 0).

Quantile function

The inverse cumulative distribution function is

F⁻¹(p; λ) = −ln(1 − p)/λ

for 0 ≤ p < 1.
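The quantile function inverts the cumulative distribution function; a quick numerical check in Python (a sketch, with illustrative function names):

```python
import math

def exp_cdf(x, lam):
    """F(x; lambda) = 1 - exp(-lambda * x) for x >= 0, else 0."""
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

def exp_quantile(p, lam):
    """Inverse CDF: -ln(1 - p) / lambda, for 0 <= p < 1."""
    return -math.log(1.0 - p) / lam

# Applying the CDF to the quantile recovers the probability:
lam, p = 1.5, 0.75
x = exp_quantile(p, lam)
print(x, exp_cdf(x, lam))  # exp_cdf recovers p = 0.75
```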

Occurrence

The exponential distribution is used to model waiting times in Poisson processes, which are situations in which an object initially in state A can change to state B with constant probability per unit time λ. The time at which the state actually changes is described by an exponential random variable with parameter λ. Therefore, the integral of f from 0 to T is the probability that the object is in state B at time T.

The exponential distribution may be viewed as a continuous counterpart of the geometric distribution, which describes the number of Bernoulli trials necessary for a discrete process to change state. In contrast, the exponential distribution describes the time for a continuous process to change state.
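This "constant probability per unit time" description can be checked by simulation. The sketch below discretizes time into small steps of size dt, so each step the state changes with probability λ·dt; the resulting waiting time is geometric in the number of steps and approximately exponential in time (the step size dt and sample count are arbitrary choices for illustration):

```python
import random

random.seed(4)
lam, dt = 1.0, 1e-2  # per step, the state changes with probability lam * dt

def waiting_time():
    """Simulate the A -> B transition in discrete time steps of size dt."""
    t = 0.0
    while random.random() >= lam * dt:
        t += dt
    return t

times = [waiting_time() for _ in range(5_000)]
print(sum(times) / len(times))  # close to the exponential mean 1/lam = 1
```

As dt shrinks, the discrete geometric waiting time converges to the continuous exponential one, which is exactly the geometric-to-exponential analogy described above.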

Examples of variables that are approximately exponentially distributed are:

  • the time until you have your next car accident;
  • the time until you get your next phone call (assuming you get called many times a day, or get called by people from many different time zones);
  • the distance between mutations on a DNA strand;
  • the distance between roadkill;
  • the time until a radioactive particle decays;
  • the number of dice rolls until you roll a six 11 times in a row.

Properties

Quartiles

The quartiles of an Exponential(λ) random variable are as follows:

  • first quartile: ln(4/3)/λ
  • median: ln(2)/λ
  • third quartile: ln(4)/λ
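These quartiles follow directly from the inverse cumulative distribution function, F⁻¹(p; λ) = −ln(1 − p)/λ; a quick verification in Python (a minimal sketch):

```python
import math

lam = 2.0
quantile = lambda p: -math.log(1.0 - p) / lam  # inverse CDF of Exponential(lam)

print(math.isclose(quantile(0.25), math.log(4 / 3) / lam))  # first quartile
print(math.isclose(quantile(0.50), math.log(2) / lam))      # median
print(math.isclose(quantile(0.75), math.log(4) / lam))      # third quartile
```

For example, −ln(1 − 0.25) = −ln(3/4) = ln(4/3), which gives the first-quartile entry in the table.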

Expectations

An Exponential(λ) random variable has the following properties:

  • mean: μ = 1/λ
  • variance: σ² = 1/λ²
  • skewness: γ₁ = 2
  • kurtosis excess: γ₂ = 6
  • entropy: H = 1 − ln(λ) nats
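The mean, variance, and entropy can be checked by crude numerical integration of the density (a sketch; the integration grid and cutoff at x = 100 are arbitrary illustrative choices):

```python
import math

lam = 0.5
f = lambda x: lam * math.exp(-lam * x)  # the Exponential(0.5) density

# Crude Riemann sums on [0, 100] (the tail beyond 100 is negligible here):
dx = 1e-3
xs = [i * dx for i in range(100_000)]
mean = sum(x * f(x) * dx for x in xs)
var = sum((x - mean) ** 2 * f(x) * dx for x in xs)
entropy = sum(-f(x) * math.log(f(x)) * dx for x in xs)

print(mean, 1 / lam)               # both close to 2
print(var, 1 / lam ** 2)           # both close to 4
print(entropy, 1 - math.log(lam))  # both close to 1.693
```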

Memorylessness

An important property of the exponential distribution is that it is memoryless. This means that if a random variable T is exponentially distributed, its conditional probability obeys

P(T > s + t | T > s) = P(T > t)   for all s, t ≥ 0.

This says that the conditional probability that we need to wait, for example, more than another 10 seconds before the first arrival, given that the first arrival has not yet happened after 30 seconds, is no different from the initial probability that we need to wait more than 10 seconds for the first arrival. This is often misunderstood by students taking courses on probability: the fact that P(T > 40 | T > 30) = P(T > 10) does not mean that the events T > 40 and T > 10 are independent. To summarize: "memorylessness" of the probability distribution of the waiting time T until the first arrival means

P(T > 40 | T > 30) = P(T > 10).

It does not mean

P(T > 40 | T > 30) = P(T > 40).

(That would be independence. These two events are not independent.)
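Memorylessness is easy to see by simulation: among waits that have already exceeded 30 seconds, the fraction exceeding 40 seconds matches the unconditional fraction exceeding 10 seconds. A minimal sketch (the rate λ = 0.1 and sample count are illustrative choices):

```python
import math
import random

random.seed(0)
lam = 0.1  # so waits on the order of tens of seconds are common
samples = [random.expovariate(lam) for _ in range(200_000)]

# P(T > 40 | T > 30): among waits exceeding 30 s, the fraction exceeding 40 s.
over_30 = [t for t in samples if t > 30]
cond = sum(t > 40 for t in over_30) / len(over_30)

# P(T > 10): the unconditional fraction of waits exceeding 10 s.
uncond = sum(t > 10 for t in samples) / len(samples)

print(cond, uncond, math.exp(-lam * 10))  # all three agree (≈ 0.368)
```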

Parameter estimation

Maximum likelihood

The likelihood of n independent and identically distributed samples x = (x1, ..., xn) is the following function L:

L(λ) = λ^n e^(−λ(x1 + ... + xn)).

The maximum likelihood estimate must satisfy the following equation:

d/dλ ln L(λ) = n/λ − (x1 + ... + xn) = 0.

Solving for λ results in the following maximum likelihood estimate:

λ̂ = n/(x1 + ... + xn) = 1/x̄,

where x̄ = (x1 + ... + xn)/n is the sample mean.
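The maximum likelihood estimate λ̂ = 1/x̄ (the reciprocal of the sample mean) is a one-liner in practice; a minimal sketch, using simulated data with a known rate:

```python
import random

random.seed(1)
true_lam = 2.0
xs = [random.expovariate(true_lam) for _ in range(100_000)]

xbar = sum(xs) / len(xs)  # sample mean
lam_hat = 1.0 / xbar      # maximum likelihood estimate of lambda
print(lam_hat)            # close to true_lam = 2
```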

Bayesian inference

The conjugate prior for the exponential distribution is the gamma distribution (of which the exponential distribution is a special case). The following parameterization of the gamma pdf is useful:

Gamma(λ; α, β) = (β^α / Γ(α)) λ^(α−1) e^(−βλ).

The posterior distribution p can then be expressed in terms of the likelihood function defined above and a gamma prior:

p(λ) ∝ L(λ) Gamma(λ; α, β) = λ^n e^(−λ(x1 + ... + xn)) (β^α / Γ(α)) λ^(α−1) e^(−βλ) ∝ λ^(α+n−1) e^(−(β + x1 + ... + xn)λ).

Now the posterior density p has been specified up to a missing normalizing constant. Since it has the form of a gamma pdf, this can easily be filled in, and one obtains:

p(λ) = Gamma(λ; α + n, β + x1 + ... + xn).

Here the parameter α can be interpreted as the number of prior observations, and β as the sum of the prior observations.
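The conjugate update therefore amounts to adding the number of observations to α and their sum to β. A minimal sketch (the prior values α = 2, β = 1 are hypothetical, chosen only to illustrate the update):

```python
import random

random.seed(2)
true_lam = 3.0
xs = [random.expovariate(true_lam) for _ in range(10_000)]

# Hypothetical prior: alpha = 2 "prior observations" summing to beta = 1.
alpha, beta = 2.0, 1.0
alpha_post = alpha + len(xs)  # add the number of new observations
beta_post = beta + sum(xs)    # add their sum
posterior_mean = alpha_post / beta_post  # mean of Gamma(a, b) is a/b
print(posterior_mean)         # close to true_lam = 3
```

With many observations the posterior mean is dominated by the data and approaches the maximum likelihood estimate 1/x̄.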

Generating exponential variates

Given a random variate U drawn from the uniform distribution on the interval (0, 1], the variate

T = −ln(U)/λ

has an exponential distribution with parameter λ.
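This is the inverse transform method: since U and 1 − U are both uniform, −ln(U)/λ follows the inverse CDF derived earlier. A minimal sketch in Python (the function name is illustrative):

```python
import math
import random

def exponential_variate(lam):
    """Inverse-transform sampling: -ln(U)/lambda for U uniform on (0, 1]."""
    u = 1.0 - random.random()  # random() is on [0, 1); this maps it to (0, 1]
    return -math.log(u) / lam

random.seed(3)
lam = 0.5
samples = [exponential_variate(lam) for _ in range(100_000)]
print(sum(samples) / len(samples))  # sample mean close to 1/lam = 2
```

The guard against U = 0 matters because `random.random()` can return 0.0 exactly, and ln(0) is undefined.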