Dominated convergence theorem

In mathematics, Henri Lebesgue's dominated convergence theorem states that if a sequence { f_n : n = 1, 2, 3, ... } of real-valued measurable functions on a measure space S converges almost everywhere, and is "dominated" (explained below) by some measurable function g whose integral is finite, then

$$\lim_{n\to\infty} \int_S f_n \, d\mu = \int_S \lim_{n\to\infty} f_n \, d\mu.$$
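
For instance (an illustrative application; the sequence and the dominating function here are chosen for this sketch, not taken from the article): on S = [0, 1] with Lebesgue measure, let f_n(x) = nx/(1 + n^2 x^2). Each f_n converges pointwise to 0, and by the AM-GM inequality |f_n(x)| ≤ 1/2 for all n and x, so the constant function g = 1/2 dominates the sequence and is integrable on [0, 1]. The theorem then gives

$$\lim_{n\to\infty} \int_0^1 \frac{nx}{1+n^2x^2} \, dx = \int_0^1 \lim_{n\to\infty} \frac{nx}{1+n^2x^2} \, dx = 0,$$

which agrees with the direct computation $\int_0^1 \frac{nx}{1+n^2x^2} \, dx = \frac{\ln(1+n^2)}{2n} \to 0$.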

To say that the sequence is "dominated" by g means that

$$|f_n(x)| \le g(x)$$

for every n and "almost every" x (i.e., the measure of the set of exceptional values of x is zero). The theorem assumes that g is "integrable", i.e.,

$$\int_S |g| \, d\mu < \infty.$$

Given the inequalities above, the absolute value sign enclosing g may be dispensed with, since g ≥ |f_n| ≥ 0 almost everywhere. In Lebesgue's theory of integration, one calls a function "integrable" precisely if it is measurable and the integral of its absolute value is finite, so this hypothesis is expressed by saying that the dominating function is integrable.

That the assumption that the integral of g is finite cannot be dispensed with may be seen as follows: let f_n(x) = n if 0 < x < 1/n and f_n(x) = 0 otherwise, and let g(x) = sup_n f_n(x) for x > 0 (and 0 otherwise). In that case,

$$\int_0^1 g(x) \, dx = \infty,$$

and

$$\lim_{n\to\infty} \int_0^1 f_n(x) \, dx = 1 \neq 0 = \int_0^1 \lim_{n\to\infty} f_n(x) \, dx.$$
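
To see why the integral of g diverges (a short check expanding on the example): for x in the interval (1/(n+1), 1/n), the indices k with x < 1/k are exactly k = 1, ..., n, so g(x) = n there, and hence

$$\int_0^1 g(x) \, dx = \sum_{n=1}^{\infty} n \left( \frac{1}{n} - \frac{1}{n+1} \right) = \sum_{n=1}^{\infty} \frac{1}{n+1} = \infty,$$

so g is not integrable, and indeed interchanging limit and integral fails for this sequence.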