Eigenvalue
In linear algebra, a scalar λ is called an eigenvalue (in some older texts, a characteristic value) of a linear mapping A if there exists a nonzero vector x such that Ax = λx. The vector x is called an eigenvector of A associated with λ.
In matrix theory, an element λ of the underlying ring R of a square matrix A is called a right eigenvalue if there exists a nonzero column vector x such that Ax = λx, or a left eigenvalue if there exists a nonzero row vector y such that yA = yλ. If R is commutative, the left eigenvalues of A are exactly the right eigenvalues of A, and both are simply called eigenvalues. If R is not commutative, e.g. the quaternions, they may be different.
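Over the real or complex numbers, the defining relation Ax = λx can be checked numerically. The following sketch (the matrix is an illustrative choice, not from the text) uses NumPy to compute eigenvalue/eigenvector pairs and verify the relation for each:

```python
import numpy as np

# A simple symmetric 2x2 matrix (illustrative choice).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property Ax = λx for each pair.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)
```

For this matrix the eigenvalues are 3 and 1, with eigenvectors along (1, 1) and (1, −1).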
Multiplicity
Suppose A is a square matrix over a commutative ring. The algebraic multiplicity (or simply multiplicity) of an eigenvalue λ of A is the multiplicity of the factor t − λ in the characteristic polynomial of A. The geometric multiplicity of λ is the dimension of the eigenspace of λ, that is, the nullity of λI − A.
An eigenvalue of algebraic multiplicity 1 is called a simple eigenvalue.
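The two multiplicities can differ, as a Jordan-block example shows. In the sketch below (matrix chosen for illustration), the eigenvalue 2 has algebraic multiplicity 2 but geometric multiplicity 1, since λI − A has rank 1:

```python
import numpy as np

# Jordan block for eigenvalue 2: algebraic multiplicity 2,
# geometric multiplicity 1.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
n = A.shape[0]

# Geometric multiplicity: nullity of (λI - A) = n - rank(λI - A).
geometric = n - np.linalg.matrix_rank(lam * np.eye(n) - A)

# Algebraic multiplicity: how often λ occurs among the eigenvalues
# (counted numerically here; exact counting needs symbolic tools).
algebraic = int(np.sum(np.isclose(np.linalg.eigvals(A), lam)))
```

Here `algebraic` is 2 and `geometric` is 1, so 2 is not a simple eigenvalue of this matrix.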
Spectrum
In functional analysis, the spectrum of a linear operator A is the set of scalars λ such that λI − A is not invertible. If the underlying Hilbert space is finite-dimensional, then the spectrum of A coincides with the set of eigenvalues of A.
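In the finite-dimensional case, λI − A fails to be invertible exactly when its determinant vanishes, which is how the spectrum reduces to the set of eigenvalues. A minimal numerical sketch (diagonal matrix chosen for illustration):

```python
import numpy as np

# Diagonal matrix, so the eigenvalues are 4 and 1.
A = np.array([[4.0, 0.0],
              [0.0, 1.0]])

# For an eigenvalue λ, λI - A is singular (determinant 0),
# so λ lies in the spectrum of A.
for lam in (4.0, 1.0):
    assert np.isclose(np.linalg.det(lam * np.eye(2) - A), 0.0)

# A scalar that is not an eigenvalue gives an invertible λI - A.
assert not np.isclose(np.linalg.det(5.0 * np.eye(2) - A), 0.0)
```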
Multiset of eigenvalues
Occasionally, in an article on matrix theory, one may read a statement like:
- The eigenvalues of a matrix A are 4,4,3,3,3,2,2,1.
It means that the algebraic multiplicity of 4 is two, that of 3 is three, that of 2 is two, and that of 1 is one.
This style is used because algebraic multiplicity is central to many proofs in matrix theory.
Trace and Determinant
Suppose the eigenvalues of a matrix A are λ1, λ2, ..., λn. Then the trace of A is λ1 + λ2 + ... + λn and the determinant of A is λ1λ2...λn. Both are important invariants in matrix theory.
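These identities are easy to verify numerically. The sketch below (matrix chosen for illustration) checks that the trace equals the sum of the eigenvalues and the determinant equals their product:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)

# trace = sum of eigenvalues; determinant = product of eigenvalues
assert np.isclose(np.trace(A), eigenvalues.sum())
assert np.isclose(np.linalg.det(A), eigenvalues.prod())
```

For this matrix the trace is 5 and the determinant is −2, matching the sum and product of its two eigenvalues.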
See also
See the entry on eigenvector for further properties of eigenvalues.