Let _ &phi.#: F^{~n} &hdash.&rightarrow. F^{~n} _ be a linear map. If _ &phi.#{~u} = λ#{~u} _ for some λ &in. F _ and some non-zero #{~u}, _ then #{~u} is said to be an ~#{eigenvector} of &phi. with corresponding ~#{eigenvalue} λ.
If &phi. has corresponding matrix A relative to basis _ #{~u}_1 ... #{~u}_{~n} _ &in. F^{~n}, _ then
#{~u} = &sum._{~i} &alpha._{~i}#{~u}_{~i} _ is an eigenvector of &phi. _ _ &iff. _ _ A#{&alpha.} = λ#{&alpha.} _ for some λ &in. F , _ _ where #{&alpha.} = (&alpha._1 ... &alpha._{~n} )^T.
Then λ is said to be an ~#{eigenvalue} of A with corresponding ~#{eigenvector} #{&alpha.}.
If ~n # ~n matrix A has eigenvalue λ with corresponding eigenvector #{&xi.}, then _ A#{&xi.} = λ#{&xi.} _ so _ (A &minus. λI ) #{&xi.} = #0.
This has a non-trivial solution _ &iff. _ | A &minus. λI | = 0.
Now _ | A &minus. λI | _ is a polynomial in λ of degree ~n over F, called the ~#{characteristic polynomial} of A, _ and _ | A &minus. λI | = 0 _ is the ~#{characteristic equation} of A. _ This has at most ~n roots, so an ~n # ~n matrix A has at most ~n eigenvalues.
Note that A - λI is just A with λ subtracted from each of its principal diagonal elements.
Alternatively, the eigenvalues of a matrix A may be defined as the roots of the characteristic equation of A.
An eigenvalue can be zero, i.e. A#{&xi.} = #0. _ This has a non-trivial solution _ &iff. _ | A | = 0 _ &iff. _ A is not regular.
&therefore. ~{a regular matrix has no zero eigenvalues}.
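As a quick numerical illustration (a Python/numpy sketch; the matrix is my choice, not from the notes), a matrix that is not regular always has a zero eigenvalue:

    import numpy as np

    # The second row is twice the first, so A is not regular: | A | = 0.
    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])
    print(np.linalg.det(A))        # 0.0 (up to rounding)
    print(np.linalg.eigvals(A))    # eigenvalues 0 and 5, in some order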
Consider the 2 # 2 matrix _ _ A _ = _ matrix{a_{11},a_{12}/a_{21},a_{22}} , _ _ then _ _ A - λI _ = _ matrix{a_{11} - λ,a_{12}/a_{21},a_{22} - λ} .
This has determinant _ (a_{11} - λ)(a_{22} - λ) - a_{12}a_{21}
= _ λ^2 - (a_{11} + a_{22})λ + a_{11}a_{22} - a_{12}a_{21} _ = _ λ^2 - tr(A)λ + det(A) .
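For example (a hedged numerical check in Python with numpy; the matrix A below is an illustrative choice, not from the notes), the roots of _ λ^2 - tr(A)λ + det(A) _ agree with the eigenvalues returned by a library routine:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    tr, det = np.trace(A), np.linalg.det(A)
    print(np.roots([1.0, -tr, det]))  # roots of the characteristic polynomial: [3. 1.]
    print(np.linalg.eigvals(A))       # the same eigenvalues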
It can be shown that for an ~n # ~n matrix A:
| A - λI | _ = _ sum{m_~i ({-}λ)^{~n - ~i},~i = 0, ~n} _ = _ m_0 ({-}λ)^{~n} + m_1 ({-}λ)^{~n - 1} + ... + m_~n
where m_0 = 1 and m_{~i} is the sum of all the principal ~i^{th} order minors of A.
This result will not be proved here.
[ Note that a principal ~{first} order minor is just an element of the principal diagonal, so we have that m_1 = &sum._{~i} ~a_{~i ~i} , called the ~{trace} of A.
Also the principal ~n^{th} order minor of A is just the determinant of A. ]
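As an illustrative check of the principal-minor expansion (a Python/numpy sketch, assuming the formula above; the matrix and test value of λ are arbitrary choices):

    import numpy as np
    from itertools import combinations

    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    n = A.shape[0]

    # m_i = sum of all principal i-th order minors of A  (m_0 = 1).
    def m(i):
        if i == 0:
            return 1.0
        return sum(np.linalg.det(A[np.ix_(rows, rows)])
                   for rows in combinations(range(n), i))

    lam = 0.7  # an arbitrary test value of lambda
    lhs = np.linalg.det(A - lam * np.eye(n))
    rhs = sum(m(i) * (-lam) ** (n - i) for i in range(n + 1))
    print(lhs, rhs)                # the two values agree
    print(m(1), np.trace(A))       # m_1 is the trace
    print(m(n), np.linalg.det(A))  # m_n is the determinant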
If λ is an eigenvalue of A with corresponding eigenvector #{&xi.} , _ then A (&mu.#{&xi.}) = &mu. A#{&xi.} = &mu. λ#{&xi.} = λ (&mu.#{&xi.}).
So &mu.#{&xi.} is also an eigenvector of A with corresponding eigenvalue λ, for any non-zero &mu. &in. F.
If #{&xi.}_1 and #{&xi.}_2 are two linearly independent eigenvectors corresponding to the same eigenvalue λ _ then
A (&mu.#{&xi.}_1 + &nu.#{&xi.}_2) = &mu.A#{&xi.}_1 + &nu.A#{&xi.}_2 = &mu.λ#{&xi.}_1 + &nu.λ#{&xi.}_2 = λ (&mu.#{&xi.}_1 + &nu.#{&xi.}_2).
So, provided &mu. and &nu. are not both zero, &mu.#{&xi.}_1 + &nu.#{&xi.}_2 is also an eigenvector of A with corresponding eigenvalue λ.
We can conclude that the eigenvectors of A corresponding to a single eigenvalue λ, together with #0, form a linear subspace of F^{~n}, _ and that its dimension is _ ~n &minus. rank( A &minus. λI ).
[ (A &minus. λI)#{&xi.} = #0 _ &iff. _ #{&xi.} &in. ker &chi. , _ where &chi. is the linear map associated with A &minus. λI relative to some basis. _ dim( ker &chi. ) = ~n &minus. dim( im &chi. ) = ~n &minus. rank(A &minus. λI). ]
I.e. any eigenvalue λ of A has exactly _ ~n &minus. rank( A &minus. λI ) _ linearly independent eigenvectors.
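For instance (a small Python/numpy sketch, with an illustrative matrix): for λ = 2 below, _ ~n &minus. rank( A &minus. λI ) = 2 , _ so the eigenspace is 2-dimensional:

    import numpy as np

    # lambda = 2 is an eigenvalue; A - 2I has rank 1, so the eigenspace
    # has dimension n - rank(A - 2I) = 3 - 1 = 2.
    A = np.array([[2.0, 0.0, 0.0],
                  [0.0, 2.0, 0.0],
                  [0.0, 0.0, 5.0]])
    n = A.shape[0]
    print(n - np.linalg.matrix_rank(A - 2.0 * np.eye(n)))  # 2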
#{Lemma}: If _ p(~t) is a polynomial over F _ and _ λ is an eigenvalue of matrix A, _ then _ p(λ) is an eigenvalue of p(A), _ with the same corresponding eigenvector.
Proof: _ If _ A#{&xi.} = λ #{&xi.} , _ then _ A^2#{&xi.} = A (λ#{&xi.}) = λ A#{&xi.} = λ^2#{&xi.} , _ and by induction _ A^{~k}#{&xi.} = λ^{~k}#{&xi.} _ for all ~k &ge. 0.
Taking linear combinations of these, _ p(A)#{&xi.} = p(λ)#{&xi.} , _ any polynomial p( ).
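A quick numerical check of the lemma (Python/numpy; the matrix, eigenpair, and polynomial are illustrative choices):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    lam = 3.0
    xi = np.array([1.0, 1.0])           # A @ xi == 3 * xi

    # p(t) = t^2 + 4t + 1, so p(A) = A^2 + 4A + I and p(3) = 22.
    pA = A @ A + 4 * A + np.eye(2)
    print(pA @ xi)                      # [22. 22.]
    print((lam**2 + 4 * lam + 1) * xi)  # [22. 22.]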
For use in the following proposition, consider the ~r # ~r matrix, ~{Van der Monde's Matrix}, defined by
Λ _ = _ matrix{ 1,λ_1, ... ,λ_1^{~r&minus.1}/ :,:,,:/ 1,λ_{~r}, ... ,λ_{~r}^{~r&minus.1} } _ = _ script{rndb{λ_{~i}^{~k}},,,~i = 1 ... ~r,~k = 0 ... ~r&minus.1}
Provided the λ_{~i} are distinct, this is a regular matrix, with determinant
det{Λ} _ = _ prod{ ( λ_{~i} &minus. λ_{~j} ),~i > ~j, _ }
which is known as ~{Van der Monde's Determinant}.
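This determinant formula can be checked numerically (a Python/numpy sketch; note that numpy's np.vander needs increasing=True to match the column order used above):

    import numpy as np

    lams = np.array([1.0, 2.0, 4.0])
    V = np.vander(lams, increasing=True)   # columns 1, lambda, lambda^2

    det_direct = np.linalg.det(V)
    det_formula = np.prod([lams[i] - lams[j]
                           for i in range(len(lams))
                           for j in range(i)])
    print(det_direct, det_formula)         # both 6.0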
#{Proposition}: If ~n # ~n matrix A has ~r distinct eigenvalues λ_1 ... λ_{~r} , then the corresponding eigenvectors #{&xi.}_1 ... #{&xi.}_{~r} are linearly independent.
Proof:
Suppose
sum{ &alpha._{~i} #{&xi.}_{~i},~i = 1, ~r} _ = _ #0 , _ _ for some &alpha._{~i} &in. F
Let p(~t) be any polynomial over F, so by the lemma:
sum{ p( λ_{~i} ) &alpha._{~i} #{&xi.}_{~i},~i = 1, ~r} _ = _ sum{&alpha._{~i} ( p( λ_{~i} ) #{&xi.}_{~i} ),~i = 1, ~r}
_ _ _ _ _ = _ sum{&alpha._{~i} ( p(A) #{&xi.}_{~i} ),~i = 1, ~r} _ = _ p(A) sum{&alpha._{~i} #{&xi.}_{~i},~i = 1, ~r} _ = _ #0
In particular
sum{array{/},~i = 1, ~r} ( sum{ ~b_{~k} λ_{~i}^{~k&minus.1},~k = 1, ~r} ) &alpha._{~i} #{&xi.}_{~i} _ = _ #0 , _ _ for any set of ~b_{~k} &in. F
But we can find ~b_{~k} such that _ &sum._{~k} ~b_{~k} λ_{~i}^{~k&minus.1} = &delta._{~i ~j} _ for any 1 &le. ~j &le. ~r.
Consider Van der Monde's matrix Λ defined above, formed from the distinct eigenvalues λ_1 ... λ_{~r}. Since the λ_{~i} are distinct, Λ is regular, so for each ~j the equation _ Λ#{~b} = #{~e}_{~j} _ has a unique solution _ #{~b} = ( ~b_1 ... ~b_{~r} )^T _ ( dependent on ~j ), _ i.e. _ &sum._{~k} ~b_{~k} λ_{~i}^{~k&minus.1} = &delta._{~i ~j} , _ so:
sum{&delta._{~i ~j} &alpha._{~i} #{&xi.}_{~i} ,~i = 1, ~r} _ = _ &alpha._{~j} #{&xi.}_{~j} _ = _ #0 _ _ => _ _ &alpha._{~j} _ = _ 0 , _ _ since #{&xi.}_{~j} &ne. #0
This can be done for each ~j. So we have proved linear independence.
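A numerical illustration of the proposition (Python/numpy; the triangular matrix is an arbitrary choice with distinct eigenvalues 1, 3, 5):

    import numpy as np

    # Upper triangular, so the eigenvalues 1, 3, 5 can be read off
    # the diagonal and are distinct.
    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 3.0, 1.0],
                  [0.0, 0.0, 5.0]])
    vals, vecs = np.linalg.eig(A)
    # The eigenvectors (the columns of vecs) are linearly independent:
    print(np.linalg.matrix_rank(vecs))  # 3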
#{Corollaries}:
sum{~n &minus. rank( A &minus. λ_{~i} I ),~i = 1, ~r} _ _ _ _ &le. _ _ _ _ ~n _ _ _ _ ( = _ _ dim F^{~n} ) , _ _ where λ_1 ... λ_{~r} are the distinct eigenvalues of A.
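The inequality can be strict: a sketch (Python/numpy) with a 2 # 2 matrix whose single eigenvalue 2 has only a 1-dimensional eigenspace:

    import numpy as np

    # A Jordan block: the only eigenvalue is 2, and A - 2I has rank 1,
    # so the sum of eigenspace dimensions is 1 < 2 = n.
    A = np.array([[2.0, 1.0],
                  [0.0, 2.0]])
    n = A.shape[0]
    print(n - np.linalg.matrix_rank(A - 2.0 * np.eye(n)))  # 1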
If ~n # ~n matrix A has ~n eigenvalues λ_1 ... λ_{~n} (not necessarily distinct) with ~n linearly independent corresponding eigenvectors #{&xi.}_1 ... #{&xi.}_{~n} , _ then
A _ _ _ _ ~ _ _ _ _ L _ = _ matrix{ λ_1,0, ... ,0/0,λ_2, ... ,0/:,:,,:/0,0, ... ,λ_{~n}}
Conversely, if
A _ _ _ _ ~ _ _ _ _ M _ = _ matrix{ &mu._1,0, ... ,0/0,&mu._2, ... ,0/:,:,,:/0,0, ... ,&mu._{~n}}
then _ &mu._1 ... &mu._{~n} are eigenvalues of A, and the corresponding eigenvectors are linearly independent.
Proof: _ If #{&xi.}_1 ... #{&xi.}_{~n} are linearly independent, take _ P = ( #{&xi.}_1 ... #{&xi.}_{~n} ) ; _ then P is regular and _ AP = PL , _ so _ P^{&minus.1}AP = L.
Conversely, if _ P^{&minus.1}AP = M _ with P regular, then _ AP = PM , _ so _ A#{~p}_{~i} = &mu._{~i}#{~p}_{~i} , _ where #{~p}_{~i} is the ~i^{th} column of P ; _ and the columns of a regular matrix are non-zero and linearly independent.
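A numerical sketch of the similarity (Python/numpy; the symmetric matrix is an illustrative choice):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    vals, P = np.linalg.eig(A)       # columns of P are eigenvectors
    # P^{-1} A P is the diagonal matrix of the eigenvalues:
    print(np.linalg.inv(P) @ A @ P)  # diag(3, 1), up to rounding
    print(vals)                      # [3. 1.]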
#{Corollaries}:
P^{&minus.1}AP _ = _ matrix{λ_1, ... ,0/:,,:/0, ... ,λ_{~n}} _ _ => _ _ P^{&minus.1}A^{&minus.1}P _ = _ (P^{&minus.1}AP)^{&minus.1} _ = _ matrix{λ_1^{&minus.1}, ... ,0/:,,:/0, ... ,λ_{~n}^{&minus.1}} _ _ ( provided A is regular ),
so λ_1^{&minus.1} ... λ_{~n}^{&minus.1} are the eigenvalues of A^{&minus.1} (by 3), with the same eigenvectors as A (i.e. the columns of P).
Also
P^{&minus.1}AP _ = _ matrix{λ_1, ... ,0/:,,:/0, ... ,λ_{~n}} _ _ => _ _ det{P^{&minus.1}AP} _ = _ prod{λ_{~i},~i = 1,~n}
Now _ | P^{&minus.1}AP | _ = _ | P^{&minus.1} | | A | | P | _ = _ | A | | P^{&minus.1} | | P | _ = _ | A | | P^{&minus.1}P | _ = _ | A | | I_{~n} | _ = _ | A | ,
so _ | A | _ = _ prod{λ_{~i},~i = 1,~n} , _ i.e. the determinant of A is the product of its eigenvalues.
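Both corollaries can be checked numerically (a Python/numpy sketch with an illustrative matrix):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])             # eigenvalues 3 and 1
    vals = np.linalg.eigvals(A)

    # Eigenvalues of the inverse are the reciprocals 1/3 and 1:
    print(np.linalg.eigvals(np.linalg.inv(A)))
    # The determinant equals the product of the eigenvalues:
    print(np.linalg.det(A), np.prod(vals))  # both 3.0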
#{Summary}: ~n # ~n matrix A