Calculating Determinants

Cofactors

By definition, if A is an ~n # ~n square matrix, its determinant,

| A | _ = _ sum{&zeta.(&pi.)prod{~a_{~i,&pi.(~i)},~i = 1,~n} , &pi. &in. P_{~n}, _ }
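
For a concrete illustration (not part of the original text, function names are my own): a minimal Python sketch that evaluates this permutation sum directly for a small matrix and compares the result with numpy's determinant routine.

from itertools import permutations
from math import prod
import numpy as np

def sign(perm):
    # zeta(pi): +1 for an even permutation, -1 for an odd one,
    # computed here by counting inversions
    inv = sum(1 for i in range(len(perm))
                for j in range(i + 1, len(perm)) if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def det_by_definition(A):
    # | A | = sum over all pi in P_n of zeta(pi) * product_i a_{i, pi(i)}
    n = len(A)
    return sum(sign(pi) * prod(A[i][pi[i]] for i in range(n))
               for pi in permutations(range(n)))

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
print(det_by_definition(A))                     # -43
print(np.linalg.det(np.array(A, dtype=float)))  # -43.0 up to rounding error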


Consider the elements of the ~r^{th} row of A (1 &le. ~r &le. ~n), then

| A | _ = _ sum{&zeta.(&pi.) ~a_{~r,&pi.(~r)} prod{~a_{~i,&pi.(~i)},~i ≠ ~r, _ } ,&pi. &in. P_{~n}, _ }


Now &pi. maps ~r to some ~s, where 1 &le. ~s &le. ~n. Considering each ~s in turn, we can reorder the sum as

| A | _ = _ sum{ _ ~a_{~r ~s} ,~s = 1, ~n} sum{&zeta.(&pi.) prod{~a_{~i,&pi.(~i)},~i ≠ ~r, _ } ,&pi.(~r)=~s, _ }

The inner sum in the expression is called the #{~{cofactor}} of ~a_{~r ~s} and is written as A_{~r ~s}:

A_{~r ~s} _ = _ sum{&zeta.(&pi.) prod{~a_{~i,&pi.(~i)},~i ≠ ~r, _ } ,&pi.(~r)=~s, _ }
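
As a sketch of what this inner sum means computationally (illustrative Python, not from the text): keep only the permutations with &pi.(~r) = ~s and drop the factor ~a_{~r,&pi.(~r)} from each product.

from itertools import permutations
from math import prod

def sign(perm):
    # +1 for an even permutation, -1 for an odd one (count inversions)
    inv = sum(1 for i in range(len(perm))
                for j in range(i + 1, len(perm)) if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def cofactor(A, r, s):
    # A_{rs} = sum over pi with pi(r) = s of sign(pi) * prod_{i != r} a_{i, pi(i)}
    # (r and s are 0-based here)
    n = len(A)
    return sum(sign(pi) * prod(A[i][pi[i]] for i in range(n) if i != r)
               for pi in permutations(range(n)) if pi[r] == s)

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
print(cofactor(A, 0, 0))   # 4*2 - 1*2 = 6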

Row Expansion

For any (fixed) ~r (1&le.~r&le.~n) we have

| A | _ = _ sum{~a_{~r ~s} A_{~r ~s} ,~s = 1, ~n}

This is called the expansion of the determinant of A by the ~r^{th} row.

Column Expansion

Similarly for any (fixed) ~s (1&le.~s&le.~n) we have

| A | _ = _ sum{~a_{~r ~s} A_{~r ~s} ,~r = 1, ~n}

called the expansion of the determinant of A by the ~s^{th} column.
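
A quick numerical check of both the row and the column expansion (an illustrative sketch, with the cofactors computed straight from the permutation-sum definition): every row and every column of the example matrix gives the same value of | A |.

from itertools import permutations
from math import prod

def sign(perm):
    inv = sum(1 for i in range(len(perm))
                for j in range(i + 1, len(perm)) if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def cofactor(A, r, s):
    # permutation-sum definition of the cofactor (0-based r, s)
    n = len(A)
    return sum(sign(pi) * prod(A[i][pi[i]] for i in range(n) if i != r)
               for pi in permutations(range(n)) if pi[r] == s)

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
n = len(A)
# expansion by each row, then by each column: all six values equal | A | = -43
print([sum(A[r][s] * cofactor(A, r, s) for s in range(n)) for r in range(n)])
print([sum(A[r][s] * cofactor(A, r, s) for r in range(n)) for s in range(n)])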

Minors

Let B be any ~p # ~q matrix, B = (~b_{~i ~j} )_{~i =1 ... ~p, ~j =1 ... ~q} .

For _ ~m &le. min\{~p, ~q\} _ choose ~m rows (~i(1) ... ~i(~m)) and ~m columns (~j(1) ... ~j(~m)). _ We can form a new (square) matrix B' = (~b'_{~r ~s} )_{~r =1 ... ~m, ~s =1 ... ~m} _ where ~b'_{~r ~s} = ~b_{~i(~r), ~j(~s)}, _ i.e. a matrix formed from only those elements of B which lie both in one of the rows (~i(1) ... ~i(~m)) and in one of the columns (~j(1) ... ~j(~m)) of B.

The determinant of B' is called an #{~{~m^{th} order minor}} of B.
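
In code, forming B' amounts to selecting the chosen rows and columns and taking a determinant; a small numpy sketch (index lists are 0-based, names my own):

import numpy as np

def minor(B, rows, cols):
    # m-th order minor: determinant of the submatrix of B formed by
    # the chosen rows and columns
    Bp = np.asarray(B, dtype=float)[np.ix_(rows, cols)]
    return np.linalg.det(Bp)

B = np.arange(1, 13).reshape(3, 4)   # a 3 x 4 matrix
print(minor(B, [0, 2], [1, 3]))      # 2nd order minor: 2*12 - 4*10 = -16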

Cofactor Expansion

For an ~n # ~n square matrix A, _ let _ A^x_{~r ~s} _ (1 &le. ~r, ~s &le. ~n) _ be the matrix formed by deleting the ~r^{th} row and the ~s^{th} column of A.
| A^x_{~r ~s} | is an (~n - 1)^{th} order minor of A. _ If A_{~r ~s} is the cofactor of ~a_{~r ~s} in A, we have

A_{~r ~s} _ = _ ( - 1)^{~r+~s} | A^x_{~r ~s} |

Proof:

Let B be the matrix obtained from A by interchanging successively the ~r^{th} and ~r+1^{th} rows, the ~r+1^{th} and ~r+2^{th} rows, ... , the ~n-1^{th} and ~n^{th} rows, and then the ~s^{th} ... ~n^{th} columns similarly, so that the ~r^{th} row and the ~s^{th} column of A become the last row and last column of B. _ This takes ~n - ~r row interchanges and ~n - ~s column interchanges, each of which changes the sign of the determinant, _ so _ | B | _ = _ ( - 1)^{~n - ~r + ~n - ~s} | A | _ = _ ( - 1)^{~r + ~s} | A |

Now

B _ = _ matrix{~a_{1 1}, ... , ... , ~a_{1 ~n},~a_{1 ~s}/:, , ,:,:/ :, , ,:,:/~a_{~n 1}, ... , ... ,~a_{~n ~n},~a_{~n ~s}/ ~a_{~r 1}, ... , ... ,~a_{~r ~n},~a_{~r ~s}} _ = _ matrix{ , , , ~a_{1 ~s}/ ,A^x_{~r ~s} , ,:/ , , ,:/~a_{~r 1}, ... , ... ,~a_{~r ~s} }

Write B = (~b_{~i ~j} ). _ The matrix obtained by deleting the last row and last column of B is precisely A^x_{~r ~s}, _ and the cofactor of ~b_{~n ~n} in B is

B_{~n ~n} _ = _ sum{&zeta.(&sigma.) prod{~b_{~i,&sigma.(~i)},~i = 1,~n - 1} ,&sigma.(~n)=~n, _ } _ = _ sum{&zeta.(&sigma.') prod{~b_{~i,&sigma.'(~i)},~i = 1,~n - 1} ,&sigma.' &in. P_{~n - 1}, _ } _ = _ | A^x_{~r ~s} |

since the permutations &sigma. &in. P_{~n} with &sigma.(~n) = ~n are exactly the permutations &sigma.' of \{1 ... ~n - 1\}, with the same sign.

Now | A | = ( - 1)^{~r + ~s} | B | holds identically in the elements of A. _ Expanding | A | by the ~r^{th} row and | B | by its ~n^{th} row, and comparing the coefficients of ~a_{~r ~s} = ~b_{~n ~n} (the cofactors A_{~r ~s} and B_{~n ~n} involve no elements of that row), gives

A_{~r ~s} _ = _ ( - 1)^{~r + ~s} B_{~n ~n} _ = _ ( - 1)^{~r + ~s} | A^x_{~r ~s} |
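
A numerical sanity check of this result (illustrative sketch): the left-hand side is computed from the permutation-sum definition of the cofactor, the right-hand side from the signed minor.

from itertools import permutations
from math import prod
import numpy as np

def sign(perm):
    inv = sum(1 for i in range(len(perm))
                for j in range(i + 1, len(perm)) if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def cofactor(A, r, s):
    # A_{rs} from the permutation-sum definition (0-based r, s)
    n = len(A)
    return sum(sign(pi) * prod(A[i][pi[i]] for i in range(n) if i != r)
               for pi in permutations(range(n)) if pi[r] == s)

def signed_minor(A, r, s):
    # (-1)^{r+s} | A^x_{rs} |; with 0-based r, s the parity of r+s is unchanged
    Ax = np.delete(np.delete(np.asarray(A, dtype=float), r, axis=0), s, axis=1)
    return (-1) ** (r + s) * np.linalg.det(Ax)

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
for r in range(3):
    for s in range(3):
        # the two values printed for each (r, s) agree
        print(r, s, cofactor(A, r, s), round(signed_minor(A, r, s), 6))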

Inverse Matrix

Adjoint (or Adjugate)

If A_{~i ~j} is the cofactor of ~a_{~i ~j} in A, the #~{adjoint} or #~{adjugate} of A is the matrix defined by

adj A _ = _ (A_{~i ~j} )^T

i.e. the transpose of the matrix of corresponding cofactors.

[ The term ~{adjoint} is also used to designate the transpose of the conjugate matrix of A, _ $A^T = A* ; _ there is usually little scope for confusion, however, as the adjoint defined here only appears in the calculation of determinants. ]

#{Lemma}: _ _ A(adj A) _ = _ (adj A)A _ = _ | A | I_{~n}

Proof:
The (~i, ~j)^{th} element of adj A is _ (adj A)_{~i ~j} _ = _ A_{~j ~i} , _ the cofactor of ~a_{~j ~i} in A. _ So the (~r, ~p)^{th} element of A(adj A) is _ &sum._{~s} ~a_{~r ~s} A_{~p ~s} .
If ~r = ~p this becomes &sum._{~s} ~a_{~r ~s} A_{~r ~s} = | A | , _ the expansion of | A | by the ~r^{th} row.
If ~r ≠ ~p then &sum._{~s} ~a_{~r ~s} A_{~p ~s} = 0 , _ since it is the expansion by the ~p^{th} row of the determinant of the matrix obtained from A by replacing its ~p^{th} row with its ~r^{th} row; that matrix has two equal rows, so its determinant is zero.

The proof that _ (adj A)A _ = _ | A | I_{~n} _ is similar.
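
The lemma is easy to check numerically; a sketch (the adjugate is built here as the transpose of the matrix of cofactors, the cofactors taken as signed minors):

import numpy as np

def adjugate(A):
    # adj A = transpose of the matrix of cofactors, A_{ij} = (-1)^{i+j} | A^x_{ij} |
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            Ax = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(Ax)
    return C.T

A = np.array([[2., 1., 3.],
              [0., 4., 1.],
              [5., 2., 2.]])
adjA = adjugate(A)
print(np.round(A @ adjA, 6))   # | A | I_3 : -43 on the diagonal, 0 elsewhere
print(np.round(adjA @ A, 6))   # the same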

From the above if | A | ≠ 0 then

1) _ A^{-1} _ = _ fract{adj A, | A | }

2) _ | adj A | _ = _ | A | ^{~n - 1}

Proof of 2): _ From 1) _ adj A = | A | A^{-1} , _ so _ | adj A | _ = _ | A | ^{~n} | A^{-1} | _ = _ | A | ^{~n - 1} .
[ Remember | &lambda.B | = &lambda.^{~n} | B | _ and _ | A^{-1} | = | A | ^{-1} ]
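
Both identities can be checked numerically in the same way (sketch, with adj A built from signed minors as above):

import numpy as np

def adjugate(A):
    # transpose of the matrix of cofactors, A_{ij} = (-1)^{i+j} | A^x_{ij} |
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            Ax = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(Ax)
    return C.T

A = np.array([[2., 1., 3.],
              [0., 4., 1.],
              [5., 2., 2.]])
detA = np.linalg.det(A)
adjA = adjugate(A)
print(np.allclose(adjA / detA, np.linalg.inv(A)))   # 1) True
print(np.isclose(np.linalg.det(adjA), detA ** 2))   # 2) True, since n = 3 here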