
Matrices


Matrix

An ~m # ~n #{~{matrix}} over a field F is a rectangular array of elements of F, arranged in ~m rows and ~n columns:

A _ = _ matrix{~a_{1 1} , ... ,~a_{1 ~n} / : , ... , : / ~a_{~m 1}, ... ,~a_{~m ~n}}

We write ~a_{~i,~j} for the element in the ~i^{th} row and ~j^{th} column of the matrix. We also write A = ( ~a_{~i,~j} )_{~i=1 ... ~m; ~j=1 ... ~n}.

[Formally we can regard a matrix as a map A: \{1, ... ,~m\} # \{1, ... ,~n\} &rightarrow. F, _ where A(~i,~j) = ~a_{~i,~j}. The above notation with rows and columns is just a tabular representation of this map.]

Let @M(~m,~n) represent the set of all ~m # ~n matrices (over F).

An ~n # ~n matrix is called a #{~{square matrix}} and we write @M_{~n} = @M(~n,~n) _ - the set of all ~n # ~n matrices (over F).
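The two views of a matrix above (a rectangular array, and a map on index pairs) are interchangeable; a minimal Python sketch, with illustrative names not taken from the text:

```python
# A 2 x 3 matrix over the rationals, stored as a list of rows.
A = [[1, 2, 3],
     [4, 5, 6]]

def entry(A, i, j):
    """The 'matrix as a map' view: return a_{i,j}, using the
    1-based row/column indices of the text."""
    return A[i - 1][j - 1]

assert entry(A, 1, 3) == 3   # first row, third column
assert entry(A, 2, 1) == 4   # second row, first column
```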

Matrix Equality

Two matrices over F, A [~m # ~n] and B [~p # ~q], are equal if ~m = ~p, ~n = ~q, and ~a_{~i,~j} = ~b_{~i,~j} _ &forall. ~i = 1, ... ,~m, _ ~j = 1, ... ,~n.

Matrix Addition

Two matrices A [~m # ~n] and B [~p # ~q] over F are conformable for addition if ~m = ~p and ~n = ~q, in which case A + B = ( ~a_{~i,~j} + ~b_{~i,~j} )_{~i=1 ... ~m; ~j=1 ... ~n}

Note: A + B = B + A; _ (A + B) + C = A + (B + C).

The set @M(~m,~n) of all ~m # ~n matrices (over F) is an abelian group under addition, with zero element the ~m # ~n matrix

O_{~m,~n} _ = _ matrix{0,...,0/:,...,:/0,...,0}

A + O_{~m,~n} = A; _ &forall. A &in. @M(~m,~n)
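Entrywise addition and the role of the zero matrix can be checked concretely; a minimal Python sketch (the helper name `mat_add` is ours, not from the text):

```python
def mat_add(A, B):
    """Entrywise sum of two conformable (same-shape) matrices."""
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "not conformable"
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
O = [[0, 0], [0, 0]]          # the zero matrix O_{2,2}

assert mat_add(A, B) == mat_add(B, A) == [[6, 8], [10, 12]]  # A + B = B + A
assert mat_add(A, O) == A                                    # zero element
```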

Scalar Multiplication

Define ~{#{scalar multiplication}}: _ _ _ λ A _ = _ ( λ ~a_{~i, ~j} )_{~i = 1 ... ~m, _ ~j = 1 ... ~n}
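Scalar multiplication is likewise entrywise; a short Python sketch under the same list-of-rows convention (the name `scalar_mul` is illustrative):

```python
def scalar_mul(lam, A):
    """lam * A: multiply every entry of A by the scalar lam."""
    return [[lam * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
assert scalar_mul(3, A) == [[3, 6], [9, 12]]
assert scalar_mul(0, A) == [[0, 0], [0, 0]]   # 0 * A is the zero matrix
```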

Vector Space of Matrices

The matrix space @M(~m,~n) is a vector space over F, with the operations of addition and scalar multiplication defined above.

Consider matrices X_{~k~l} &in. @M(~m,~n),

X_{~k~l} _ = _ ( ~{x_{i, j}} ) _{~i = 1...~m, _ ~j = 1...~n} _ _ _ _ _ _ _ _ _ ~{x_{i, j}} _ = _ array{1, ~i = ~k &comma. _ ~j = ~l/ 0,otherwise}

These form a basis for @M(~m,~n), which therefore has dimension ~m # ~n
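That the X_{~k~l} span @M(~m,~n) can be seen by writing A = &sum._{~k,~l} ~a_{~k~l} X_{~k~l}; a Python sketch of this decomposition (helper names are ours):

```python
def basis_matrix(m, n, k, l):
    """X_{kl}: 1 in position (k, l), 0 elsewhere (1-based indices)."""
    return [[1 if (i, j) == (k, l) else 0 for j in range(1, n + 1)]
            for i in range(1, m + 1)]

# Reassemble A in M(2, 3) as the combination  sum_{k,l} a_{kl} * X_{kl}.
A = [[1, 2, 3], [4, 5, 6]]
m, n = 2, 3
S = [[0] * n for _ in range(m)]
for k in range(1, m + 1):
    for l in range(1, n + 1):
        X = basis_matrix(m, n, k, l)
        for i in range(m):
            for j in range(n):
                S[i][j] += A[k - 1][l - 1] * X[i][j]

assert S == A   # the m*n matrices X_{kl} span M(m, n)
```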

Multiplication of Matrices

Two matrices A [~m # ~n] and B [~p # ~q] over F are conformable for multiplication if

  1. ~n = ~p in which case AB exists (and is ~m # ~q), _ AB = (&sum._{~j} ~{a_{ij}b_{jk}}) _ ~i = 1..~m, ~k=1..~q
  2. ~m = ~q in which case BA exists (and is ~p # ~n), _ BA = (&sum._{~k} ~{b_{jk}a_{ki}}) _ ~j = 1..~p, ~i=1..~n

If A is ~m # ~n and B is ~n # ~m, then both AB [~m # ~m] and BA [~n # ~n] exist, but in general BA ≠ AB, even when ~n = ~m.

Example:
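A small numerical illustration, sketched in Python (the helper `mat_mul` is ours, not from the text); it shows AB ≠ BA even for a pair of square matrices:

```python
def mat_mul(A, B):
    """(AB)_{ik} = sum_j a_{ij} b_{jk}; requires cols(A) == rows(B)."""
    assert len(A[0]) == len(B), "not conformable for multiplication"
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]          # swaps columns (on the right), rows (on the left)

assert mat_mul(A, B) == [[2, 1], [4, 3]]   # AB
assert mat_mul(B, A) == [[3, 4], [1, 2]]   # BA
assert mat_mul(A, B) != mat_mul(B, A)      # AB != BA, even though n = m
```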

Identity Matrix

Let @M_{~n} be the space of ~n # ~n (square) matrices. (I.e. @M_{~n} = @M(~n,~n) ).

The ~n # ~n #{~{identity matrix}} I_{~n} &in. @M_{~n} is the matrix whose (~i, ~j)^{th} element is the #{~{Kronecker delta}}:

&delta._{~i~j} _ = _ array{1, if ~i = ~j/0,otherwise}

I.e.

I_{~n} _ = _ matrix{1,0,...,0/0,1,...,0/:,:,...,:/0,0,...,1}

For any ~m # ~n matrix A, _ I_{~m} A = A, _ and _ A I_{~n} = A.
In particular if A &in. @M_{~n} , _ then _ I_{~n} A = A I_{~n} = A.
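Both identity laws are easy to verify on a rectangular example; a Python sketch (helper names are ours):

```python
def identity(n):
    """I_n: entries are the Kronecker delta delta_{ij}."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    """(AB)_{ik} = sum_j a_{ij} b_{jk}."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2, 3], [4, 5, 6]]            # a 2 x 3 matrix
assert mat_mul(identity(2), A) == A   # I_m A = A
assert mat_mul(A, identity(3)) == A   # A I_n = A
```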

Regular Matrix (definition)

A &in. @M_{~n} is said to be #{~{regular}} ( or #{~{non-singular}}) if &exist. A^{-1} &in. @M_{~n} such that A A^{-1} = I_{~n}

Inverse Matrix

#{Lemma} If A is regular, i.e. &exist. A^{-1} such that A A^{-1} = I_{~n}, _ then

  1. A^{-1} A = I_{~n} also,
  2. A^{-1} is regular and (A^{-1})^{-1} = A ,
  3. A^{-1} is unique.

A^{-1} is called the #{~{inverse}} of A.
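For 2 # 2 matrices the inverse can be written down directly from the determinant, which lets us check both parts of the lemma numerically; a Python sketch over the rationals (the adjugate formula used here is standard, the helper names are ours):

```python
from fractions import Fraction

def inverse_2x2(A):
    """Inverse of a regular 2x2 matrix via the adjugate formula:
    A^{-1} = (1/det) * [[d, -b], [-c, a]]."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is singular (not regular)"
    f = Fraction(1, 1) / det
    return [[ d * f, -b * f],
            [-c * f,  a * f]]

def mat_mul(A, B):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[2, 1], [5, 3]]          # det = 1, so A is regular
I2 = [[1, 0], [0, 1]]
Ainv = inverse_2x2(A)

assert mat_mul(A, Ainv) == I2   # A A^{-1} = I_n (the definition)
assert mat_mul(Ainv, A) == I2   # A^{-1} A = I_n (part 1 of the lemma)
```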