
# Column Space of a Matrix


## Row and Column Vectors

An ~m # 1 matrix is called a #{~{column vector}}: _ #{~a} = (~a_1 ... ~a_{~m})^T

There is a bijection : _ F^{~n} &leftarrow.&rightarrow. @M(~n, 1) _ such that

(λ_1 ... λ_{~n}) _ _ _ &leftarrow.&rightarrow. _ _ _ matrix{λ_1/:/λ_{~n}}

We will henceforth regard elements of F^{~n} as column vectors.

A 1 # ~n matrix is called a #{~{row vector}}: _ #{~b}^T = (~b_1 ... ~b_{~n})
[Note that #{~b} is an element of F^{~n} and therefore a column vector, which is why the row vector is represented as #{~b}^T .]

The ~m # ~n matrix A can be regarded as

• _ ~n column vectors _ _ _ #{~k}_{~j} = (~a_{1 ~j} ... ~a_{~m~j} )^T

or

• _ ~m row vectors _ _ _ #{~r}_{~i}^T = (~a_{~i 1} ... ~a_{~i~n} )

A _ _ = _ _ matrix{#{~k}_{~1}, ... ,#{~k}_{~n}} _ _ = _ _ matrix{#{~r}_{~1}^T/:/#{~r}_{~m}^T}

Note that the column (row) vectors of A are precisely the row (column) vectors of A^T.

A^T _ = _ matrix{#{~k}_{~1}^T/:/#{~k}_{~n}^T} _ = _ matrix{#{~r}_{~1}, ... ,#{~r}_{~m}}
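This correspondence between the columns of A and the rows of A^T is easy to check numerically. A minimal sketch in plain Python (the `transpose` helper is ad hoc, not from any library):

```python
def transpose(M):
    # Swap rows and columns of a matrix stored as a list of row lists.
    return [list(col) for col in zip(*M)]

A = [[1, 2, 3],
     [4, 5, 6]]                      # a 2 x 3 matrix

At = transpose(A)                    # a 3 x 2 matrix

# The j-th column of A, read top to bottom, is the j-th row of A^T.
for j in range(3):
    column_j = [row[j] for row in A]
    assert column_j == At[j]
```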

## Block Multiplication of Matrices

So far we have treated matrix multiplication on an element-by-element basis. We can also regard matrices as being constructed of "blocks" or "sub-matrices" of elements, and observe how multiplication can be expressed in terms of these blocks.

Consider matrices A ~m # ~n and B ~n # ~p (which are &therefore. conformable for multiplication), and put:

A _ _ = _ _ matrix{A_{11},A_{12}/A_{21},A_{22}} _ _ _ _ _ _ B _ _ = _ _ matrix{B_{11},B_{12}/B_{21},B_{22}}

where A_{11} is ~s # ~t and B_{11} is ~t # ~r , _ ~s &le. ~m , ~t &le. ~n , ~r &le. ~p , _ so all components are conformable for multiplication, as in the following expression:

AB _ _ = _ _ matrix{ A_{11}B_{11} + A_{12}B_{21}, A_{11}B_{12} + A_{12}B_{22}/ A_{21}B_{11} + A_{22}B_{21}, A_{21}B_{12} + A_{22}B_{22} }

So multiplication of blocks can be done treating the blocks as elements. In particular, if #{~a}_{1}^T ... #{~a}_{~m}^T are the row vectors of A and #{~b}_{1} ... #{~b}_{~p} are the column vectors of B, then

AB _ = _ matrix{#{~a}_{1}^T/ : /#{~a}_{~m}^T} matrix{#{~b}_{1}, ... ,#{~b}_{~p}} _ = _ matrix{#{~a}_{1}^T#{~b}_{1}, ... ,#{~a}_{1}^T#{~b}_{~p}/ : , ... , : / #{~a}_{~m}^T#{~b}_{1}, ... ,#{~a}_{~m}^T#{~b}_{~p}}

or

AB _ = _ rndb{A} matrix{#{~b}_{1}, ... ,#{~b}_{~p}} _ = _ matrix{A#{~b}_{1}, ... ,A#{~b}_{~p}}

or

AB _ = _ matrix{#{~a}_{1}^T/ : /#{~a}_{~m}^T} rndb{B} _ = _ matrix{#{~a}_{1}^TB/ : /#{~a}_{~m}^TB}
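These partition identities can be verified numerically. A minimal sketch in plain Python (the `matmul`, `madd` and `block` helpers are ad hoc), checking the column and row partitions of AB and the 2 # 2 block formula on a small example:

```python
def matmul(A, B):
    # Textbook product of matrices stored as lists of row lists.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def madd(X, Y):
    # Entrywise sum of two matrices of the same shape.
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def block(M, r0, r1, c0, c1):
    # The sub-matrix of M with rows r0..r1-1 and columns c0..c1-1.
    return [row[c0:c1] for row in M[r0:r1]]

A = [[1, 2, 3],
     [4, 5, 6]]                      # 2 x 3
B = [[1, 0],
     [2, 1],
     [0, 3]]                         # 3 x 2
AB = matmul(A, B)

# Column partition: the j-th column of AB is A times the j-th column of B.
for j in range(2):
    b_j = [[row[j]] for row in B]                 # j-th column of B, as 3 x 1
    assert [row[0] for row in matmul(A, b_j)] == [row[j] for row in AB]

# Row partition: the i-th row of AB is the i-th row of A times B.
for i in range(2):
    assert matmul([A[i]], B)[0] == AB[i]

# 2 x 2 block partition with s = t = r = 1 (so A11 is 1 x 1, B11 is 1 x 1).
A11, A12 = block(A, 0, 1, 0, 1), block(A, 0, 1, 1, 3)
A21, A22 = block(A, 1, 2, 0, 1), block(A, 1, 2, 1, 3)
B11, B12 = block(B, 0, 1, 0, 1), block(B, 0, 1, 1, 2)
B21, B22 = block(B, 1, 3, 0, 1), block(B, 1, 3, 1, 2)

top = madd(matmul(A11, B11), matmul(A12, B21))[0] \
    + madd(matmul(A11, B12), matmul(A12, B22))[0]
bottom = madd(matmul(A21, B11), matmul(A22, B21))[0] \
       + madd(matmul(A21, B12), matmul(A22, B22))[0]
assert [top, bottom] == AB
```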

## Column Space of a Matrix

The #{~{column space}} of an ~m # ~n matrix A is the subspace of F^{~m} generated by its ~n column vectors.

[Similarly its #{~{row space}} is the subspace of F^{~n} generated by the transposes of its ~m row vectors.]

Consider a linear map _ &phi.#: F^{~n} &hdash.&rightarrow. F^{~m} , _ with associated matrix A relative to the natural bases. _ If #{&mu.} = (&mu._1 ... &mu._{~m} )^T &in. im &phi. (&subset. F^{~m}), _ then &exist. #{λ} = (λ_1 ... λ_{~n} )^T &in. F^{~n} _ such that _ #{&mu.} = A#{λ}. _ We can write #{&mu.} = λ_1#{~a}_1 + ... + λ_{~n}#{~a}_{~n} where #{~a}_1 ... #{~a}_{~n} are the column vectors of A.
So _ im &phi. _ &subseteq. _ column space of A.

Conversely, _ if #{&alpha.} = &sum._{~i} &beta._{~i}#{~a}_{~i} &in. column space of A, then #{&alpha.} = A#{&beta.} = &phi.#{&beta.} _ where #{&beta.} = (&beta._1 ... &beta._{~n})^T. _ So _ im &phi. _ &supseteq. _ column space of A.

So _ im &phi. _ = _ column space of A.

Now _ &rho. _ = _ rank A _ = _ dim (im &phi.) _ = _ dim (column space of A).
This means that there are precisely &rho. linearly independent vectors among the column vectors of A.

Further _ row space of A _ = _ column space of A^T. _ So _ dim (row space of A) _ = _ rank A^T _ = _ &rho.. _ So there are also &rho. linearly independent vectors among the row vectors of A.

dim (row space of A) _ = _ dim (column space of A) _ = _ rank A
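The equality of row rank and column rank is easy to check numerically. A minimal sketch in plain Python, where `rank` is an ad-hoc helper counting pivots under Gaussian elimination (floating-point, with a small tolerance):

```python
def rank(M, eps=1e-9):
    # Count pivots produced by Gaussian elimination on a copy of M.
    A = [row[:] for row in M]
    r = 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if abs(A[i][c]) > eps), None)
        if piv is None:
            continue                          # no pivot in this column
        A[r], A[piv] = A[piv], A[r]
        for i in range(len(A)):
            if i != r:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

A = [[1, 2, 3],
     [4, 5, 6],
     [5, 7, 9]]      # third row = first row + second row, so rank is 2

At = [list(col) for col in zip(*A)]
assert rank(A) == rank(At) == 2
```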

## Matrix Product

Now consider ~m # ~n matrix A and ~n # ~p matrix B with associated linear maps &phi. and &psi. relative to the natural bases:

F^{~p} zDgrmRight{&psi.,B} F^{~n} zDgrmRight{&phi.,A} F^{~m}

Then &phi.&comp.&psi. has associated matrix AB. _ rank AB = dim (im &phi.&comp.&psi.) &le. dim (im &phi.) = rank A _ (since im &phi.&comp.&psi. &subseteq. im &phi.). _ So rank AB &le. rank A.
Also _ rank AB = rank (AB)^T = rank B^TA^T &le. rank B^T (by above) = rank B. _ So rank AB &le. rank B.

rank AB _ _ _ _ &le. _ _ _ _ min \{ rank A, rank B \}
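A quick numerical check of this upper bound, using ad-hoc `matmul` and Gaussian-elimination `rank` helpers over plain Python lists:

```python
def matmul(A, B):
    # Textbook product of matrices stored as lists of row lists.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def rank(M, eps=1e-9):
    # Count pivots produced by Gaussian elimination on a copy of M.
    A = [row[:] for row in M]
    r = 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if abs(A[i][c]) > eps), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for i in range(len(A)):
            if i != r:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

A = [[1, 2],
     [2, 4]]          # rank 1: second row is twice the first
B = [[1, 0],
     [0, 1]]          # rank 2: the identity

assert rank(matmul(A, B)) <= min(rank(A), rank(B))

# The inequality can be strict:
C = [[1, 0],
     [0, 0]]          # rank 1
D = [[0, 0],
     [1, 0]]          # rank 1
assert rank(matmul(C, D)) == 0 < min(rank(C), rank(D))
```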

Now _ im &psi. and ker &phi. are subspaces of F^{~n} so
dim (im &psi. &intersect. ker &phi.) _ &le. _ dim (ker &phi.) _ = _ ~n &minus. dim (im &phi.) _ (by the Lemma) _ = _ ~n &minus. rank A.

But _ im &psi. &intersect. ker &phi. _ = _ ker (&phi.&vdash._{im &psi.} ), _ where _ (&phi.&vdash._{im &psi.} )#: _ im &psi. &hdash.&rightarrow. F^{~m} is the #{~{restriction}} of &phi. to im &psi.. _ Also _ im (&phi.&vdash._{im &psi.} ) _ = _ im &phi.&comp.&psi., so
dim (im &psi.) _ = _ dim im (&phi.&vdash._{im &psi.} ) + dim ker (&phi.&vdash._{im &psi.} ) _ (applying the Lemma to the space im &psi. )
dim (im &psi.) _ = _ dim im &phi.&comp.&psi. + dim (im &psi. &intersect. ker &phi.)

I.e. _ rank B &le. rank AB + ~n &minus. rank A
or _ rank AB &ge. rank B + rank A &minus. ~n
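This lower bound (Sylvester's rank inequality) can likewise be tested numerically, again with ad-hoc `matmul` and elimination-based `rank` helpers; the first pair of matrices below attains the bound with equality:

```python
def matmul(A, B):
    # Textbook product of matrices stored as lists of row lists.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def rank(M, eps=1e-9):
    # Count pivots produced by Gaussian elimination on a copy of M.
    A = [row[:] for row in M]
    r = 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if abs(A[i][c]) > eps), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for i in range(len(A)):
            if i != r:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

n = 2
A = [[1, 0],
     [0, 0]]          # rank 1
B = [[0, 0],
     [0, 1]]          # rank 1

# Here rank AB = 0 and rank A + rank B - n = 0: the bound is attained.
assert rank(matmul(A, B)) >= rank(A) + rank(B) - n

C = [[1, 0],
     [0, 1]]          # rank 2
# rank CB = rank B = 1 >= 2 + 1 - 2 = 1.
assert rank(matmul(C, B)) >= rank(C) + rank(B) - n
```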

## Regular Matrices

An ~m # ~m matrix A is regular _ &iff. _ rank A = ~m

Proof:
A is regular _ &imply. _ &exist. A^{&minus.1} such that AA^{&minus.1} = I_{~m}.
~m _ = _ rank I_{~m} _ = _ rank (AA^{&minus.1}) _ &le. _ min \{ rank A, rank A^{&minus.1} \}, _ but _ rank A &le. ~m _ and _ rank A^{&minus.1} &le. ~m.
So rank A _ = _ rank A^{&minus.1} _ = _ ~m.

Conversely _ rank A = ~m _ &imply. _ &exist. regular matrices M and N such that MAN = C

C _ = _ matrix{I_{&rho.},0/0,0}

where &rho. = rank A. _ But rank A = ~m _ so C = I_{~m}
&imply. _ NMA _ = _ N(MAN)N^{&minus.1} _ = _ NI_{~m}N^{&minus.1} _ = _ I_{~m}.

So _ A is regular, with _ A^{&minus.1} = NM.
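For 2 # 2 matrices the equivalence can be checked directly, since such a matrix is regular exactly when its determinant is nonzero. A sketch in plain Python with ad-hoc `rank` and `det2` helpers:

```python
def rank(M, eps=1e-9):
    # Count pivots produced by Gaussian elimination on a copy of M.
    A = [row[:] for row in M]
    r = 0
    for c in range(len(A[0])):
        piv = next((i for i in range(r, len(A)) if abs(A[i][c]) > eps), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for i in range(len(A)):
            if i != r:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

def det2(M):
    # Determinant of a 2 x 2 matrix: the matrix is regular iff this is nonzero.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[1, 2],
     [3, 4]]          # det = -2, so regular
S = [[1, 2],
     [2, 4]]          # det = 0, so singular

assert (det2(A) != 0) == (rank(A) == 2)
assert (det2(S) != 0) == (rank(S) == 2)

# For the regular matrix, the explicit 2 x 2 inverse satisfies A A^{-1} = I.
d = det2(A)
A_inv = [[ A[1][1] / d, -A[0][1] / d],
         [-A[1][0] / d,  A[0][0] / d]]
product = [[sum(a * b for a, b in zip(row, col)) for col in zip(*A_inv)]
           for row in A]
assert product == [[1.0, 0.0], [0.0, 1.0]]
```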