Vector Spaces

Vector Space

A *vector space* $V$ over a field $F$ is a set which is an abelian group under addition, together with an operation called scalar multiplication, $F \times V \to V$, such that:

  1. $\lambda(\mu \mathbf{v}) = (\lambda\mu)\mathbf{v}$
  2. $(\lambda + \mu)\mathbf{v} = \lambda\mathbf{v} + \mu\mathbf{v}$
  3. $1_F\,\mathbf{v} = \mathbf{v}$
  4. $\lambda(\mathbf{v} + \mathbf{u}) = \lambda\mathbf{v} + \lambda\mathbf{u}$

for all $\lambda, \mu \in F$ and $\mathbf{v}, \mathbf{u} \in V$.

It follows that

$$\lambda\mathbf{0} = \mathbf{0} \qquad \text{and} \qquad 0_F\,\mathbf{v} = \mathbf{0}, \qquad \text{for all } \lambda \in F,\; \mathbf{v} \in V.$$
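
As a quick illustration (not part of the original text), here is a numerical spot-check of the four axioms and the consequence above for $V = \mathbb{R}^2$ over $F = \mathbb{R}$, using NumPy. The sample values are arbitrary; this checks instances, it does not prove the axioms.

    import numpy as np

    # Spot-check of the vector space axioms for V = R^2 over F = R on one
    # random sample; illustrative only, not a proof.
    rng = np.random.default_rng(0)
    lam, mu = rng.standard_normal(2)      # scalars lambda, mu in F
    v, u = rng.standard_normal((2, 2))    # vectors v, u in V = R^2

    assert np.allclose(lam * (mu * v), (lam * mu) * v)    # axiom 1
    assert np.allclose((lam + mu) * v, lam * v + mu * v)  # axiom 2
    assert np.allclose(1.0 * v, v)                        # axiom 3 (1_F = 1.0)
    assert np.allclose(lam * (v + u), lam * v + lam * u)  # axiom 4
    assert np.allclose(lam * np.zeros(2), np.zeros(2))    # lambda * 0 = 0
    assert np.allclose(0.0 * v, np.zeros(2))              # 0_F * v = 0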

Subspaces

A subset $S \subset V$ is a *subspace* of $V$ if it is closed under the operations of addition and scalar multiplication, i.e.:

$$\lambda\mathbf{s} + \mu\mathbf{t} \in S, \qquad \text{for all } \mathbf{s}, \mathbf{t} \in S \text{ and } \lambda, \mu \in F.$$

A sufficient condition for a non-empty subset $S$ to be a subspace is that

$$\mathbf{s} + \mu\mathbf{t} \in S, \qquad \text{for all } \mathbf{s}, \mathbf{t} \in S \text{ and } \mu \in F:$$

taking $\mu = -1$ and $\mathbf{t} = \mathbf{s}$ gives $\mathbf{0} \in S$; then $\mathbf{0} + \lambda\mathbf{t} = \lambda\mathbf{t} \in S$ and $\mathbf{s} + 1_F\,\mathbf{t} = \mathbf{s} + \mathbf{t} \in S$.

Note that $\mathbf{0} \in S$ for any subspace $S$, since $0_F\,\mathbf{s} = \mathbf{0}$ for every $\mathbf{s} \in S$.
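
For a concrete (hypothetical) example, the line $S = \{(x, y) \in \mathbb{R}^2 : x + y = 0\}$ is a subspace of $\mathbb{R}^2$; the sketch below spot-checks the closure condition on random samples, again numerically rather than as a proof.

    import numpy as np

    # S = {(x, y) in R^2 : x + y = 0}, a subspace of R^2 (hypothetical example).
    def in_S(w):
        return np.isclose(w[0] + w[1], 0.0)

    rng = np.random.default_rng(1)
    for _ in range(100):
        lam, mu = rng.standard_normal(2)
        # Random elements of S: multiples of (1, -1).
        s = rng.standard_normal() * np.array([1.0, -1.0])
        t = rng.standard_normal() * np.array([1.0, -1.0])
        assert in_S(lam * s + mu * t)   # closure under linear combinations
    assert in_S(np.zeros(2))            # 0 lies in S, as noted above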

Linear Independence

A set $\{\mathbf{v}_i\}_{i=1,\dots,n}$, $\mathbf{v}_i \in V$, is said to be *linearly dependent* if:

$$\exists\, \lambda_i \in F,\; i = 1, \dots, n, \text{ not all } 0, \qquad \text{such that} \qquad \sum_i \lambda_i \mathbf{v}_i = \mathbf{0}.$$

Otherwise $\{\mathbf{v}_i\}$ is said to be *linearly independent*, i.e.:

$$\sum_i \lambda_i \mathbf{v}_i = \mathbf{0} \qquad\implies\qquad \lambda_i = 0, \; \text{for all } i.$$
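
Numerically, linear independence of $\{\mathbf{v}_i\}$ over $\mathbb{R}$ can be tested by checking whether the matrix with the $\mathbf{v}_i$ as columns has rank $n$. A floating-point sketch (rank computation here is a heuristic, not exact field arithmetic):

    import numpy as np

    def linearly_independent(vectors):
        """True iff the given vectors are linearly independent (rank test)."""
        A = np.column_stack(vectors)
        return np.linalg.matrix_rank(A) == len(vectors)

    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([0.0, 1.0, 0.0])
    v3 = np.array([1.0, 1.0, 0.0])
    print(linearly_independent([v1, v2]))      # True
    print(linearly_independent([v1, v2, v3]))  # False: v3 = v1 + v2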

Lemma: Any subset of a linearly independent set is also linearly independent.

Proof:
Let $\{\mathbf{v}_i\}_{i=1,\dots,n}$ be linearly independent and suppose, relabelling if necessary, that the subset $\mathbf{v}_1, \dots, \mathbf{v}_m$, where $m < n$, is linearly dependent. Then there exist $\lambda_1, \dots, \lambda_m$, not all zero, such that $\sum_{i=1}^{m} \lambda_i \mathbf{v}_i = \mathbf{0}$.
Now put $\lambda_{m+1} = 0, \dots, \lambda_n = 0$, so that $\sum_{i=1}^{n} \lambda_i \mathbf{v}_i = \mathbf{0}$ with not all $\lambda_i = 0$. Hence $\mathbf{v}_1, \dots, \mathbf{v}_n$ are linearly dependent: a contradiction.

Dimension of a Vector Space

Span

A subspace $S \subset V$ is *spanned* by a set $\{\mathbf{v}_i\}_{i=1,\dots,n}$ of vectors if every $\mathbf{s} \in S$ can be written $\mathbf{s} = \sum_i \lambda_i \mathbf{v}_i$.

Generating

Conversely, the set $T = \{\mathbf{t} \mid \mathbf{t} = \sum_i \lambda_i \mathbf{v}_i\}$ is said to be *generated* by $\{\mathbf{v}_i\}_{i=1,\dots,n}$. $T$ is a subspace, and if $\{\mathbf{v}_i\}$ spans $S$ then $T = S$.

The terms *spanned* and *generated* are thus synonymous.
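
To test membership of the generated subspace numerically, one can solve the least-squares problem $V\lambda = \mathbf{t}$ (the columns of $V$ being the $\mathbf{v}_i$) and check that the residual vanishes. A sketch with made-up data:

    import numpy as np

    def in_span(vectors, t):
        """True iff t is a linear combination of the given vectors."""
        V = np.column_stack(vectors)
        lam, *_ = np.linalg.lstsq(V, t, rcond=None)
        return np.allclose(V @ lam, t)

    v1 = np.array([1.0, 0.0, 1.0])
    v2 = np.array([0.0, 1.0, 1.0])
    print(in_span([v1, v2], np.array([2.0, 3.0, 5.0])))  # True: 2 v1 + 3 v2
    print(in_span([v1, v2], np.array([0.0, 0.0, 1.0])))  # False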

Bases

If $\{\mathbf{v}_i\}_{i=1,\dots,n}$ are linearly independent vectors that generate $S$, then we say that they form a *basis* for $S$.

Lemma: If $\{\mathbf{v}_i\}_{i=1,\dots,n}$ is a basis for $S$, then each $\mathbf{s} \in S$ has a unique representation $\mathbf{s} = \sum_i \lambda_i \mathbf{v}_i$.

Proof:
If $\mathbf{s}$ has another representation $\mathbf{s} = \sum_i \mu_i \mathbf{v}_i$, then $\sum_i (\lambda_i - \mu_i)\mathbf{v}_i = \mathbf{0}$, so by linear independence $\lambda_i = \mu_i$ for all $i$.

Co-ordinates

The scalars $\{\lambda_i\}$ are called the *co-ordinates* of $\mathbf{s}$ with respect to the basis $\{\mathbf{v}_i\}$.
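
Computationally, the co-ordinates of $\mathbf{s}$ are found by solving $V\lambda = \mathbf{s}$, where the columns of $V$ are the basis vectors; $V$ is invertible precisely because the $\mathbf{v}_i$ form a basis, which is the uniqueness lemma above in matrix form. A sketch with an arbitrarily chosen basis of $\mathbb{R}^2$:

    import numpy as np

    # An arbitrary basis of R^2 as the columns of V.
    V = np.column_stack([np.array([1.0, 1.0]), np.array([1.0, -1.0])])
    s = np.array([3.0, 1.0])

    lam = np.linalg.solve(V, s)     # co-ordinates of s in this basis
    print(lam)                      # [2. 1.], i.e. s = 2*(1,1) + 1*(1,-1)
    assert np.allclose(V @ lam, s)  # the representation reconstructs s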

Lemma: If $\{\mathbf{v}_i\}_{i=1,\dots,n}$ is a basis for $S$, then any set of $n$ linearly independent vectors in $S$ is also a basis for $S$.

Proof:
Let $\{\mathbf{a}_i\}_{i=1,\dots,n}$ be linearly independent. Each $\mathbf{a}_j$ can be expanded in the basis: $\mathbf{a}_j = \sum_i \lambda_{ji} \mathbf{v}_i$, $j = 1, \dots, n$. In particular $\mathbf{a}_1 = \sum_i \lambda_{1i} \mathbf{v}_i$. Now $\mathbf{a}_1 \neq \mathbf{0}$, so there exists at least one $k$ such that $\lambda_{1k} \neq 0$. Rearranging gives:

$$\mathbf{v}_k \;=\; \frac{\mathbf{a}_1}{\lambda_{1k}} \;-\; \sum_{i \neq k} \frac{\lambda_{1i}}{\lambda_{1k}}\,\mathbf{v}_i.$$

By relabelling the indices of the basis so that $k \to 1$, without loss of generality we can put:

$$\mathbf{v}_1 \;=\; \mu_{11}\,\mathbf{a}_1 \;+\; \sum_{i=2}^{n} \mu_{1i}\,\mathbf{v}_i.$$

Continue inductively: suppose that for some $k$, $1 < k \leq n$,

$$\mathbf{v}_s \;=\; \sum_{i=1}^{s} \mu_{si}\,\mathbf{a}_i \;+\; \sum_{i=s+1}^{n} \mu_{si}\,\mathbf{v}_i, \qquad \text{for all } s \leq k - 1.$$

Then, substituting these expressions for $\mathbf{v}_1, \dots, \mathbf{v}_{k-1}$,

$$\mathbf{a}_k \;=\; \sum_{i=1}^{n} \lambda_{ki}\,\mathbf{v}_i \;=\; \sum_{i=1}^{k-1} \alpha_{ki}\,\mathbf{a}_i \;+\; \sum_{i=k}^{n} \beta_{ki}\,\mathbf{v}_i.$$

Now

$$\mathbf{a}_k \;\neq\; \sum_{i=1}^{k-1} \alpha_{ki}\,\mathbf{a}_i \qquad \text{(by linear independence of the } \mathbf{a}_i\text{)}$$

so

$$\sum_{i=k}^{n} \beta_{ki}\,\mathbf{v}_i \;\neq\; \mathbf{0} \qquad\implies\qquad \beta_{kl} \neq 0 \text{ for some } l, \; k \leq l \leq n,$$

thus

$$\mathbf{v}_l \;=\; \frac{1}{\beta_{kl}}\left(\mathbf{a}_k \;-\; \sum_{i=1}^{k-1} \alpha_{ki}\,\mathbf{a}_i \;-\; \sum_{\substack{i=k \\ i \neq l}}^{n} \beta_{ki}\,\mathbf{v}_i\right).$$

Relabelling as before ($l \to k$), we can put:

$$\mathbf{v}_k \;=\; \sum_{i=1}^{k} \mu_{ki}\,\mathbf{a}_i \;+\; \sum_{i=k+1}^{n} \mu_{ki}\,\mathbf{v}_i,$$

true for all $k$, $1 \leq k \leq n$. In particular for $k = n$,

$$\mathbf{v}_n \;=\; \sum_{i=1}^{n} \mu_{ni}\,\mathbf{a}_i.$$

We can substitute this expression into the equation for $\mathbf{v}_{n-1}$, so that

$$\mathbf{v}_{n-1} \;=\; \sum_{i=1}^{n} \xi_{n-1,i}\,\mathbf{a}_i,$$

and so on, working backwards, giving

$$\mathbf{v}_i \;=\; \sum_{j=1}^{n} \xi_{ij}\,\mathbf{a}_j, \qquad \text{for all } i.$$

For any $\mathbf{t} \in S$,

$$\mathbf{t} \;=\; \sum_{i=1}^{n} \tau_i\,\mathbf{v}_i \;=\; \sum_{i=1}^{n} \tau_i \sum_{j=1}^{n} \xi_{ij}\,\mathbf{a}_j \;=\; \sum_{j=1}^{n} \left( \sum_{i=1}^{n} \tau_i\,\xi_{ij} \right) \mathbf{a}_j \;=\; \sum_{j=1}^{n} \sigma_j\,\mathbf{a}_j.$$

So $\{\mathbf{a}_j\}$ generate $S$; since they are also linearly independent, they form a basis for $S$. This concludes the proof.
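
A numerical illustration of the lemma (with made-up vectors): taking $S = \mathbb{R}^3$ with the standard basis ($n = 3$), any three linearly independent vectors $\mathbf{a}_j$ again form a basis, so each standard basis vector is expressible as $\sum_j \xi_{ij}\,\mathbf{a}_j$:

    import numpy as np

    # Three linearly independent vectors a_1, a_2, a_3 in R^3, as columns.
    A = np.column_stack([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0], [1.0, 1.0, 0.0]])
    assert np.linalg.matrix_rank(A) == 3  # the a_j are linearly independent

    # Solve A Xi = I: column i of Xi holds the coefficients expressing the
    # standard basis vector e_i in terms of the a_j.
    Xi = np.linalg.solve(A, np.eye(3))
    assert np.allclose(A @ Xi, np.eye(3))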

Corollary:

  1. Any set of $n + 1$ vectors in $S$ is linearly dependent.
  2. All bases for $S$ have precisely $n$ vectors.
  3. If $\mathbf{b}_1, \dots, \mathbf{b}_r$ are linearly independent vectors in $S$, with $r < n$, then $\mathbf{b}_1, \dots, \mathbf{b}_r$ form part of a basis for $S$.
  4. If $\{\mathbf{v}_i\}_{i=1,\dots,k}$ generate $S$, with $k > n$, then there exists a subset $\mathbf{v}_{i_1}, \dots, \mathbf{v}_{i_n}$ which is a basis for $S$ (a sketch of how to extract one follows the proof below).
Proof:
(1) Suppose $\mathbf{u}_1, \dots, \mathbf{u}_{n+1} \in S$ were linearly independent. Then $\mathbf{u}_1, \dots, \mathbf{u}_n$ would be $n$ linearly independent vectors in $S$, hence a basis by the lemma, and $\mathbf{u}_{n+1}$ would be a linear combination of them: a contradiction.
(2) A basis with more than $n$ vectors would contradict (1); a basis with $m < n$ vectors would, by the same argument applied to that basis, make every set of $m + 1 \leq n$ vectors dependent, contradicting the independence of $\{\mathbf{v}_i\}_{i=1,\dots,n}$.
(3) If $r < n$ then $\mathbf{b}_1, \dots, \mathbf{b}_r$ cannot span $S$ (a spanning independent set would be a basis with $r$ vectors, contradicting (2)), so some $\mathbf{b}_{r+1} \in S$ lies outside their span, and the enlarged set is still linearly independent; repeating this yields $n$ independent vectors, a basis by the lemma.
(4) Discard any $\mathbf{v}_i$ that is a linear combination of the remaining vectors; the span is unchanged. Repeating this until no such vector remains leaves a linearly independent generating subset, i.e. a basis, which by (2) has $n$ vectors.
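
As a sketch of corollary 4 (an assumed implementation, not from the original text), a basis can be extracted from a spanning set by keeping each vector that strictly increases the rank and discarding the rest; floating-point rank is again a heuristic:

    import numpy as np

    def basis_from_spanning_set(vectors):
        """Greedily select a linearly independent subset with the same span."""
        chosen = []
        for v in vectors:
            candidate = chosen + [v]
            if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
                chosen = candidate   # v enlarges the span: keep it
        return chosen

    spanning = [np.array([1.0, 0.0]), np.array([2.0, 0.0]), np.array([0.0, 1.0])]
    print(basis_from_spanning_set(spanning))  # keeps (1,0), (0,1); drops (2,0)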

Dimension

If $\{\mathbf{v}_i\}_{i=1,\dots,n}$ is a basis for $S$, then $S$ is said to have *dimension* $n$. This is written $\dim(S) = n$.
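
In numerical terms, the dimension of the subspace generated by a set of vectors in $\mathbb{R}^m$ is the rank of the matrix with those vectors as columns; a final sketch with made-up data:

    import numpy as np

    spanning = [np.array([1.0, 2.0, 3.0]),
                np.array([4.0, 5.0, 6.0]),
                np.array([5.0, 7.0, 9.0])]   # third = first + second

    # dim(S) = rank of the matrix whose columns span S.
    print(np.linalg.matrix_rank(np.column_stack(spanning)))  # 2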