# Row Equivalence

## Row Equivalence

Two ~m # ~n matrices A and B are said to be #{~{row equivalent}}
if B can be obtained from A by a finite sequence of the following three types of #{~{elementary row operations}}:

- multiply all elements of a row by a scalar
- interchange a pair of rows
- add a scalar multiple of one row to another row.

If A and B are row equivalent we write A &cong. B.

Each of these operations is reversible by applying another elementary row operation:

- λ times ~i^{th} row _ &doublearrow. _ 1/λ times ~i^{th} row
- interchange rows ~i and ~j _ &doublearrow. _ same operation
- row ~i plus λ times row ~j _ &doublearrow. _
row ~i plus &minus.λ times row ~j _
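
The three operations and their inverses can be sketched in Python (a minimal illustration using lists of lists; the function names are ours, not from the source):

```python
def scale_row(A, i, lam):
    """Multiply every element of row i by a non-zero scalar lam."""
    B = [row[:] for row in A]
    B[i] = [lam * x for x in B[i]]
    return B

def swap_rows(A, i, j):
    """Interchange rows i and j."""
    B = [row[:] for row in A]
    B[i], B[j] = B[j], B[i]
    return B

def add_multiple(A, i, j, lam):
    """Add lam times row j to row i (i != j)."""
    B = [row[:] for row in A]
    B[i] = [a + lam * b for a, b in zip(B[i], B[j])]
    return B

A = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]

# Each operation is undone by the inverse operation listed above:
assert scale_row(scale_row(A, 0, 4.0), 0, 1 / 4.0) == A
assert swap_rows(swap_rows(A, 0, 1), 0, 1) == A
assert add_multiple(add_multiple(A, 0, 1, 2.0), 0, 1, -2.0) == A
```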

Row equivalence is a) reflexive, b) symmetric, c) transitive.

- A &cong. A: apply any elementary row operation followed by its inverse.
- If A &cong. B then A &cong. A_1 &cong. A_2 &cong. ... &cong. A_{~k} &cong. B, where
A_{~i} is obtained from A_{~i&minus.1} by one elementary row operation. All the operations
are reversible, so B &cong. A_{~k} &cong. ... &cong. A_1 &cong. A, i.e. B &cong. A.
- A &cong. B and B &cong. C &imply. A &cong. A_1 &cong. ... &cong. A_{~k} &cong. B &cong. B_1
&cong. ... &cong. B_{~l} &cong. C, _ each by one elementary row operation, so A &cong. C.

Similar definitions and results hold for column equivalence.

## Elementary Matrices

Define the following ~m # ~m #{~{elementary row matrices}}:

- R_{λ~i} &hdash. multiply ~i^{th} row of I_{~m} by λ
- R_{~i,~j} &hdash. interchange ~i^{th} and ~j^{th} rows of I_{~m}
- R_{~j+λ~i} &hdash. add λ times ~i^{th} row to ~j^{th} row of I_{~m}

Then if B is obtained from A by an elementary row operation, B = RA, where R is the elementary
matrix obtained from I_{~m} by the same operation.
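
A sketch of this in Python (the helper names are ours, for illustration only):

```python
def identity(m):
    return [[1.0 if r == c else 0.0 for c in range(m)] for r in range(m)]

def matmul(X, Y):
    return [[sum(X[r][k] * Y[k][c] for k in range(len(Y)))
             for c in range(len(Y[0]))] for r in range(len(X))]

# Elementary matrices: apply the row operation to the identity I_m.
def R_scale(m, i, lam):
    """R_{lam i}: multiply row i of I_m by lam."""
    R = identity(m)
    R[i][i] = lam
    return R

def R_swap(m, i, j):
    """R_{i,j}: interchange rows i and j of I_m."""
    R = identity(m)
    R[i], R[j] = R[j], R[i]
    return R

def R_add(m, j, i, lam):
    """R_{j+lam i}: add lam times row i to row j of I_m."""
    R = identity(m)
    R[j][i] = lam
    return R

A = [[1.0, 2.0], [3.0, 4.0]]
# The row operation "add 5 times row 0 to row 1" applied directly ...
B = [A[0][:], [a + 5.0 * b for a, b in zip(A[1], A[0])]]
# ... equals pre-multiplication by the elementary matrix: B = R A
assert matmul(R_add(2, 1, 0, 5.0), A) == B
```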

Note:

- R_{λ~i}^{&minus.1} = R_{1/λ # ~i} , _ R_{λ~i}^T = R_{λ~i}
- R_{~i,~j}^{&minus.1} = R_{~i,~j} , _ R_{~i,~j}^T = R_{~i,~j}
- R_{~j+λ~i}^{&minus.1} = R_{~j&minus.λ~i} , _ R_{~j+λ~i}^T = R_{~i+λ~j}
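
These inverse and transpose identities can be checked numerically (again an illustrative sketch; the names are ours):

```python
def identity(m):
    return [[1.0 if r == c else 0.0 for c in range(m)] for r in range(m)]

def matmul(X, Y):
    return [[sum(X[r][k] * Y[k][c] for k in range(len(Y)))
             for c in range(len(Y[0]))] for r in range(len(X))]

def transpose(X):
    return [list(col) for col in zip(*X)]

def R_scale(m, i, lam):
    R = identity(m); R[i][i] = lam; return R

def R_swap(m, i, j):
    R = identity(m); R[i], R[j] = R[j], R[i]; return R

def R_add(m, j, i, lam):
    R = identity(m); R[j][i] = lam; return R

m = 3
# R_{lam i}^{-1} = R_{(1/lam) i}
assert matmul(R_scale(m, 1, 4.0), R_scale(m, 1, 0.25)) == identity(m)
# R_{i,j}^{-1} = R_{i,j}  (its own inverse)
assert matmul(R_swap(m, 0, 2), R_swap(m, 0, 2)) == identity(m)
# R_{j+lam i}^{-1} = R_{j-lam i}
assert matmul(R_add(m, 2, 0, 5.0), R_add(m, 2, 0, -5.0)) == identity(m)
# R_{j+lam i}^T = R_{i+lam j}
assert transpose(R_add(m, 2, 0, 5.0)) == R_add(m, 0, 2, 5.0)
```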

B &cong. A _ &imply. _ B _ = R_{~k} ... R_1A,
_ (where the R_{~i} are elementary matrices)
_ &imply. _ B = MAN, _ where M = R_{~k} ... R_1 and N = I_{~n}.

So row equivalence implies (matrix) equivalence.

The converse is in general not true.

We can make similar definitions for elementary column matrices, and similar results hold,
except that column operations on a matrix are obtained by *post*-multiplication
by the elementary (column) matrices.
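
A small check of post-multiplication (an illustrative sketch):

```python
def matmul(X, Y):
    return [[sum(X[r][k] * Y[k][c] for k in range(len(Y)))
             for c in range(len(Y[0]))] for r in range(len(X))]

A = [[1.0, 2.0], [3.0, 4.0]]

# Elementary *column* matrix: add 3 times column 0 to column 1 of I_2.
C = [[1.0, 3.0], [0.0, 1.0]]

# Column operations act by POST-multiplication: column 1 of A C is
# (column 1 of A) + 3 (column 0 of A).
assert matmul(A, C) == [[1.0, 5.0], [3.0, 13.0]]
```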

The examples on this page have been written using MathymaMatrix, a JavaScript module for doing
matrix algebra on a Web page.

## Echelon Matrix

A matrix A (not necessarily square) is called an #{~{echelon}} matrix if

- If row ~i is non-zero then the position of the first non-zero element
in row ~i is less than that in row ~i+1 (if row ~i+1 is non-zero).
- If row ~i is zero then row ~i+1 is zero (and so are all subsequent rows).
- The first non-zero element of any row is 1.

If A is an ~m # ~n matrix, then let &mu.(~i) be the position of
the first non-zero element in row ~i, putting &mu.(~i) = ~n+1 if the entire row is zero.

Then we can formalize the definition of an echelon matrix:

- &mu.(~i) &le. &mu.(~i+1), _ 1 &le. ~i < ~m
- &mu.(~i) = &mu.(~i+1) _ &iff. _ &mu.(~i) = ~n+1 and &mu.(~i+1) = ~n+1
- &mu.(~i) < ~n+1 _ &imply. _ ~a_{~i, &mu.(~i)} = 1
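
The formal conditions translate directly into code (a sketch; `mu` and `is_echelon` are our names):

```python
def mu(A, i):
    """1-based position of the first non-zero element of row i; n+1 if row i is zero."""
    n = len(A[0])
    return next((c + 1 for c in range(n) if A[i][c] != 0), n + 1)

def is_echelon(A):
    m, n = len(A), len(A[0])
    # the leading entry of every non-zero row must be 1
    if any(mu(A, i) <= n and A[i][mu(A, i) - 1] != 1 for i in range(m)):
        return False
    # mu strictly increases while rows remain non-zero; zero rows come last
    return all(mu(A, i) < mu(A, i + 1) or mu(A, i) == n + 1 == mu(A, i + 1)
               for i in range(m - 1))

assert is_echelon([[1, 2, 3], [0, 1, 4], [0, 0, 0]])
assert not is_echelon([[0, 1, 2], [1, 0, 0]])
```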

## Reduced Echelon Matrix

An ~m # ~n matrix B is a #{~{reduced echelon}} matrix if it is echelon and the
first non-zero element in any row is the only non-zero element in that column,
i.e. ~b_{~i, &mu.(~j)} = 0 if ~j &ne. ~i.

## Row Equivalence to Echelon Matrix

#{Theorem}: Any ~m # ~n matrix A is row equivalent to an ~m # ~n echelon matrix.

Proof is by induction on ~m:

Let A be a 1 # ~n (row) matrix. If A is zero there is nothing to prove;
otherwise (1/~a_{1,&mu.(1)})A is echelon.

Now suppose the result holds for a given ~k, i.e. for all ~r, 1 &le. ~r &le. ~k,
any ~r # ~n matrix is row equivalent to an ~r # ~n echelon matrix.

Let A be a (~k+1) # ~n matrix. If A = 0 there is nothing to prove, otherwise choose ~r
such that &mu.(~r) = min \{&mu.(~i), 1 &le. ~i &le. ~k+1 \} (i.e. &mu.(~r) is the
first column with non-zero element(s)).

Interchange the 1^{st} and ~r^{th} rows, and multiply the (resulting) first row by 1/~a_{~r, &mu.(~r)},
_ giving:

A _ _ _ &cong. _ _ _ matrix{
0, ... ,0, _ , _ 1 _ ,{~a_{~r, &mu.(~r)+1} ./ ~a_{~r, &mu.(~r)}}, ... ,{~a_{~r, ~n} ./ ~a_{~r, &mu.(~r)}}
/ _ ,,,,,,,
/,O,, _ ,,,A_1,,
/ _ ,,,,,,,
}

Where O is the ~k # (&mu.(~r)&minus.1) zero matrix, and A_1 is
the ~k # (~n&minus.&mu.(~r)+1) submatrix.

By successively subtracting multiples of the first row from following rows we obtain:

A _ _ _ &cong. _ _ _ B _ _ _ = _ _ _ matrix{
0, ... ,0, _ , _ 1 _ ,{~a_{~r, &mu.(~r)+1} ./ ~a_{~r, &mu.(~r)}}, ... ,{~a_{~r, ~n} ./ ~a_{~r, &mu.(~r)}}
/ _ ,,,,,,,
/,O,,, _ ,,B_1,,
/ _ ,,,,,,,
}

where O is now the ~k # &mu.(~r) zero matrix, and B_1 is
the ~k # (~n&minus.&mu.(~r)) submatrix. By the inductive hypothesis B_1 reduces
to a ~k # (~n&minus.&mu.(~r)) echelon matrix by a finite number of row operations.
Clearly these row operations, performed on the corresponding rows of B (row 1 of B_1 = row 2 of B etc.),
leave the first row and the zero submatrix unchanged, so we obtain a (~k+1) # ~n echelon matrix C
by a finite number of row operations.
Thus B &cong. C and therefore A &cong. C, completing the inductive step.
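
The constructive proof above is essentially Gaussian elimination; here is a sketch in Python using exact `Fraction` arithmetic (the function name is ours):

```python
from fractions import Fraction

def to_echelon(A):
    """Reduce A to echelon form by elementary row operations, following
    the inductive proof: find the first non-zero column, swap a row with a
    non-zero entry there to the top, scale it to get a leading 1, clear the
    entries below, then repeat on the remaining submatrix."""
    A = [[Fraction(x) for x in row] for row in A]
    m, n = len(A), len(A[0])
    top = 0
    for col in range(n):
        # find a row at or below `top` with a non-zero entry in this column
        pivot = next((r for r in range(top, m) if A[r][col] != 0), None)
        if pivot is None:
            continue
        A[top], A[pivot] = A[pivot], A[top]          # interchange rows
        A[top] = [x / A[top][col] for x in A[top]]   # scale to a leading 1
        for r in range(top + 1, m):                  # clear entries below
            A[r] = [a - A[r][col] * b for a, b in zip(A[r], A[top])]
        top += 1
        if top == m:
            break
    return A

E = to_echelon([[0, 2, 4], [1, 1, 1], [2, 2, 2]])
assert E == [[1, 1, 1], [0, 1, 2], [0, 0, 0]]
```

`Fraction` keeps the arithmetic exact, so the resulting matrix can be compared entry by entry without floating-point tolerance.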

## Row Equivalence to Reduced Echelon Matrix

#{Theorem}: Any ~m # ~n matrix A is row equivalent to an ~m # ~n reduced echelon matrix.

Proof:

By previous theorem A &cong. C where C is an echelon matrix. It is sufficient to show
that any echelon matrix is row equivalent to a reduced echelon matrix.

The first non-zero element, if any, in row ~i of C is 1; this is in column &mu.(~i). For every other
row ~j with a non-zero element in this column, subtract ~c_{~j,&mu.(~i)} times row ~i from row ~j, so that
the 1 in row ~i becomes the only non-zero element of column &mu.(~i).

Doing this for every non-zero row reduces C to a reduced echelon matrix. (These operations do not
disturb the echelon pattern, since row ~i is zero before column &mu.(~i).)
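
This clearing step can be sketched likewise (illustrative; it assumes the input C is already echelon):

```python
from fractions import Fraction

def echelon_to_reduced(C):
    """Given an echelon matrix C, clear above and below each leading 1:
    subtract c_{j, mu(i)} times row i from every other row j."""
    C = [[Fraction(x) for x in row] for row in C]
    m, n = len(C), len(C[0])
    for i in range(m):
        # column of the leading 1 in row i (None for a zero row)
        p = next((c for c in range(n) if C[i][c] != 0), None)
        if p is None:
            continue
        for j in range(m):
            if j != i and C[j][p] != 0:
                C[j] = [a - C[j][p] * b for a, b in zip(C[j], C[i])]
    return C

R = echelon_to_reduced([[1, 2, 3], [0, 1, 4], [0, 0, 0]])
assert R == [[1, 0, -5], [0, 1, 4], [0, 0, 0]]
```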
