If #{~u}, #{~v} &in. F^{~n} (~n # 1 column vectors) then the #{~{standard inner product}} (~{SIP}) is defined _ #{~u}#.#{~v} _ = _ #{~u}^T${#{~v}}.
(The transpose of #{~u} times the complex conjugate of #{~v}.) _ Note that if F = &reals. then _ #{~u}#.#{~v} _ = _ #{~u}^T#{~v}.
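As an illustrative sketch (the helper name `sip` is ours, not part of the text), the SIP can be computed in plain Python, with complex conjugation handled by the built-in `conjugate()` method:

```python
# Standard inner product on F^n: u . v = u^T times the complex conjugate
# of v. Vectors are plain Python lists of (possibly complex) numbers.

def sip(u, v):
    """Standard inner product: sum of u_i * conjugate(v_i)."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

u = [1, 1j]
v = [1j, 1]
print(sip(u, v))   # 0j     : u and v are orthogonal under the SIP
print(sip(u, u))   # (2+0j) : the squared norm of u
```

For F = &reals. the conjugate is a no-op, so `sip` reduces to the ordinary dot product.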
The #{~{natural basis}} for F^{~n} is the set _ \{#{~e}_{~i}\} _ (1 &le. ~i &le. ~n) _ where #{~e}_{~i} = (0 ... 0, 1,0 ... 0)^T _ (1 in the ~i^{th} position).
With respect to the SIP, the natural basis is an orthonormal set.
Consider an ~m # ~n matrix A and an ~n # ~p matrix B. These can be written in terms of their row and column vectors respectively:
A _ = _ matrix{#{~a}_1^T/:/#{~a}_{~m}^T} _ _ _ _ _ _ _ _ _ _ B _ = _ matrix{#{~b}_1, ... ,#{~b}_{~p}}
where _ #{~a}_{~i} , #{~b}_{~j} &in. F^{~n} .
Then _ A$B _ = _ ( #{~a}_{~i}^T #{~b}_{~j}^c )_{~i=1 ... ~m, _ ~j=1 ... ~p} _ = _ ( #{~a}_{~i} #. #{~b}_{~j} )_{~i, ~j}
If A and B are real matrices then _ AB _ = _ ( #{~a}_{~i}^T #{~b}_{~j} )_{~i, ~j} _ = _ ( #{~a}_{~i} #. #{~b}_{~j} )_{~i, ~j}
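The entry-wise description of the real product can be checked directly; a minimal sketch with nested lists (helper names `dot` and `matmul` are ours):

```python
# For real A, B: (AB)_{ij} is the dot product of row i of A with
# column j of B, matching the row/column decomposition above.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matmul(A, B):
    cols_B = list(zip(*B))                      # columns of B as tuples
    return [[dot(row, col) for col in cols_B] for row in A]

A = [[1, 2, 3],
     [4, 5, 6]]        # 2 x 3
B = [[1, 0],
     [0, 1],
     [1, 1]]           # 3 x 2
print(matmul(A, B))    # [[4, 5], [10, 11]]
```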
Let A be an ~n # ~n matrix; the _ #{~{(hermitian) adjoint}} of A is _ A* _ = _ $A ^T _ (the transpose of the complex conjugate of A).
Note that _ (A*)* _ = _ A.
If A is a real matrix then _ A* _ = _ A^T.
An ~n # ~n matrix U is said to be #{~{unitary}} if _ U* U _ = _ U U* _ = _ I_{~n}.
[Note that _ U U* _ = _ #( #{~u}_{~i} #. #{~u}_{~j} #)_{~i, ~j} , _ where #{~u}_{~i} is the ~i^{th} row vector of U.]
If an ~n # ~n real matrix V is unitary then it is also called an #{~{orthonormal}} matrix, _ in which case _ V^T V _ = _ V V^T _ = _ I_{~n}.
Note that _ V V^T _ = _ #( #{~v}_{~i} #. #{~v}_{~j} #)_{~i, ~j} , _ where #{~v}_{~i} is the ~i^{th} row vector of V, so we have that #{~v}_{~i} #. #{~v}_{~j} _ = _ &delta._{~i ~j} _ (Kronecker delta). So the row vectors of V form an orthonormal set. _ (An equivalent result holds for the column vectors of V.)
In fact this result generalizes to any unitary matrix (see "Orthonormal Basis" below).
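A numerical illustration of this (the 2 # 2 complex matrix U below is one particular unitary matrix chosen for the sketch; `sip` is our helper for the standard inner product):

```python
# Check that the rows and the columns of a unitary matrix form
# orthonormal sets under the SIP.

def sip(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

s = 2 ** -0.5
U = [[s, s * 1j],
     [s * 1j, s]]                       # satisfies U* U = U U* = I_2

for vectors in (U, list(zip(*U))):      # rows of U, then columns of U
    for i in range(2):
        for j in range(2):
            expected = 1 if i == j else 0       # delta_ij
            assert abs(sip(vectors[i], vectors[j]) - expected) < 1e-12
print("rows and columns of U form orthonormal sets")
```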
Sometimes an ~n # ~n real unitary matrix V is called an orthogonal matrix. _ Strictly speaking however V is _ #{~{orthogonal}} _ if _ V^T V _ = _ V V^T _ = _ D, _ where D is any (real valued) ~n # ~n diagonal matrix, _ in which case the row (or column) vectors of V form an orthogonal set.
The column vectors of any unitary matrix form an orthonormal basis for F^{~n} _ (with respect to the SIP).
Conversely if _ \{#{~u}_{~i}\} _ (1 &le. ~i &le. ~n) _ is an orthonormal basis for F^{~n} _ (with respect to the SIP), _ then the matrix _ U _ = _ #( #{~u}_1 ... #{~u}_{~n} #) _ is unitary.
Proof:
If U is unitary then _ U* U = I_{~n} ,  but
U* U _ = _ matrix{(#{~u}_1^c)^T/:/(#{~u}_{~n}^c)^T} #( #{~u}_1, ... ,#{~u}_{~n} #) _ = _ #( (#{~u}_{~i}^c)^T#{~u}_{~j} #)_{~i , ~j} _ = _ #( (#{~u}_{~i} #. #{~u}_{~j} )^c #)_{~i , ~j}
So U* U = I_{~n} _ &iff. _ (#{~u}_{~i} #. #{~u}_{~j} )^c _ = _ &delta._{~i ~j} _ (Kronecker delta), _ _ &iff. _ #{~u}_{~i} #. #{~u}_{~j} _ = _ &delta._{~i ~j} _ (as the &delta._{~i ~j} are all real), _ &iff. _ \{ #{~u}_{~i} \} is orthonormal.
If U and V are ~n # ~n unitary matrices, then U^T , $U , $U ^T [ = U*] and UV are also unitary.
Proof:
((U^T)^c)^T U^T _ = _ ($U ^T)^T U^T _ = _ (U $U ^T)^T _ = _ (I_{~n})^T _ = _ I_{~n} _ etc.
Let V be any inner product space (not necessarily finite-dimensional) over F, and let S be an ~n-dimensional subspace of V; _ then we can construct an orthonormal basis for S (the #{~{Gram-Schmidt}} process).
In fact if _ #{~u}_1 ... #{~u}_{~n} _ is any basis for S, put
#{~v}_1 _ = _ #{~u}_1 , _ _ _ _ _ _ _ _ _ _ #{~v}_{~i} _ = _ #{~u}_{~i} _ - _ sum{fract{( #{~u}_{~i} #. #{~v}_{~j} ),|| #{~v}_{~j} ||^2} #{~v}_{~j},~j = 1,~i &minus. 1}
Then _ #{~v}_1 ... #{~v}_{~n} _ are orthogonal.
Proof of orthogonality:
By induction on ~i. Suppose #{~v}_1 ... #{~v}_{~i &minus. 1} are pairwise orthogonal (trivially so for ~i = 1). Then for any ~k < ~i
#{~v}_{~i} #. #{~v}_{~k} _ = _ #{~u}_{~i} #. #{~v}_{~k} _ &minus. _ sum{fract{( #{~u}_{~i} #. #{~v}_{~j} ),|| #{~v}_{~j} ||^2} ( #{~v}_{~j} #. #{~v}_{~k} ),~j = 1,~i &minus. 1} _ = _ #{~u}_{~i} #. #{~v}_{~k} _ &minus. _ fract{( #{~u}_{~i} #. #{~v}_{~k} ),|| #{~v}_{~k} ||^2} || #{~v}_{~k} ||^2 _ = _ 0
since _ #{~v}_{~j} #. #{~v}_{~k} = 0 _ for ~j &ne. ~k _ and _ #{~v}_{~k} #. #{~v}_{~k} = || #{~v}_{~k} ||^2 .
So #{~v}_1 ... #{~v}_{~n} _ are orthogonal. Now let
#{~w}_{~i} _ = _ fract{ #{~v}_{~i} ,|| #{~v}_{~i} ||}
Then _ #{~w}_1 ... #{~w}_{~n} _ is an orthonormal basis for S.
#{Example:}
Let S be the subspace of &reals.^4 generated by (1,1,0,0)^T , (1,&minus.1,1,1)^T , (&minus.1,0,2,1)^T _ [Note that these are linearly independent and therefore form a basis for S].
#{~v}_1 _ = _ (1,1,0,0)^T
#{~v}_2 _ = _ (1,&minus.1,1,1)^T &minus. (0/2)(1,1,0,0)^T _ = _ (1,&minus.1,1,1)^T
#{~v}_3 _ = _ (&minus.1,0,2,1)^T &minus. (2/4)(1,&minus.1,1,1)^T &minus. (&minus.1/2)(1,1,0,0)^T _ = _ (&minus.1, 1, 3/2, 1/2)^T
Normalizing:
|| #{~v}_1 ||^2 _ = _ 2 _ _ _ _ _ _ => _ _ _ _ _ _ _ #{~w}_1 _ = _ (1/&sqrt.2, 1/&sqrt.2, 0, 0)^T
|| #{~v}_2 ||^2 _ = _ 4 _ _ _ _ _ _ => _ _ _ _ _ _ _ #{~w}_2 _ = _ (1/2, &minus.1/2, 1/2, 1/2)^T
|| #{~v}_3 ||^2 _ = _ 9/2 _ _ _ _ _ _ => _ _ _ _ _ _ _ #{~w}_3 _ = _ (&minus.&sqrt.2/3, &sqrt.2/3, 1/&sqrt.2, 1/(3&sqrt.2) )^T
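The whole construction can be sketched in Python and checked against this example (function names are ours; for real vectors the SIP is the ordinary dot product):

```python
# Gram-Schmidt as above: v_i = u_i - sum_j ((u_i . v_j)/||v_j||^2) v_j,
# then normalize: w_i = v_i / ||v_i||.
from math import sqrt

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))     # real case of the SIP

def gram_schmidt(basis):
    vs = []
    for u in basis:
        v = list(u)
        for p in vs:                            # subtract projections onto
            c = dot(u, p) / dot(p, p)           # the earlier v_j
            v = [vi - c * pi for vi, pi in zip(v, p)]
        vs.append(v)
    return [[x / sqrt(dot(v, v)) for x in v] for v in vs]

basis = [[1, 1, 0, 0], [1, -1, 1, 1], [-1, 0, 2, 1]]
w1, w2, w3 = gram_schmidt(basis)
# w1 = (1/sqrt(2), 1/sqrt(2), 0, 0),  w2 = (1/2, -1/2, 1/2, 1/2),
# w3 = (-sqrt(2)/3, sqrt(2)/3, 1/sqrt(2), 1/(3*sqrt(2)))
```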
If _ #{~u}_1 ... #{~u}_{~n} _ is any orthonormal basis for V_{~n}, _ then for any #{~v} &in. V_{~n} , _ #{~v} = &sum._{~i} ( #{~v} #. #{~u}_{~i} ) #{~u}_{~i}
[ #{~v} = &sum._{~i} λ_{~i} #{~u}_{~i} _ then _ #{~v} #. #{~u}_{~j} _ = _ &sum._{~i} λ_{~i} #{~u}_{~i} #. #{~u}_{~j} _ = _ λ_{~j} ]
Let _ #{~u}_1 ... #{~u}_{~n} _ be any orthonormal basis for V_{~n}, _ and consider the linear map _ &pi.#: V_{~n} &hdash.&rightarrow. F^{~n} _   &pi.( &sum._{~i} λ_{~i} #{~u}_{~i} ) = ( λ_1 ... λ_{~n} )^T   - i.e. &pi. maps an element of V_{~n} onto the vector of its co-ordinates in F^{~n} _ (relative to the given basis).
Then for any two elements _ #{~v} , #{~w} &in. V_{~n} , _ #{~v} #. #{~w} = &pi.( #{~v} ) #. &pi.( #{~w} ) , _ since
#{~v} #. #{~w} _ = _ ( &sum._{~i} λ_{~i} #{~u}_{~i} ) #. ( &sum._{~j} &mu._{~j} #{~u}_{~j} ) _ = _ &sum._{~i} &sum._{~j} _ λ_{~i} ( &mu._{~j} )^c #{~u}_{~i} #. #{~u}_{~j} _ = _ &sum._{~i} _ λ_{~i} ( &mu._{~i} )^c _ = _ &pi.( #{~v} ) #. &pi.( #{~w} ) _ (standard inner product)
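A quick numerical illustration in the real case (the orthonormal basis below is one particular choice; `sip` is our helper):

```python
# The coordinate map pi: v = sum lambda_i u_i -> (lambda_1 .. lambda_n)^T
# preserves the SIP. Relative to an orthonormal basis, lambda_i = v . u_i.

def sip(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

s = 2 ** -0.5
basis = [[s, s], [s, -s]]             # an orthonormal basis for R^2

v = [3, 1]
w = [2, 5]
pi_v = [sip(v, u) for u in basis]     # co-ordinates of v
pi_w = [sip(w, u) for u in basis]     # co-ordinates of w

assert abs(sip(v, w) - sip(pi_v, pi_w)) < 1e-12   # v.w == pi(v).pi(w)
print(sip(v, w))   # 11
```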
In particular orthogonal (orthonormal) sets of vectors in V_{~n} map onto orthogonal (orthonormal) sets in F^{~n}. Thus we can henceforth confine our attention to F^{~n}; _ any result proved there holds in any ~n-dimensional inner product space V_{~n} over F.
Let &phi.#: F^{~n} &hdash.&rightarrow. F^{~n} be a linear map with associated matrix A relative to the natural basis _ #{~e}_1 ... #{~e}_{~n}.
So _ &phi.( #{~e}_{~i} ) = &sum._{~j} ~a_{~j ~i} #{~e}_{~j} _ and _ &phi.( #{λ} ) = ( &sum._{~j} ~a_{~i ~j} λ_{~j} )_{~i} = A #{λ} .
The linear map _ &phi.*#: F^{~n} &hdash.&rightarrow. F^{~n} _ is said to be _ #{~{adjoint}} _ to &phi. _ if _ &phi.#{~u} #. #{~v} = #{~u} #. &phi.*#{~v} , _ &forall. #{~u} , #{~v} &in. F^{~n}.
The adjoint of a linear map is unique. For suppose _ &phi.#{~u} #. #{~v} = #{~u} #. &gamma.#{~v} _ and _ &phi.#{~u} #. #{~v} = #{~u} #. &psi.#{~v} _ &forall. #{~u} , #{~v}.
Then #{~u} #. &gamma.#{~v} = #{~u} #. &psi.#{~v} , _ or _ #{~u} #. ( &gamma.#{~v} &minus. &psi.#{~v} ) = 0 . _ In particular, putting _ #{~u} = &gamma.#{~v} &minus. &psi.#{~v} _ we get
#{~u} #. #{~u} = 0 _ &imply. _ #{~u} = #0 _ &imply. _ &gamma.#{~v} &minus. &psi.#{~v} = #0 , _ &forall. #{~v} _ &imply. _ &gamma. = &psi.
Let &phi. , &phi.* have associated matrices A , B relative to the natural basis. _ Then B is the hermitian adjoint of A, _ i.e. _ B = A* = $A ^T
If _ &phi.#{~u} #. #{~v} = #{~u} #. &phi.*#{~v} _ then _ A#{~u} #. #{~v} = #{~u} #. B#{~v} _ i.e.
_
&sum._{~i} ( &sum._{~j} ~a_{~i ~j} ~u_{~j} ) ( ~v_{~i} )^c _ = _ &sum._{~j} ~u_{~j} ( &sum._{~i} ~b_{~j ~i} ~v_{~i} )^c
so
&sum._{~i ~j} ~a_{~i ~j} ( ~v_{~i} )^c ~u_{~j} _ = _ &sum._{~i ~j} ( ~b_{~j ~i} )^c ( ~v_{~i} )^c ~u_{~j} _ _ _ _ => _ _ _ _ ~a_{~i ~j} _ = _ ( ~b_{~j ~i} )^c
So _ A = $B ^T = B* _ or _ B = A * .
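A sketch verifying the defining identity numerically for one small complex matrix (helpers `sip`, `matvec`, and `adjoint` are ours):

```python
# Adjoint identity: Au . v = u . A*v, where A* is the conjugate transpose.

def sip(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def adjoint(A):
    rows, cols = len(A), len(A[0])
    return [[A[j][i].conjugate() for j in range(rows)] for i in range(cols)]

A = [[1 + 2j, 3],
     [0, 4 - 1j]]
u = [1, 1j]
v = [2 - 1j, 1]

lhs = sip(matvec(A, u), v)            # Au . v
rhs = sip(u, matvec(adjoint(A), v))   # u . A*v
assert abs(lhs - rhs) < 1e-12
print(lhs)   # (-2+15j)
```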
The linear map &phi. is said to be _ #{~{self-adjoint}} _ if _ &phi.* = &phi. , _ i.e. _ &phi.#{~u} #. #{~v} = #{~u} #. &phi.#{~v} , _ &forall. #{~u} , #{~v} &in. F^{~n}.
If &phi. has associated matrix A relative to the natural basis, and &phi. is self-adjoint, _ then _ A = A* , _ in which case A is said to be a _ #{~{hermitian}} _ matrix.
If a real matrix A is hermitian then A = A^T , _ and A is called a _ #{~{symmetric}} _ matrix.