Let V_{~n} be an inner product space over F, with basis \{ #{~u}_{~i} \} _ (not necessarily orthogonal).
Define the matrix _ H = ( ~h_{~i ~j} )_{~i ~j} = ( #{~u}_{~i} #. #{~u}_{~j} )_{~i ~j} _ then _ ~h_{~i ~j} = ${~h}_{~j ~i} , _ and so H is hermitian.
Note that if _ #{~{x}} = &sum. &alpha._{~i} #{~u}_{~i} _ and _ #{~{y}} = &sum. &beta._{~i} #{~u}_{~i} _ then
#{~{x}} #. #{~{y}} _ = _ &sum. &sum. ( &alpha._{~i} #{~u}_{~i} ) #. ( &beta._{~j} #{~u}_{~j} ) _ = _ &sum. &sum. &alpha._{~i} ${&beta.}_{~j} #{~u}_{~i} #. #{~u}_{~j} _ = _ &sum. &sum. &alpha._{~i} ${&beta.}_{~j} ~h_{~i ~j} _ = _ &sum. &sum. &alpha._{~i} ~h_{~i ~j} ${&beta.}_{~j} _ = _ #{&alpha.}^TH${#{&beta.}}
where _ #{&alpha.} = ( &alpha._1 ... &alpha._{~n} ) , _ #{&beta.} = ( &beta._1 ... &beta._{~n} ) _ &in. F^{~n}.
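As a numerical sanity check of this identity (a minimal sketch in Python, assuming numpy; the basis and the coefficient vectors below are arbitrary illustrations, not from the text):

    import numpy as np

    rng = np.random.default_rng(1)
    # columns of U form an illustrative (non-orthogonal) basis of C^3
    U = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    H = U.T @ np.conj(U)            # h_ij = u_i . u_j, conjugate-linear in the second slot

    alpha = rng.normal(size=3) + 1j * rng.normal(size=3)
    beta = rng.normal(size=3) + 1j * rng.normal(size=3)
    x, y = U @ alpha, U @ beta      # x = sum_i alpha_i u_i, y = sum_i beta_i u_i

    print(np.allclose(H, np.conj(H).T))                            # H is hermitian: True
    print(np.allclose(x @ np.conj(y), alpha @ H @ np.conj(beta)))  # x . y = alpha^T H beta-bar: True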
Now _ #{~{x}} ≠ 0 &in. V_{~n} _ &iff. _ #{&alpha.} ≠ 0 &in. F^{~n} _ &iff. _ #{&alpha.}^TH${#{&alpha.}} = #{~{x}} #. #{~{x}} > 0
An expression of the form _ #{&alpha.}^TH${#{&alpha.}} , _ #{&alpha.} &in. F^{~n}, H &in. @M_{~n} , _ is known as a _ #{~{quadratic form}}.
A matrix H is said to be _ #{~{positive definite}} _ if _ #{&alpha.}^TH${#{&alpha.}} > 0 , _ &forall. #{&alpha.} ≠ 0 &in. F^{~n}.
So we have shown that, given an ~n-dimensional inner product space and a basis for it, the inner product and basis together define a positive definite hermitian matrix H.
Now let V_{~n} be any ~n-dimensional vector space over F (= &reals. or &complex.). _ Can we impose an inner product on V_{~n}?
#{Theorem}: _ Let _ \{ #{~u}_{~i} \} _ be a basis for V_{~n}. _ If H is a positive definite hermitian matrix _ then _ &exist. (unique) inner product on V_{~n} , _ such that _ #{~u}_{~i} #. #{~u}_{~j} = ~h_{~i ~j}
Before proving the theorem, consider the following lemma:
#{Lemma}: _ Let H be a positive definite hermitian matrix. _ Then &exist. regular matrix P such that H = P*P.
Proof:
Since H is hermitian, by the spectral theorem &exist. unitary matrix U and real diagonal matrix D = diag( &lambda._1 , ... , &lambda._{~n} ) such that H = UDU*.
Each &lambda._{~i} is strictly positive: _ if H#{~v} = &lambda._{~i}#{~v} with #{~v} ≠ 0, then taking #{&alpha.} = ${#{~v}} in the definition of positive definiteness gives _ 0 < #{&alpha.}^TH${#{&alpha.}} = #{~v}*H#{~v} = &lambda._{~i} ( #{~v}*#{~v} ) , _ so &lambda._{~i} > 0.
Let E = diag( &sqrt.&lambda._1 , ... , &sqrt.&lambda._{~n} ) , _ so that D = E^2 = E*E. _ Then _ H = UE*EU* = (EU*)*(EU*) , _ and P = EU* is regular since E and U are.
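Numerically, the construction in this proof can be carried out with numpy (a sketch; the matrix is borrowed from the example further down):

    import numpy as np

    H = np.array([[2., 4., -2.],
                  [4., 9., -3.],
                  [-2., -3., 5.]])

    w, U = np.linalg.eigh(H)    # spectral theorem: H = U diag(w) U*, w real
    E = np.diag(np.sqrt(w))     # defined because every eigenvalue w_i > 0
    P = E @ U.conj().T          # P = EU*, regular since E and U are
    print(np.allclose(P.conj().T @ P, H))   # H = P*P: True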
#{Corollary}: _ Let H be hermitian. _ Then H is positive definite _ &iff. _ all its eigenvalues are strictly positive.
Proof: _ One direction was shown in the proof of the lemma. _ Conversely, if all eigenvalues of H are strictly positive then, as above, H = P*P with P regular; _ for #{&alpha.} ≠ 0 , _ #{~z} = P${#{&alpha.}} ≠ 0 _ and _ #{&alpha.}^TH${#{&alpha.}} = ${#{~z}}^T#{~z} = &sum. | ~z_{~i} |^2 > 0.
Proof of Theorem:
For #{~x} = &sum. &alpha._{~i} #{~u}_{~i} _ and _ #{~y} = &sum. &beta._{~i} #{~u}_{~i} , _ define _ #{~x} #. #{~y} = #{&alpha.}^TH${#{&beta.}}.
This is linear in #{~x} and conjugate-linear in #{~y} by matrix algebra; _ since H is hermitian, _ ${( #{&beta.}^TH${#{&alpha.}} )} = #{&alpha.}^TH${#{&beta.}} , _ i.e. _ ${( #{~y} #. #{~x} )} = #{~x} #. #{~y} ; _ and _ #{~x} #. #{~x} = #{&alpha.}^TH${#{&alpha.}} > 0 _ for #{~x} ≠ 0 , _ since H is positive definite. _ So this is an inner product, and taking #{&alpha.} , #{&beta.} to be standard basis vectors gives _ #{~u}_{~i} #. #{~u}_{~j} = ~h_{~i ~j}.
Uniqueness: _ by the expansion at the start of this section, any inner product with _ #{~u}_{~i} #. #{~u}_{~j} = ~h_{~i ~j} _ must satisfy _ #{~x} #. #{~y} = &sum. &sum. &alpha._{~i} ${&beta.}_{~j} ~h_{~i ~j} = #{&alpha.}^TH${#{&beta.}}.
A metric based on an inner product given by _ #{&alpha.} #. #{&beta.} = #{&alpha.}^T H #{${&beta.}} , _ for some positive definite hermitian matrix H, _ is called a _ #{~{hermitian metric}}.
To decide whether a hermitian matrix H is positive definite, we can examine its eigenvalues, which by the corollary should all be strictly positive. _ By hand this is not easy.
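By machine, though, it is routine (a numpy sketch, using the matrix from the example below):

    import numpy as np

    H = np.array([[2., 4., -2.],
                  [4., 9., -3.],
                  [-2., -3., 5.]])
    print(np.linalg.eigvalsh(H))                    # eigenvalues of a hermitian matrix
    print(bool(np.all(np.linalg.eigvalsh(H) > 0)))  # all strictly positive: True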
Alternatively, try to find #{~y}, _ where _ #{${~y}} = P#{${~x}} _ for some regular matrix P, _ such that _ #{~x}^T H #{${~x}} = #{~y}^T #{${~y}} = &sum. | ~y_{~i} |^2 ; _ since P is regular, #{~y} ≠ 0 whenever #{~x} ≠ 0, so such a factorisation shows that H is positive definite.
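Finding such a P by completing the square, as in the example below, is essentially a Cholesky factorisation, and a failed factorisation attempt gives a practical positive-definiteness test (a numpy sketch; is_positive_definite is a hypothetical helper, not from the text):

    import numpy as np

    def is_positive_definite(H):
        """Hypothetical helper: test a hermitian H by attempting H = L L*."""
        try:
            np.linalg.cholesky(H)   # raises LinAlgError if H is not positive definite
            return True
        except np.linalg.LinAlgError:
            return False

    print(is_positive_definite(np.array([[2., 4., -2.],
                                         [4., 9., -3.],
                                         [-2., -3., 5.]])))   # True
    print(is_positive_definite(np.array([[1., 2.],
                                         [2., 1.]])))         # False (eigenvalues 3 and -1)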
Example:
H _ = _ matrix{ 2, 4, -2/ 4, 9, -3/ -2, -3, 5} , _ _ _ _ _ _ Let _ #{~x} _ = _ matrix{~x_1/~x_2/~x_3}
#{~x}^TH#{~x} _ = _ matrix{~x_1,~x_2,~x_3}matrix{2,4, -2/4,9, -3/ -2, -3,5}matrix{~x_1/~x_2/~x_3}
_ = _ 2~x_1^2 + 4~x_1~x_2 - 2~x_1~x_3 + 4~x_2~x_1 + 9~x_2^2 - 3~x_2~x_3 - 2~x_3~x_1 - 3~x_3~x_2 + 5~x_3^2
_ = _ 2~x_1^2 + 8~x_1~x_2 - 4~x_1~x_3 _ + 9~x_2^2 - 6~x_2~x_3 + 5~x_3^2
_ = _ 2(~x_1 + 2~x_2 - ~x_3)^2 _ - 8~x_2^2 + 8~x_2~x_3 - 2~x_3^2 _ + 9~x_2^2 - 6~x_2~x_3 + 5~x_3^2
_ = _ 2(~x_1 + 2~x_2 - ~x_3)^2 + ~x_2^2 + 2~x_2~x_3 + 3~x_3^2
_ = _ 2(~x_1 + 2~x_2 - ~x_3)^2 + (~x_2 + ~x_3)^2 + 2~x_3^2
_ = _ _ ~y_1^2 + ~y_2^2 + ~y_3^2 _ = _ _ #{~y}^T#{~y}
where _ #{~y} _ = _ matrix{~y_1/~y_2/~y_3} _ = _ matrix{&sqrt.2, 2&sqrt.2, -&sqrt.2 / 0, 1, 1 / 0, 0,&sqrt.2}matrix{~x_1/~x_2/~x_3}
Here P is regular (triangular with nonzero diagonal), so #{~y} ≠ 0 whenever #{~x} ≠ 0, and hence #{~x}^TH#{~x} = ~y_1^2 + ~y_2^2 + ~y_3^2 > 0. _ So H is positive definite.
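Finally, a numerical confirmation of the hand computation (a numpy sketch): P^TP recovers H, and this P is exactly the transpose of numpy's Cholesky factor of H.

    import numpy as np

    H = np.array([[2., 4., -2.],
                  [4., 9., -3.],
                  [-2., -3., 5.]])
    s2 = np.sqrt(2.0)
    P = np.array([[s2, 2*s2, -s2],
                  [0.,   1.,  1.],
                  [0.,   0.,  s2]])

    print(np.allclose(P.T @ P, H))                  # x^T H x = y^T y with y = Px: True
    print(np.allclose(np.linalg.cholesky(H), P.T))  # P^T is the Cholesky factor of H: True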