
Let $B$ be a non-degenerate symmetric bilinear form defined on a finite-dimensional vector space $V$. Let $\mathcal{B} = \{v_i\}$ be a basis for $V$. I want to prove that there exists a basis $\mathcal{B}' = \{v'_i\}$ such that $B(v_i, v'_j)=\delta_{ij}$.


My first attempt:

I know that non-degeneracy of the symmetric bilinear form implies $[B]_{\mathcal{B}}$ is invertible, so by considering the inverse matrix and identifying the coordinate transformation $v'_j = \sum_k [B^{-1}]_{jk} v_k$, I can compute \begin{align} B(v_i, v'_j) = B\Big(v_i, \sum_k [B^{-1}]_{jk} v_k\Big) = \sum_k [B]_{ik} [B^{-1}]_{kj} = \delta_{ij}, \end{align} where the last step uses that $B^{-1}$ is symmetric.

Here I find the $v'_j$ explicitly. I would like a proof that avoids the explicit expression.
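As a quick numerical sanity check of the explicit construction, here is a NumPy sketch; the form $B(x,y) = x^T M y$ on $\mathbb{R}^3$ and the basis `V_mat` are made-up illustrative choices, not from the question:

```python
import numpy as np

# Illustrative setup: B(x, y) = x^T M y on R^3, with M symmetric and
# non-degenerate, and an arbitrary basis {v_i} as the columns of V_mat.
M = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
V_mat = np.array([[1., 1., 0.],
                  [0., 1., 1.],
                  [1., 0., 1.]])

# Gram matrix G_{ij} = B(v_i, v_j)
G = V_mat.T @ M @ V_mat

# v'_j = sum_k [B^{-1}]_{jk} v_k : the dual basis vectors are the columns
# of V_mat @ inv(G)  (inv(G) is symmetric, so transposition is immaterial)
V_dual = V_mat @ np.linalg.inv(G)

# Check B(v_i, v'_j) = delta_ij
print(np.allclose(V_mat.T @ M @ V_dual, np.eye(3)))  # True
```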


My second attempt:

Naively, I know that for a linear functional $f \in V^*$, there exists a unique vector $w\in V$ such that $f(v) = B(v,w)$. If I put $f=\sum_i a_{i} v_i^*$ and $v= \sum_j b_j v_j$, then from $v^*_i(v_j) = \delta_{ij}$ I can assert $a_j=f(v_j) = B(v_j, w)$... and I am stuck at this point.

Is there a good proof of this theorem?

phy_math
  • 6,700
  • I think that you are adding an unnecessary burden to yourself by doing it in coordinates... But, yes, I understand the impulse... and, still, after doing it in coordinates, you might think about how to do it without. :) – paul garrett Jun 28 '22 at 22:56

2 Answers


The statement "for each $f \in V^*$ there exists a unique $w \in V$ such that $f = B(\_,w)$" is equivalent to saying that the function $$\begin{align*} B_R \colon V & \longrightarrow V^* \\ w & \longmapsto B(\_,w) \end{align*}$$ is bijective. Since $B_R$ is linear (prove it), it is an isomorphism. (If you prefer not to assume that statement: non-degeneracy of $B$ says exactly that $B_R$ is injective, and an injective linear map $V \to V^*$ is bijective because $\dim V = \dim V^* < \infty$.)

Now, consider the dual basis $\{v_i^*\}_{i \in I}$ of $\{v_i\}_{i \in I}$, and for each $i \in I$, let $v_i' := B_R^{-1}(v_i^*)$. Then $\{v_i'\}_{i \in I}$ is a basis for $V$, and for each $j \in I$ we have that $v_j^* = B_R(v_j') = B(\_,v_j')$, in particular, $$\forall i \in I, \quad \delta_{ij} = v_j^*(v_i) = B(v_i,v_j').$$
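In coordinates, the matrix of $B_R$ with respect to $\{v_i\}$ and the dual basis $\{v_i^*\}$ is the Gram matrix $G_{ij} = B(v_i,v_j)$, so computing $v_j' = B_R^{-1}(v_j^*)$ amounts to solving $G\,w = e_j$. A NumPy sketch (the Gram matrix `G` below is an illustrative choice, not from the answer):

```python
import numpy as np

# Illustrative Gram matrix G_{ij} = B(v_i, v_j) for some basis {v_i}.
G = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])

# In coordinates, B_R sends w to the functional x -> x^T G w, so its matrix
# (from {v_i} to {v_i^*}) is G itself.  v'_j := B_R^{-1}(v_j^*) is obtained
# by solving G w = e_j; the solutions are the columns of G^{-1}.
V_dual = np.linalg.solve(G, np.eye(3))

# Check: B(v_i, v'_j) = e_i^T G (column j of V_dual) = delta_ij
print(np.allclose(G @ V_dual, np.eye(3)))  # True
```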

azif00
  • 23,123

By treating the case of characteristic $2$ separately, you can get to a nicer result via simpler methods like congruence.
Let $G$ be given by $g_{i,j}=B\big(w_i, w_j\big)$, where $\{w_1,\dots, w_n\}$ is some arbitrarily chosen initial basis for $V$.

(i.) Assume $\text{char } \mathbb F\neq 2$
Induct on $n$. We show that $G$ is congruent to a diagonal matrix $D$. Once we have that, each (basis) element of $\mathbf B'$ is simply a re-scaling of a (basis) element of $\mathbf B$. In matrix notation: $\mathbf B:=\bigg[\begin{array}{c|c|c|c} w_1 & w_2 &\cdots & w_{n}\end{array}\bigg]CC^{(2)}$ and $\mathbf B':=\mathbf BD^{-1}$. The base case $n=1$ is obvious.
Inductive case: the symmetric form is non-degenerate, so there is some $v_1 = \sum_{i=1}^n x_i w_i$ that is not self-orthogonal under the form (not isotropic), i.e. $\mathbf x^T G\mathbf x \neq 0$.
The row vector $\mathbf x^T G$ is non-zero, hence has rank one and a nullspace of dimension $n-1$ that does not contain $\mathbf x$. Letting $\big\{\mathbf y_2,\mathbf y_3,\dots,\mathbf y_n\big\}$ be a basis for $\ker \mathbf x^T G$, define $C:=\bigg[\begin{array}{c|c|c|c|c} \mathbf x & \mathbf y_2 &\cdots & \mathbf y_{n}\end{array}\bigg]$, so that $C^TGC = \begin{bmatrix}d_{1,1} &\mathbf {0}\\ \mathbf {0} &G'\end{bmatrix}$

By the induction hypothesis, $G'$ is congruent to a diagonal matrix $D'$ via a matrix $C'$,
so define $C^{(2)}:=\begin{bmatrix}1 &\mathbf {0}\\ \mathbf {0} &C'\end{bmatrix}$ and
$\big(CC^{(2)}\big)^TG\big(CC^{(2)}\big)=D$

which completes the proof.
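The induction in (i) translates directly into a recursive procedure. Below is a rough NumPy sketch over $\mathbb R$; the function name, the tolerance, and the SVD-based nullspace computation are my implementation choices, not part of the answer:

```python
import numpy as np

def congruence_diagonalize(G, tol=1e-12):
    """Return C with C^T @ G @ C diagonal, for a symmetric non-degenerate G
    over R, mirroring the induction: pick a non-isotropic x, split off the
    kernel of x^T G, recurse on the (n-1) x (n-1) block G'."""
    n = G.shape[0]
    if n <= 1:
        return np.eye(n)
    # Find x with x^T G x != 0: a non-zero diagonal entry if one exists,
    # otherwise e_i + e_j where G[i, j] != 0 (then x^T G x = 2 G[i, j]).
    diag = np.abs(np.diag(G))
    if diag.max() > tol:
        x = np.eye(n)[:, int(diag.argmax())]
    else:
        i, j = np.unravel_index(int(np.abs(G).argmax()), G.shape)
        x = np.eye(n)[:, i] + np.eye(n)[:, j]
    # Basis {y_2, ..., y_n} for the nullspace of the row vector x^T G,
    # read off from the SVD.
    row = (x @ G).reshape(1, n)
    Y = np.linalg.svd(row)[2][1:].T
    C = np.column_stack([x, Y])
    B = C.T @ G @ C                    # block diagonal [[d_11, 0], [0, G']]
    C2 = np.eye(n)
    C2[1:, 1:] = congruence_diagonalize(B[1:, 1:], tol)
    return C @ C2
```

Given $D = C^TGC$, the dual basis is then read off exactly as above: re-scale, i.e. $\mathbf B' = \mathbf B D^{-1}$ with $\mathbf B = \big[w_1|\cdots|w_n\big]\,C$.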

(ii.) Assume $\text{char } \mathbb F= 2$
The proof runs similarly, though we need to deal with the possibility of total isotropy, i.e. $B(v,v)=0$ for all $v\in V$ (in characteristic $2$ this makes the form alternating, i.e. skew-symmetric).
What we show here is that $G$ is congruent to $\begin{bmatrix}D_r&\mathbf 0\\ \mathbf 0 &J_{2m}\end{bmatrix}$,
where $J_{2m}$ is the standard symplectic matrix, which is a permutation matrix because $-1=1$ in characteristic $2$. I.e. $\mathbf B'$ is the same as $\mathbf B$ except that the first $r$ vectors are re-scaled and the last $2m$ vectors are permuted. Again proceed via induction; the $n=1$ case is obvious.

Inductive case:
Computational note: we can distinguish between subcases (a) and (b) below by checking the diagonal of $G$: (a) occurs iff there is some non-zero diagonal element, and (b) occurs otherwise.

(a) If there is some non-self-orthogonal vector, re-use the argument from (i) and invoke the induction hypothesis afterward. (The case of odd $n$ always falls under (a); why?)
(b) If the form is totally isotropic on $V$, where $\dim V={2m}$, i.e. $B(v,v)=0$ for all $v\in V$, then it is a skew-symmetric (alternating) form and
$C^TGC = \begin{bmatrix}\mathbf {0}&I_m\\ -I_m &\mathbf 0\end{bmatrix}=\begin{bmatrix}\mathbf {0}&I_m\\ I_m &\mathbf 0\end{bmatrix}=J_{2m}$, since $-1=1$ in characteristic $2$,
which completes the proof.
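Case (b) can be made concrete with a greedy "symplectic Gram-Schmidt" pass; here is a sketch over $\mathbb F_2$ (the function name and the pairing strategy are mine; I assume the alternating Gram matrix is invertible, so every vector has a partner):

```python
import numpy as np

def symplectic_basis_gf2(G):
    """For an invertible alternating Gram matrix G over GF(2) (symmetric,
    zero diagonal), return C with C^T G C = [[0, I],[I, 0]] mod 2."""
    G = np.asarray(G, dtype=np.int64) % 2
    n = G.shape[0]
    pool = [np.eye(n, dtype=np.int64)[:, i] for i in range(n)]
    us, vs = [], []
    while pool:
        u = pool.pop(0)
        # non-degeneracy guarantees a partner v with B(u, v) = 1
        j = next(k for k, w in enumerate(pool) if (u @ G @ w) % 2 == 1)
        v = pool.pop(j)
        us.append(u)
        vs.append(v)
        # make every remaining vector orthogonal to both u and v (mod 2)
        pool = [(w + (w @ G @ v) * u + (w @ G @ u) * v) % 2 for w in pool]
    # columns ordered u_1..u_m, v_1..v_m give the block shape [[0, I],[I, 0]]
    return np.column_stack(us + vs)
```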

user8675309
  • 12,193