By carving out the special case of characteristic 2, you can get to a nicer result via simpler methods like matrix congruence.
Let $G$ be given by $g_{i,j}=B\big(w_i, w_j\big)$, where $\{w_1,\dots, w_n\}$ is some arbitrarily chosen initial basis for $V$.
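(Illustration only, not part of the argument.) A minimal sympy sketch of this setup; the symmetric matrix $A$ representing the form and the basis matrix $W$ are my own hypothetical choices, with $B(u,v)=u^TAv$ so that $G=W^TAW$:

```python
from sympy import Matrix

# Hypothetical setup (not from the original question): the form is B(u, v) = u^T A v
# for a symmetric A, and the chosen basis vectors w_1, ..., w_n are the columns of W.
A = Matrix([[2, 1, 0],
            [1, 0, 3],
            [0, 3, 1]])
W = Matrix([[1, 1, 0],
            [0, 1, 1],
            [0, 0, 1]])

B = lambda u, v: (u.T * A * v)[0, 0]

# G with entries g_{i,j} = B(w_i, w_j); equivalently G = W^T A W, and G is symmetric.
n = W.shape[1]
G = Matrix(n, n, lambda i, j: B(W.col(i), W.col(j)))
assert G == W.T * A * W and G.T == G
```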
(i.) Assume $\text{char } \mathbb F\neq 2$
Induct on $n$. We show that $G$ is congruent to a diagonal matrix $D$. Once we have that, each (basis) element of $\mathbf B'$ is simply a re-scaling of a (basis) element of $\mathbf B$. In matrix notation: $\mathbf B:=\bigg[\begin{array}{c|c|c|c} w_1 & w_2 &\cdots & w_{n}\end{array}\bigg]CC^{(2)}$ and $\mathbf B':=\mathbf BD^{-1}$, where $C$ and $C^{(2)}$ are the invertible matrices constructed in the inductive step below, satisfying $\big(CC^{(2)}\big)^TG\big(CC^{(2)}\big)=D$. The base case of $n=1$ is obvious.
Inductive case: Since the form is non-degenerate and $\text{char } \mathbb F\neq 2$, there is some $v_1 = \sum_{k=1}^n x_k\, w_k$ that is not self-orthogonal under the form (not isotropic): if every vector were isotropic, then $0=B(u+w,u+w)=2\,B(u,w)$ for all $u,w$, forcing $B=0$ and contradicting non-degeneracy. Writing $\mathbf x:=(x_1,\dots,x_n)^T$, this is equivalent to $\mathbf x^TG\mathbf x\neq 0$; in particular the row vector $\mathbf x^T G$ is non-zero, so it has rank one, hence a nullspace of dim $n-1$, and that nullspace does not include $\mathbf x$. Letting $\big\{\mathbf y_2,\mathbf y_3,...,\mathbf y_n\big\}$ be a basis for $\ker \mathbf x^T G$, define $C:=\bigg[\begin{array}{c|c|c|c|c} \mathbf x & \mathbf y_2 &\cdots & \mathbf y_{n}\end{array}\bigg]$ (invertible, precisely because $\mathbf x\notin \ker \mathbf x^T G$), and by symmetry of $B$, $C^TGC = \begin{bmatrix}d_{1,1} &\mathbf {0}\\ \mathbf {0} &G'\end{bmatrix}$
By the induction hypothesis (which applies because $C^TGC$ is invertible, hence so is $G'$), $G'$ is congruent to a diagonal matrix $D'$ via an invertible matrix $C'$, i.e. $C'^{\,T}G'C'=D'$,
so define $C^{(2)}:=\begin{bmatrix}1 &\mathbf {0}\\ \mathbf {0} &C'\end{bmatrix}$ and
$\big(CC^{(2)}\big)^TG\big(CC^{(2)}\big)=\begin{bmatrix}d_{1,1} &\mathbf {0}\\ \mathbf {0} &D'\end{bmatrix}=:D$
which completes the proof.
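(Illustration only.) Here is a minimal computational sketch of the recursion in (i), using sympy for exact rational arithmetic (characteristic $0$, in particular $\neq 2$); the function name `diagonalize_by_congruence` and the example matrix are my own, not from the original question:

```python
from sympy import Matrix, eye

def diagonalize_by_congruence(G):
    """Return an invertible P with P.T * G * P diagonal.

    Assumes G is symmetric over a field of characteristic != 2
    (here: exact rationals via sympy). Mirrors the inductive step above.
    """
    n = G.shape[0]
    if n <= 1 or all(entry == 0 for entry in G):
        return eye(n)

    # Find x with x^T G x != 0 (a non-isotropic vector).
    x = None
    for i in range(n):                       # a nonzero diagonal entry gives one immediately
        if G[i, i] != 0:
            x = eye(n).col(i)
            break
    if x is None:                            # all diagonal entries vanish: take e_i + e_j with G[i,j] != 0,
        i, j = next((i, j) for i in range(n) for j in range(i + 1, n) if G[i, j] != 0)
        x = eye(n).col(i) + eye(n).col(j)    # then x^T G x = 2*G[i,j] != 0 because char != 2

    # Complete x by a basis of ker(x^T G) to get the congruence matrix C.
    C = Matrix.hstack(x, *(x.T * G).nullspace())

    # C^T G C = [[d, 0], [0, G']]; recurse on G'.
    CGC = C.T * G * C
    C2 = eye(n)
    C2[1:, 1:] = diagonalize_by_congruence(CGC[1:, 1:])
    return C * C2

# Hypothetical example (not from the original question).
G = Matrix([[0, 1, 2],
            [1, 0, 3],
            [2, 3, 4]])
P = diagonalize_by_congruence(G)
assert (P.T * G * P).is_diagonal()
```

The fallback choice $x=e_i+e_j$ is exactly where characteristic $\neq 2$ enters: there $x^TGx=2\,G_{i,j}\neq 0$.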
(ii.) Assume $\text{char } \mathbb F= 2$
The proof runs similarly, though we need to deal with the possibility of total isotropy, i.e. $B(v,v)=0$ for all $v\in V$, which makes the form alternating (hence skew-symmetric).
What we show here is that $G$ is congruent to $\begin{bmatrix}D_r&\mathbf 0\\ \mathbf 0 &J_{2m}\end{bmatrix}$
where $D_r$ is an invertible $r\times r$ diagonal matrix and $J_{2m}$ is the standard symplectic matrix, which here is a permutation matrix because $-1=1$ in characteristic 2. I.e. $\mathbf B'$ is the same as $\mathbf B$ except that the first $r$ vectors are re-scaled and the last $2m$ vectors are permuted. Again proceed via induction; the $n=1$ case is obvious.
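(Illustration only.) A quick sanity check of the claim about $J_{2m}$: in characteristic 2 it is the block permutation matrix $\begin{bmatrix}\mathbf 0&I_m\\ I_m&\mathbf 0\end{bmatrix}$, it is its own inverse, and right-multiplying a matrix of basis columns by it just swaps the two blocks of $m$ columns; the numbers below are a made-up toy example:

```python
import numpy as np

m = 2
I, Z = np.eye(m, dtype=int), np.zeros((m, m), dtype=int)
J = np.block([[Z, I],
              [I, Z]])                       # J_{2m} once -I_m = I_m (characteristic 2)

assert np.array_equal((J @ J) % 2, np.eye(2 * m, dtype=int))    # J is its own inverse mod 2

# Right-multiplication by J permutes columns: the two blocks of m columns swap places.
Bmat = np.arange(1, 3 * 2 * m + 1).reshape(3, 2 * m)            # any 3 x 2m matrix of "basis columns"
assert np.array_equal(Bmat @ J, np.hstack([Bmat[:, m:], Bmat[:, :m]]))
```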
Inductive case:
Computational note: we can tell cases (a) and (b) below apart by checking the diagonal of $G$: (a) occurs iff there is some non-zero diagonal element, and (b) occurs otherwise, since in characteristic 2 the cross terms drop out and $B(v,v)=\sum_i x_i^2\, g_{i,i}$ for $v=\sum_i x_i w_i$.
(a) if there is some non-self-orthogonal vector, re-use the construction of $C$ from (i) and call on the induction hypothesis afterward. (The case of odd $n$ always falls under (a); why?)
(b) if the form is totally isotropic on our vector space $V$, i.e. $B(v,v)=0$ for all $v\in V$, then it is a non-degenerate alternating (skew-symmetric) form. This forces $\dim V$ to be even, say $\dim V={2m}$, and the standard hyperbolic-pair (symplectic basis) construction, sketched in code after the proof, yields an invertible $C$ with
$C^TGC = \begin{bmatrix}\mathbf {0}&I_m\\ -I_m &\mathbf 0\end{bmatrix}=\begin{bmatrix}\mathbf {0}&I_m\\ I_m &\mathbf 0\end{bmatrix}=J_{2m}$
(the middle equality holds because $-1=1$ in characteristic 2),
which completes the proof.
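(Illustration only.) Case (b) appeals to the hyperbolic-pair construction of $C$; below is one way it can be carried out over $\mathbb F_2$ with numpy, repeatedly pairing a vector $u$ with some $v$ satisfying $B(u,v)=1$ and then clearing the remaining vectors against that pair. The function name and the example Gram matrix are my own:

```python
import numpy as np

def symplectic_basis_gf2(G):
    """Given the Gram matrix (mod 2) of a non-degenerate alternating form
    (symmetric, zero diagonal, invertible over GF(2)), return C with
    (C^T G C) mod 2 = J_{2m} = [[0, I_m], [I_m, 0]]."""
    G = np.asarray(G, dtype=int) % 2
    n = G.shape[0]
    B = lambda a, b: int(a @ G @ b) % 2                      # the form in the original coordinates

    pool = [np.eye(n, dtype=int)[:, i] for i in range(n)]    # vectors not yet paired off
    us, vs = [], []
    while pool:
        u = pool.pop(0)
        # Non-degeneracy guarantees u pairs with some remaining vector.
        j = next(j for j, w in enumerate(pool) if B(u, w) == 1)
        v = pool.pop(j)
        us.append(u)
        vs.append(v)
        # Make every remaining vector orthogonal to the hyperbolic pair (u, v); note -1 = 1 mod 2.
        pool = [(w + B(w, v) * u + B(w, u) * v) % 2 for w in pool]
    return np.column_stack(us + vs)                          # columns: u_1..u_m, v_1..v_m

# Hypothetical example (not from the original question): zero diagonal, ones elsewhere,
# which is invertible mod 2 for n = 4, so we are in case (b) by the computational note.
G = np.ones((4, 4), dtype=int) - np.eye(4, dtype=int)
assert all(G[i, i] % 2 == 0 for i in range(4))               # the case (a)/(b) diagonal test

C = symplectic_basis_gf2(G)
m = 2
J = np.block([[np.zeros((m, m), dtype=int), np.eye(m, dtype=int)],
              [np.eye(m, dtype=int), np.zeros((m, m), dtype=int)]])
assert np.array_equal((C.T @ G @ C) % 2, J)
```

Combined with the case (a) step (which is just the construction from (i)), this yields the block form $\begin{bmatrix}D_r&\mathbf 0\\ \mathbf 0&J_{2m}\end{bmatrix}$ claimed above.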