The following statement is valid over $\mathbb{R}$: if $(a_{ij})$ is symmetric and $a_{ii} > \sum_{j\ne i} |a_{ij}|$ for all $i$, then $(a_{ij})$ is positive definite. Now, an "algebraic" statement that is valid over $\mathbb{R}$ remains valid over any real closed field; the statement here is that $\langle A x, x\rangle > 0$ for all $x \ne 0$. Since any ordered field can be embedded in a real closed field (its real closure), the statement is true over any ordered field.
This is the philosophy... But probably the statement can be proved directly, without all this "meta" stuff.
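To make the transfer step concrete: for each fixed $n$, the claim can be written as a single first-order sentence in the language of ordered fields, for instance
$$\forall (a_{ij})\ \forall (x_i)\ \Bigl[\Bigl(\bigwedge_{i<j} a_{ij}=a_{ji}\Bigr)\wedge\Bigl(\bigwedge_{i}\, a_{ii}>\sum_{j\ne i}|a_{ij}|\Bigr)\wedge\Bigl(\bigvee_{i}\, x_i\ne 0\Bigr)\ \Longrightarrow\ \sum_{i,j} a_{ij}\, x_i x_j>0\Bigr]$$
(the absolute values can be eliminated by a finite case distinction on the signs of the $a_{ij}$, so this really is a formula in $+,\cdot,<$). By Tarski's theorem all real closed fields satisfy the same first-order sentences, and for a matrix and a vector with entries in an ordered field $K$ the hypotheses and the conclusion hold in $K$ if and only if they hold in its real closure, since the inclusion preserves the order.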
$\bf{Added:}$ The dominant diagonal criterion is sharp, as one can see by looking at the eigenvalues of the all-ones matrix $(a_{ij})$ with $a_{ij}=1$ for all $i,j$ (they are $n$ and $0$, the latter with multiplicity $n-1$). But a weaker form of the statement is enough: for instance, take $a_{ii} = 1$ and $|a_{ij}|< \frac{1}{n-1}$ for all $i\ne j$. It is enough to add up all the inequalities:
$$\frac{1}{n-1}\left(x^2_{i} + x^2_{j}\right) + 2 a_{ij} x_i x_j\ge 0$$ for $i<j$, and to note that each inequality is strict as soon as $(x_i, x_j) \ne (0,0)$.
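Carrying out the summation: each $x_i^2$ appears in exactly $n-1$ of these inequalities, each time with coefficient $\frac{1}{n-1}$, so adding them up (and using $a_{ii}=1$) gives
$$\langle A x, x\rangle \;=\; \sum_{i=1}^n x_i^2 + \sum_{i<j} 2\, a_{ij}\, x_i x_j \;>\; 0 \qquad\text{for } x \ne 0,$$
since for $x \ne 0$ at least one of the added inequalities is strict.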
$\bf{Added 2:}$ In fact, the proof that diagonally dominant implies positive definite is quite simple. Just add up all the inequalities
$$ |a_{ij}| x_{i}^2 + |a_{ij}| x_{j}^2+ 2 a_{ij} x_i x_j\ge 0$$ for all $i< j$ and get
$$\sum_{i=1}^n s_i x^2_i + \sum_{i<j} 2 a_{ij} x_i x_j\ge 0$$ where
$$s_i = \sum_{j\ne i} |a_{ij}|$$
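Spelling out the last step: since $\langle A x, x\rangle = \sum_i a_{ii} x_i^2 + \sum_{i<j} 2\, a_{ij}\, x_i x_j$ and $a_{ii} > s_i$ for every $i$, the inequality above gives
$$\langle A x, x\rangle \;=\; \sum_{i=1}^n (a_{ii} - s_i)\, x_i^2 \;+\; \Bigl(\sum_{i=1}^n s_i x^2_i + \sum_{i<j} 2\, a_{ij}\, x_i x_j\Bigr) \;\ge\; \sum_{i=1}^n (a_{ii} - s_i)\, x_i^2 \;>\; 0$$
for $x \ne 0$.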
$\bf{Added 3:}$ Let's also give a purely algebraic proof that a diagonally dominant matrix (by rows: $a_{ii} > \sum_{j\ne i} |a_{ij}|$ for all $i$) has determinant $>0$.
The determinant cannot be $0$: otherwise the system $A x = 0$ would have a non-zero solution $x$, and one gets a contradiction by considering an index $i$ for which $|x_i|$ is largest.
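Explicitly, for such an index $i$ we have $|x_i| > 0$ and $|x_j| \le |x_i|$ for all $j$, so the $i$-th equation of $A x = 0$ gives
$$a_{ii}\,|x_i| \;=\; \Bigl|\sum_{j\ne i} a_{ij}\, x_j\Bigr| \;\le\; \sum_{j\ne i} |a_{ij}|\,|x_j| \;\le\; \Bigl(\sum_{j\ne i} |a_{ij}|\Bigr)|x_i| \;<\; a_{ii}\,|x_i|,$$
a contradiction (only the triangle inequality for the absolute value is used, and it holds in any ordered field).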
Next, deform the matrix into one whose determinant is already known to be positive, while preserving dominance. The usual proof uses the intermediate value property for polynomials; we will only use that property for polynomials of degree $1$, where it is valid over every ordered field.
For this, consider for $t\in [0,1]$ the matrix $A_t$ that differs from $A$ only in its first row, which is replaced by $(a_{11}, t a_{12}, \ldots, t a_{1n})$. We have
$$\det A_t = (1-t)\, a_{11} \det A' + t \det A$$ where $A'$ is the matrix $(a_{ij})_{2 \le i,j\le n}$; this follows from the linearity of the determinant in the first row, since the first row of $A_t$ equals $(1-t)(a_{11}, 0, \ldots, 0) + t\,(a_{11}, a_{12}, \ldots, a_{1n})$. So we can argue by induction on $n$. The case $n=1$ is trivial. Assume the statement for $n-1$; then $\det A' > 0$, and therefore $\det A_0 = a_{11} \det A' > 0$. Moreover $\det A_t \ne 0$ for all $t \in [0,1]$, because each $A_t$ is still dominant (for $t \in [0,1]$ the off-diagonal entries of the first row only shrink in absolute value) and, as shown above, a dominant matrix has non-zero determinant. Since $t \mapsto \det A_t$ is a polynomial of degree at most $1$, we conclude $\det A_t > 0$ for all $t \in [0,1]$, and in particular $\det A_1 = \det A > 0$.
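For completeness, here is the degree-$1$ intermediate value step in explicit form, valid over any ordered field: writing $\det A_t = (1-t)\det A_0 + t \det A_1$, if we had $\det A_1 < 0$ while $\det A_0 > 0$, then
$$t_0 = \frac{\det A_0}{\det A_0 - \det A_1} \in (0,1), \qquad \det A_{t_0} = 0,$$
contradicting $\det A_t \ne 0$ on $[0,1]$. Hence $\det A_1 \ge 0$, and since $\det A_1 = \det A \ne 0$, we get $\det A > 0$.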