I want to show that the Vandermonde matrix $A:=\begin{pmatrix}1&x_0&\cdots&x_0^d\\1&x_1&\cdots&x_1^d\\\vdots&\vdots&\ddots&\vdots\\1&x_d&\cdots&x_d^d\end{pmatrix} \in \mathbb{R}^{(d+1)\times (d+1)}$ is invertible when $x_0, \dots, x_d \in \mathbb{R}$ are pairwise distinct. We have not covered determinants yet, so I cannot use them.
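(Not part of the proof, but as a quick numerical sanity check I convinced myself the claim is plausible. The nodes below are arbitrary values I made up, and NumPy's `vander`/`matrix_rank` are only used for illustration.)

```python
import numpy as np

# Arbitrary pairwise distinct nodes x_0, ..., x_d (here d = 3)
x = np.array([0.0, 1.0, 2.5, -1.0])
d = len(x) - 1

# Vandermonde matrix with rows (1, x_i, x_i^2, ..., x_i^d)
A = np.vander(x, N=d + 1, increasing=True)

# Full numerical rank d + 1 = 4 indicates that A is invertible
print(np.linalg.matrix_rank(A))  # prints 4
```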
My idea was to use Gaussian elimination:
$$\begin{align*}\begin{pmatrix}1&x_0& x_0^2 &\cdots&x_0^d\\1&x_1& x_1^2 &\cdots&x_1^d\\\vdots&\vdots&\vdots&\ddots&\vdots\\1&x_d& x_d^2 &\cdots&x_d^d\end{pmatrix} &\longmapsto \begin{pmatrix}1&x_0& x_0^2 &\cdots&x_0^d\\0&x_1-x_0& x_1^2-x_0^2 &\cdots&x_1^d-x_0^d\\\vdots&\vdots&\vdots&\ddots&\vdots\\0&x_d-x_0& x_d^2-x_0^2 &\cdots&x_d^d-x_0^d\end{pmatrix} \\ &\longmapsto \begin{pmatrix}1&x_0& x_0^2 &\cdots&x_0^d\\0&x_1-x_0& x_1^2-x_0^2 &\cdots&x_1^d-x_0^d\\ 0 & 0 & x_2^2-x_1^2 & \cdots & x_2^d-x_1^d\\\vdots&\vdots&\vdots&\ddots&\vdots\\0&0& x_d^2-x_1^2 &\cdots&x_d^d-x_1^d\end{pmatrix} \\ &\longmapsto \ldots \end{align*}$$
After the first transformation we can see that $x_1, \dots, x_d$ must all be different from $x_0$: if some $x_i = x_0$, the corresponding row becomes a zero row, the row echelon form does not have full rank, and $A$ is not invertible. The same argument applies to the subsequent elimination steps.
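(Again just a throwaway NumPy sketch with made-up values, not part of the proof: if two nodes coincide, the first elimination step produces exactly the zero row described above and the rank drops.)

```python
import numpy as np

# Repeated node: x_1 = x_0, so rows 0 and 1 of A are identical
x = np.array([0.0, 0.0, 2.5, -1.0])
A = np.vander(x, increasing=True)

A[1:] -= A[0]                    # first elimination step: subtract row 0 from the others
print(A[1])                      # the zero row [0. 0. 0. 0.]
print(np.linalg.matrix_rank(A))  # 3 < 4, so A is not invertible
```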
Is my proof correct?