(Answer for the case $n=2$)
Put $ s = \dim \mathbb{C}[x, y]_d$ and let $0<a_1 < a_2 < \cdots < a_s$ be positive real numbers. Consider $L_i=(x+a_i y)^d$ for $1 \leq i \leq s$. We can show that $L_1, \cdots, L_s$ are linearly independent, as follows.
Observe that $L_i = \sum_{r=0}^d \binom{d}{r} a_i^r x^{d-r} y^r$. So it suffices to show that the determinant of the following matrix is nonzero (note that here $s=d+1$):
$$
\begin{bmatrix}
1 & \binom{d}{1}a_1 & \binom{d}{2}a_1^2 & \cdots & \binom{d}{d}a_1^d \\
1 & \binom{d}{1}a_2 & \binom{d}{2}a_2^2 & \cdots & \binom{d}{d}a_2^d \\
\vdots & \vdots & \vdots & \ddots & \vdots\\
1 & \binom{d}{1}a_s & \binom{d}{2}a_s^2 & \cdots & \binom{d}{d}a_s^d
\end{bmatrix}
$$
The determinant equals $(\det A) \prod_{r=0}^d \binom{d}{r}$, where $A= \left( a_i^{j-1} \right)_{ij}$ is the Vandermonde matrix. Since the Vandermonde determinant equals $\prod_{1 \leq i < j \leq s } (a_j-a_i)$, which is nonzero because the $a_i$ are distinct, we are done.
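For what it's worth, this factorization can be sanity-checked with SymPy for a small value of $d$; the choice $d=3$ and the variable names below are mine, purely for illustration.

```python
# Check that det( C(d,j) * a_i^j ) = (product of binomial coefficients) * Vandermonde
# for a small d, using SymPy.
import sympy as sp

d = 3
s = d + 1
a = sp.symbols(f"a1:{s + 1}", positive=True)   # a1, ..., a4

# (i, j) entry: binom(d, j) * a_i^j, with j = 0, ..., d
M = sp.Matrix(s, s, lambda i, j: sp.binomial(d, j) * a[i]**j)

lhs = M.det()
rhs = sp.prod(sp.binomial(d, r) for r in range(d + 1)) \
      * sp.prod(a[j] - a[i] for i in range(s) for j in range(i + 1, s))

print(sp.expand(lhs - rhs) == 0)   # True
```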
(Answer for the general case)
Let $s = \dim \mathbb{C}[x_1, \cdots, x_n]_d$ and let $0<a_1 < a_2 < \cdots < a_s$ be positive real numbers. Choose positive integers $m_1, \cdots, m_n$ such that $m_j > s(m_{j-1}+\cdots+m_1)$ for all $1<j\leq n$. Define $L_i = \left( \sum_{j=1}^{n} a_i^{m_j} x_j \right)^d$ for $1\leq i \leq s$. We will show that $L_1, \cdots, L_s$ are linearly independent by computing a determinant, as follows.
First, fix an anti-lexicographical order on the set of all monomials $\mathfrak{B}=\{\prod_{j=1}^{n} x_j^{r_j} \mid \sum r_j = d \}$, which is a basis of $\mathbb{C}[x_1, \cdots, x_n]_d$. By the multinomial theorem, $L_i= \sum \binom{d}{r_1, r_2, \cdots, r_n} a_i^{\sum_{j=1}^{n}m_j r_j} \prod_{j=1}^{n} x_j^{r_j}$, where the sum runs over all nonnegative integers $r_1, \cdots, r_n$ such that $r_1 + \cdots + r_n = d$. By our choice of $m_1, \cdots, m_n$, if $\prod_{j=1}^{n} x_j^{n_j} < \prod_{j=1}^{n} x_j^{r_j}$ in this order, then $\sum_{j=1}^{n} m_j n_j < \sum_{j=1}^{n} m_j r_j$; in particular, the exponents $\sum_{j=1}^{n} m_j r_j$ are pairwise distinct.
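Here is a small brute-force check of this ordering claim. The concrete values $n=3$, $d=2$, the particular way the $m_j$ are generated, and reading the anti-lexicographic order as "compare the exponent of $x_n$ first, then of $x_{n-1}$, and so on" are my assumptions, for illustration only.

```python
# Check, for small n and d, that the exponents sum_j m_j * r_j are pairwise
# distinct and ordered like the reversed tuples (r_n, ..., r_1) compared
# lexicographically (one reading of the anti-lexicographic order above).
import itertools
from math import comb

n, d = 3, 2
s = comb(n + d - 1, d)                 # dim C[x_1, ..., x_n]_d

# m_j chosen so that m_j > s * (m_1 + ... + m_{j-1})
m = [1]
for _ in range(n - 1):
    m.append(s * sum(m) + 1)

# All exponent tuples (r_1, ..., r_n) with r_1 + ... + r_n = d
tuples = [r for r in itertools.product(range(d + 1), repeat=n) if sum(r) == d]

exps = [sum(mj * rj for mj, rj in zip(m, r)) for r in tuples]
assert len(set(exps)) == s == len(tuples)          # pairwise distinct

# Sorting by the weighted exponent agrees with sorting by (r_n, ..., r_1)
by_weight = [r for _, r in sorted(zip(exps, tuples))]
by_revlex = sorted(tuples, key=lambda r: r[::-1])
assert by_weight == by_revlex
print("ok")
```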
Now it suffices to show that the determinant of the following matrix $A$ is nonzero. Note that the multinomial coefficients can be pulled out and the columns can be reordered, since the determinant is alternating and multilinear in the columns.
$$A=
\begin{bmatrix}
a_1^{i_1} & a_1^{i_2} & a_1^{i_3} & \cdots & a_1^{i_s} \\
a_2^{i_1} & a_2^{i_2} & a_2^{i_3} & \cdots & a_2^{i_s} \\
\vdots & \vdots & \vdots & \ddots & \vdots\\
a_s^{i_1} & a_s^{i_2} & a_s^{i_3} & \cdots & a_s^{i_s}
\end{bmatrix}
$$
Here $i_1>i_2 > \cdots > i_s$ are positive integers, namely the exponents $\sum_{j=1}^{n} m_j r_j$ (pairwise distinct, as noted above), listed in decreasing order.
By the bialternant formula, $ \det A = s_\lambda(a_1, \cdots, a_s)\prod_{1\leq j<k\leq s} (a_j-a_k)$, where $s_\lambda(x_1, \cdots, x_s)$ is the Schur polynomial associated with the partition $\lambda = (\lambda_1, \cdots, \lambda_s)$ given by $\lambda_j = i_j - (s-j)$. (Since the $i_j$ are strictly decreasing positive integers, the $\lambda_j$ are indeed weakly decreasing and nonnegative.) Recall that the Schur polynomial is defined as a sum of monomials,
$$s_\lambda (x_1, \cdots, x_s) = \sum_T x^T = \sum_T x_1^{t_1} \cdots x_s^{t_s}$$
where the summation is over all semistandard Young tableaux $T$ of shape $\lambda$. The exponents $t_1, \cdots , t_s$ record the weight of $T$; that is, each $t_i$ counts the occurrences of the number $i$ in $T$.
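As a concrete illustration, the following sketch enumerates the semistandard tableaux of a small shape by brute force and checks that the resulting sum of monomials agrees with the bialternant quotient; the shape $(2,1)$, three variables, and the helper names are my choices.

```python
# Build s_(2,1)(x1, x2, x3) by brute-force enumeration of semistandard Young
# tableaux, then compare with det(x_i^(lam_j + k - j)) / det(x_i^(k - j)).
import itertools
import sympy as sp

lam = (2, 1)                         # a small partition
k = 3                                # number of variables
x = sp.symbols(f"x1:{k + 1}")

cells = [(r, c) for r, length in enumerate(lam) for c in range(length)]

def is_ssyt(filling):
    T = dict(zip(cells, filling))
    rows_ok = all(T[(r, c + 1)] >= v for (r, c), v in T.items() if (r, c + 1) in T)
    cols_ok = all(T[(r + 1, c)] > v for (r, c), v in T.items() if (r + 1, c) in T)
    return rows_ok and cols_ok       # rows weakly increase, columns strictly increase

schur = sum(sp.prod(x[v - 1] for v in f)
            for f in itertools.product(range(1, k + 1), repeat=len(cells))
            if is_ssyt(f))

lam_pad = list(lam) + [0] * (k - len(lam))
num = sp.Matrix(k, k, lambda i, j: x[i]**(lam_pad[j] + k - 1 - j))
den = sp.Matrix(k, k, lambda i, j: x[i]**(k - 1 - j))

print(sp.expand(schur - sp.cancel(num.det() / den.det())) == 0)   # True
```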
Since $a_1, \cdots, a_s$ are positive, every term of this sum is positive (and there is at least one semistandard tableau of shape $\lambda$, e.g. the one whose $i$-th row is filled with $i$'s), so $s_\lambda(a_1, \cdots, a_s) > 0$. Since the $a_i$ are also distinct, $\prod_{1\leq j<k\leq s} (a_j-a_k) \neq 0$. Therefore $ \det A = s_\lambda(a_1, \cdots, a_s)\prod_{1\leq j<k\leq s} (a_j-a_k)$ is nonzero, and the claim follows.
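This last step can also be checked symbolically in a small case: for strictly decreasing exponents, $\det A / \prod_{1\leq j<k\leq s}(a_j-a_k)$ is a polynomial with nonnegative coefficients, so it cannot vanish at distinct positive $a_i$. The exponents $5 > 3 > 1$ below are an arbitrary choice of mine.

```python
# For a small generalized Vandermonde matrix A = (a_i^{i_j}), check that
# det A / prod_{j<k} (a_j - a_k) is a polynomial with nonnegative coefficients.
import sympy as sp

s = 3
a = sp.symbols(f"a1:{s + 1}", positive=True)
exps = [5, 3, 1]                                   # i_1 > i_2 > i_3

A = sp.Matrix(s, s, lambda i, j: a[i]**exps[j])
vand = sp.prod(a[j] - a[k] for j in range(s) for k in range(j + 1, s))

schur = sp.Poly(sp.cancel(A.det() / vand), *a)     # here lambda = (3, 2, 1)
print(all(c >= 0 for c in schur.coeffs()))         # True: positive at positive a_i
```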
Alternatively, the result can be proved by induction; the method is essentially the same as in the case $n=2$, as suggested in Exercise 23.5 of J. Humphreys' Introduction to Lie Algebras and Representation Theory. For details, see this answer.