Given $\delta\in [0,1]$ and $n\in \mathbb{N}$, consider a (biased) random walk $S_n(\delta) = \sum_{i = 1}^n X_i$, where $\{X_i : 1\le i\le n\}$ are i.i.d. with $X_i = 1$ with probability $(1+\delta)/2$ and $X_i = -1$ otherwise. I am wondering whether the expected distance to the origin increases as the bias $\delta$ increases. Formally: if $0\le\delta\le\delta'\le 1$, does $$\mathbb{E}[|S_n(\delta)|]\le \mathbb{E}[|S_n(\delta')|]$$ hold for all $n\in \mathbb{N}$, where $|\cdot|$ denotes the absolute value?
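Since $n - |S_n|$ is even and $S_n = 2k - n$ when there are $k$ up-steps, the expectation can be computed exactly from the binomial distribution. Here is a quick numerical sanity check in Python (the function name is mine) that verifies the conjectured monotonicity for small $n$ on a grid of biases:

```python
from math import comb

def expected_abs_position(n: int, delta: float) -> float:
    """E|S_n(delta)|, computed exactly: k up-steps occur with
    probability C(n,k) p^k (1-p)^(n-k), p = (1+delta)/2, and then
    the walk ends at S_n = 2k - n."""
    p = (1 + delta) / 2
    return sum(
        abs(2 * k - n) * comb(n, k) * p**k * (1 - p) ** (n - k)
        for k in range(n + 1)
    )

# Check that E|S_n(delta)| is nondecreasing in delta on a grid, for small n.
deltas = [i / 100 for i in range(101)]
for n in range(1, 51):
    vals = [expected_abs_position(n, d) for d in deltas]
    assert all(a <= b + 1e-12 for a, b in zip(vals, vals[1:])), n
print("monotone in delta for n = 1..50 (on this grid)")
```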
Note that the second moment is increasing in $\delta$: since $\mathbb{E}[S_n(\delta)] = n\delta$ and $\operatorname{Var}(S_n(\delta)) = n(1-\delta^2)$, we have $\mathbb{E}[S_n(\delta)^2] = n^2\delta^2 + n(1-\delta^2) = n + n(n-1)\delta^2$. Additionally, by Chebyshev's inequality, we can prove the above inequality for all sufficiently large $n$ (when $\delta < \delta'$). I am wondering whether the inequality holds for all $n\ge 1$. Perhaps it can be proved by a coupling argument.
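One caveat on the coupling idea: the obvious monotone coupling (drive both walks by the same uniforms $U_i$, stepping up when $U_i \le (1+\delta)/2$) gives $S_n(\delta) \le S_n(\delta')$ on every path, but it does not give $|S_n(\delta)| \le |S_n(\delta')|$ pathwise, since the less-biased walk can end further below the origin. A small Python sketch exhibiting such a path (the setup and names are mine; this illustrates why a naive coupling fails, it is not a proof):

```python
import random

def coupled_walks(n, delta, delta_prime, seed):
    """One sample of (S_n(delta), S_n(delta')) under the monotone
    coupling: both walks share the uniform U_i and step up iff U_i
    falls below their respective thresholds (1+delta)/2."""
    rng = random.Random(seed)
    s, s_prime = 0, 0
    for _ in range(n):
        u = rng.random()
        s += 1 if u <= (1 + delta) / 2 else -1
        s_prime += 1 if u <= (1 + delta_prime) / 2 else -1
    return s, s_prime

# Search for a path where the less-biased walk is farther from the origin.
for seed in range(10000):
    s, sp = coupled_walks(n=10, delta=0.1, delta_prime=0.5, seed=seed)
    if abs(s) > abs(sp):
        print(f"seed {seed}: S_10(0.1) = {s}, S_10(0.5) = {sp}")
        break
```

So any coupling argument would have to compare $|S_n(\delta)|$ and $|S_n(\delta')|$ in a subtler way than this pathwise domination.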