
Given $\delta\in [0,1]$ and $n\in \mathbb{N}$, consider a (biased) random walk $S_n(\delta) = \sum_{i = 1}^n X_i$, where the $X_i$ are i.i.d. with $X_i = 1$ with probability $(1+\delta)/2$ and $X_i = -1$ otherwise. I am wondering whether the expected distance to the origin increases as the bias $\delta$ increases. Formally, if $0\le\delta\le\delta'\le 1$, does $$\mathbb{E}[|S_n(\delta)|]\le \mathbb{E}[|S_n(\delta')|]$$ hold for all $n\in \mathbb{N}$, where $|\cdot|$ denotes the absolute value?

Note that the second moment is increasing in $\delta$, because $\mathbb{E}[S_n(\delta)^2] = n^2\delta^2+n(1-\delta^2)$. Moreover, by Chebyshev's inequality, the inequality above can be proved for large enough $n$. I am wondering whether it holds for all $n\ge 1$; perhaps it can be proved by a coupling argument.
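
For what it's worth, the conjecture is easy to check numerically for small $n$, since $\mathbb{E}[|S_n(\delta)|]$ can be computed exactly from the binomial distribution of the number of up-steps (with $k$ up-steps, $S_n = 2k - n$). A minimal Python sketch (the helper names and the ranges checked are my own choices):

```python
# Exact computation of E|S_n(delta)| from the Binomial(n, (1+delta)/2)
# distribution of the number of up-steps; with k up-steps, S_n = 2k - n.
from math import comb

def expected_abs(n: int, delta: float) -> float:
    p = (1 + delta) / 2
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * abs(2 * k - n)
               for k in range(n + 1))

def second_moment(n: int, delta: float) -> float:
    p = (1 + delta) / 2
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * (2 * k - n)**2
               for k in range(n + 1))

# Check the second-moment identity E[S_n^2] = n^2 delta^2 + n(1 - delta^2).
assert abs(second_moment(10, 0.3) - (10**2 * 0.3**2 + 10 * (1 - 0.3**2))) < 1e-9

# Check that E|S_n(delta)| is nondecreasing in delta for n = 1..30.
deltas = [i / 100 for i in range(101)]
for n in range(1, 31):
    values = [expected_abs(n, d) for d in deltas]
    assert all(b >= a - 1e-12 for a, b in zip(values, values[1:])), n
print("E|S_n(delta)| is nondecreasing in delta for n = 1..30")
```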

1 Answer


Imagine an instance of the random walk realized by independently drawing numbers $R_i$ uniformly from $[0,1]$ and setting $X_i=1$ iff $R_i\le\frac{1+\delta}2$. Consider how this instance changes as we increase $\delta$. As $\frac{1+\delta}2$ increases past some $R_j$, the sign of $X_j$ flips from $-1$ to $1$, and every $S_n$ with $n\ge j$ increases by $2$. Fix such an $n$ and write $T=S_n-X_j$ for the sum of the remaining steps, which is independent of $R_j$. The flip changes $|S_n|$ by $+2$ if $T\ge1$, by $0$ if $T=0$, and by $-2$ if $T\le-1$. Since the remaining steps have nonnegative drift, $\mathbb P(T=t)\ge\mathbb P(T=-t)$ for $t\ge0$, hence $\mathbb P(T\ge1)\ge\mathbb P(T\le-1)$, and the expected change in $|S_n|$ at each flip is nonnegative. Summing over the flips, $\mathbb E[|S_n(\delta)|]$ is nondecreasing in $\delta$, where the expectation over the $R_i$ is the same as the expectation over the $X_i$.
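
For illustration, here is a minimal Monte Carlo sketch of this coupling in Python (the helper `coupled_abs_means` and all parameters are my own choices): both walks are driven by the same uniforms $R_i$, so the walk with the larger bias differs from the other only by steps flipped from $-1$ to $+1$.

```python
# Monte Carlo estimate of E|S_n(delta)| and E|S_n(delta')| under the
# coupling above: the same uniform R_i drives both walks, so increasing
# the bias can only flip individual steps from -1 to +1.
import random

def coupled_abs_means(n, delta, delta_prime, trials=100_000, seed=0):
    rng = random.Random(seed)
    total, total_prime = 0, 0
    for _ in range(trials):
        s = s_prime = 0
        for _ in range(n):
            r = rng.random()
            s += 1 if r <= (1 + delta) / 2 else -1
            s_prime += 1 if r <= (1 + delta_prime) / 2 else -1
        total += abs(s)
        total_prime += abs(s_prime)
    return total / trials, total_prime / trials

lo, hi = coupled_abs_means(n=15, delta=0.2, delta_prime=0.5)
print(f"E|S_15(0.2)| ~ {lo:.3f} <= E|S_15(0.5)| ~ {hi:.3f}")
```

Note that $|S_n|$ is not monotone path by path (a flip can move the walk toward the origin); the coupling only shows that the expected change at each flip is nonnegative.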

joriki