A hand-made exploration here would look as follows:
We can always write (I will use $X$ and $Y$)
$$X = E(X\mid Y) + e_{X|Y} \implies E(X\mid Y) = X-e_{X|Y}$$
$$Y = E(Y\mid X) + e_{Y|X} \implies E(Y\mid X) = Y-e_{Y|X}$$
where $e_{X|Y},\;\;e_{Y|X}$ are the conditional expectation function errors, which by construction have expected value zero, $E[e_{X|Y}]=E[e_{Y|X}]=0$.
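(In case it is not obvious, the zero-mean property is just the law of iterated expectations, assuming the relevant expectations exist:
$$E\big[e_{X|Y}\big] = E\big[X - E(X\mid Y)\big] = E[X] - E\big[E(X\mid Y)\big] = E[X]-E[X] = 0,$$
and symmetrically for $e_{Y|X}$.)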
By the assumed inequalities $E(X\mid Y)\geq Y$ and $E(Y\mid X)\geq X$, we then have
$$X-e_{X|Y} \geq Y \implies X \geq Y + e_{X|Y},\;\;\; Y-e_{Y|X}\geq X$$
Combining
$$Y + e_{X|Y} \leq X \leq Y-e_{Y|X} \implies e_{X|Y}+e_{Y|X} \leq 0 $$
So under the assumed inequalities, the random variable $Z \equiv e_{X|Y}+e_{Y|X}$ is non-positive. But also $E(Z) = E(e_{X|Y})+E(e_{Y|X}) = 0$.
For a non-positive random variable to have expected value zero, all of its probability mass must be concentrated at zero, so $Z$ equals zero almost surely.
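To spell this out: if $\Pr(Z \le -\varepsilon) = p > 0$ for some $\varepsilon > 0$ then, since $Z \le 0$ everywhere,
$$E(Z) \;\le\; (-\varepsilon)\,p + 0\cdot(1-p) \;=\; -\varepsilon p \;<\; 0,$$
contradicting $E(Z) = 0$. So, almost surely,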
$$Z = e_{X|Y} + e_{Y|X}=0 \implies e_{X|Y} = - e_{Y|X}$$
Substituting this back into the combined inequality gives
$$Y + e_{X|Y} \leq X \leq Y+e_{X|Y} \implies X = Y+e_{X|Y}$$
while also $X = E(X\mid Y) + e_{X|Y}$, which leads to $Y = E(X\mid Y)$.
By analogous manipulations, we obtain $X = E(Y\mid X)$.
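Explicitly, combining $X = Y + e_{X|Y}$ with $e_{X|Y} = -e_{Y|X}$ and $Y = E(Y\mid X) + e_{Y|X}$:
$$E(Y\mid X) = Y - e_{Y|X} = Y + e_{X|Y} = X.$$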
This reduces the problem to the case of equality, for which this post contains the proof.
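As a purely numerical sanity check (a minimal sketch, assuming only `numpy`, and of course not a substitute for the proof), one can verify the decomposition and the zero-mean property of the conditional-expectation errors on an arbitrary discrete joint distribution, and the equality case when $X = Y$:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary joint pmf over a 3x3 grid of (x, y) values.
xs = np.array([0.0, 1.0, 2.0])
ys = np.array([0.0, 1.0, 3.0])
p = rng.random((3, 3))
p /= p.sum()                       # p[i, j] = P(X = xs[i], Y = ys[j])

px = p.sum(axis=1)                 # marginal pmf of X
py = p.sum(axis=0)                 # marginal pmf of Y

E_X_given_Y = (xs[:, None] * p).sum(axis=0) / py   # E(X | Y = ys[j])
E_Y_given_X = (ys[None, :] * p).sum(axis=1) / px   # E(Y | X = xs[i])

# Errors e_{X|Y} = X - E(X|Y) and e_{Y|X} = Y - E(Y|X) as functions of (x, y).
e_XgY = xs[:, None] - E_X_given_Y[None, :]
e_YgX = ys[None, :] - E_Y_given_X[:, None]

# Zero-mean property: E[e_{X|Y}] = E[e_{Y|X}] = 0 for any joint pmf,
# hence E(Z) = E(e_{X|Y} + e_{Y|X}) = 0 as used above.
print(np.isclose((e_XgY * p).sum(), 0.0))            # True
print(np.isclose((e_YgX * p).sum(), 0.0))            # True
print(np.isclose(((e_XgY + e_YgX) * p).sum(), 0.0))  # True

# When X = Y (all mass on the diagonal, Y taking the same values as X),
# the conditional expectations satisfy E(X|Y) = Y and E(Y|X) = X exactly.
q = np.diag(rng.random(3))
q /= q.sum()
qy = q.sum(axis=0)
print(np.allclose((xs[:, None] * q).sum(axis=0) / qy, xs))  # True: E(X | Y = xs[j]) = xs[j]
```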