
Let $F$ be the distribution function of a random variable $X$. If $F$ is continuous, then it holds that $F^{-1}(F(X))=X$ almost surely, where $F^{-1}$ denotes the generalized inverse of $F$.

My question is now: Why does this result not hold for a discontinuous distribution function $F$?

Generally it holds that $F^{-1}(F(x))\leq x$. Assume that $F$ is flat on the interval $[x_0,x_1]$. Then for $x\in(x_0,x_1]$ it follows that $F^{-1}(F(x))=x_0<x$. However, we have $\Pr(X\in(x_0,x_1])=F(x_1)-F(x_0)=0$, so the set where $F^{-1}(F(X))=X$ does NOT hold has measure $0$ and the equality holds almost surely.

Now, assume there is a jump at $x_1$, i.e. the function is flat on the interval $[x_0,x_1)$. Then, in order to verify that the statement does not hold for a discontinuous distribution function, we need to check the probability $\Pr(X\in (x_0,x_1))$. In my mind, this equals $\Pr(X\in (x_0,x_1))=F(x_1-)-F(x_0)=0$, where $F(x_1-)$ denotes the left limit of $F$ at $x_1$. Hence, the set where $F^{-1}(F(X))=X$ does NOT hold again has measure zero and the statement holds almost surely.

What am I doing wrong here?
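To make the jump scenario concrete, here is a minimal numerical sketch (the specific mixed distribution is my own choice, not part of the question): take $X$ uniform on $[0,1]$ with probability $1/2$ and $X=2$ with probability $1/2$, so $F$ is flat on $[1,2)$ and jumps at $x_1=2$.

```python
# Hypothetical example: CDF flat on [1, 2), with a jump at x1 = 2.
# X ~ Uniform[0, 1] with prob. 1/2, and X = 2 with prob. 1/2.

def F(x):
    """CDF: F(x) = x/2 on [0, 1], flat at 1/2 on [1, 2), jumps to 1 at x = 2."""
    if x < 0:
        return 0.0
    if x < 1:
        return x / 2
    if x < 2:
        return 0.5
    return 1.0

def F_inv(t):
    """Generalized inverse F_inv(t) = inf {x : F(x) >= t}, for t in (0, 1]."""
    if t <= 0.5:
        return 2 * t
    return 2.0

# On the flat part (1, 2), the identity fails pointwise ...
assert F_inv(F(1.7)) == 1.0   # not equal to 1.7
# ... but P(X in (1, 2)) = F(2-) - F(1) = 0, so that set is null.
# At the jump point x1 = 2, which carries mass 1/2, the identity holds:
assert F_inv(F(2.0)) == 2.0
```

The point of the sketch: the jump point itself (where the mass sits) satisfies $F^{-1}(F(x_1))=x_1$, and the failure set is exactly the massless flat part.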

– Tim (asked); edited by Did

1 Answer


A problem with your approach is that $F$ having a jump at $x$ does not mean that $F$ is constant on some interval $(x',x)$ with $x'\lt x$; in fact the two notions are not even related. But the result that $F^{-1}(F(X))=X$ almost surely indeed holds in full generality, whatever the distribution of $X$ may be.

A short route to prove this is to consider some random variable $U'$ uniformly distributed on $(0,1)$, possibly defined on another probability space $(\Omega',\mathcal F',P')$, to define $X'=F^{-1}(U')$, and to remember that $X$ and $X'$ have the same distribution. Furthermore, $$F^{-1}(F(X'))=F^{-1}(F\circ F^{-1}(U'))\quad\text{almost surely}.$$ Since $F\circ F^{-1}(t)\geqslant t$ for every $t$ and $F^{-1}$ is nondecreasing, this identity implies that $F^{-1}(F(X'))\geqslant F^{-1}(U')=X'$ almost surely. Since $F^{-1}(F(x))\leqslant x$ for every $x$, $$ F^{-1}(F(X'))=X'\quad\text{almost surely}.$$ Now, the probability of the event $[F^{-1}(F(X'))=X']$ depends on the distribution of $X'$ only, hence $P(F^{-1}(F(X))=X)=P'(F^{-1}(F(X'))=X')=1$ and the proof is over.

The solution above follows the usual definition of the generalized inverse function of the CDF $F$ as $F^{-1}(t)=\inf\{x\in\mathbb R\mid F(x)\geqslant t\}$.
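The general argument can be sanity-checked numerically. The following sketch (the distribution, atom locations, and helper names are mine, not from the answer) samples a purely discrete, hence everywhere-discontinuous, distribution and verifies $F^{-1}(F(X))=X$ on every draw:

```python
import random

# Hypothetical check: discrete distribution with atoms at 0, 1, 3.
atoms = [0.0, 1.0, 3.0]
probs = [0.2, 0.5, 0.3]

def F(x):
    """Right-continuous CDF of the discrete distribution above."""
    return sum(p for a, p in zip(atoms, probs) if a <= x)

def F_inv(t):
    """Generalized inverse F_inv(t) = inf {x : F(x) >= t}."""
    for a in atoms:
        if F(a) >= t:
            return a
    return atoms[-1]

random.seed(0)
samples = random.choices(atoms, weights=probs, k=10_000)
# The identity holds on every single sample, with probability one:
assert all(F_inv(F(x)) == x for x in samples)
```

Of course a simulation proves nothing, but it illustrates why continuity is not needed: the sampled values are exactly the atoms, and each atom $a$ satisfies $F^{-1}(F(a))=a$.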

– Did
  • Thanks for your answer! And, of course, the two notions are not related.

    However, what I am looking for is an example (or rather the reason) why I need the continuity of F for the claim that $F^{-1}(F(X))=X$ almost surely.

    Your proof does not need it, but other literature that I'm working with states something like...

    "..thanks to the continuity of F, we have $F^{-1}(F(X))=X$ almost surely.."

    The fact that your proof seems ok to me and does not need the continuity confuses me even more now...:)

    – Tim May 08 '14 at 10:42
  • Which literature? – Did May 08 '14 at 12:04
  • For example here:

    H. Jin and X. Zhou, " Behavioral portfolio selection in continuous time" (pdf), Mathematical Finance, Vol. 18 (2008), pp. 385-426.

    After the proof of Lemma C.1. it says (rough quote): "Assume the strictly positive rv $X$ does not admit an atom. Denote $Z:=1-F(X)$. Then $Z\sim U[0,1]$ and $X=F^{-1}(1-Z)$ a.s., thanks to $F$ being continuous."

    I know that one needs continuity for $Z\sim U[0,1]$. Do you think that the above expression only relates to this fact? If yes, then this is bad wording, in my opinion.

    BIG thanks for your help and contribution!

    – Tim May 08 '14 at 12:15
  • Yes "the above expression only relates to this fact". – Did May 08 '14 at 12:17
  • Hey Did, I reconsidered this question and I could not find a result which says $X=F^{-1}(U)$ almost surely. All I know, which is a well-known result, is that $X\sim F^{-1}(U)$, i.e. their distributions are the same. However, given $X$, its cdf $F$ and a uniform random variable $U$, do we really know that

    $X=F^{-1}(U)$ almost surely?

    (cf. http://stats.stackexchange.com/questions/24938/can-two-random-variables-have-the-same-distribution-yet-be-almost-surely-differ)

    – Tim May 15 '14 at 22:38
  • I suppose that this has nothing to do with continuity but it questions your proof above. – Tim May 15 '14 at 22:39
  • Of course, "given $X$, its cdf $F$ and a uniform random variable $U$", to assert that "$X=F^{-1}(U)$ almost surely" would be absurd (and I fail to see where I wrote this). – Did May 16 '14 at 05:51
  • With some distance, I know what my problem was. It holds that $F^{-1}(F(X))=X$ a.s., but not that $F^{-1}(U)=X$ a.s. for any given uniform random variable $U$. – Tim Jun 05 '14 at 13:28
  • Hey Did, maybe you can help me out here:

    http://math.stackexchange.com/questions/831593/distribution-function-inequality-for-a-transformed-random-variable

    ?!

    – Tim Jun 13 '14 at 07:07
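The distinction Tim draws in the comments — $F^{-1}(F(X))=X$ a.s., but $F^{-1}(U)\ne X$ for an *independent* uniform $U$ — can be illustrated with a quick sketch (the setup is my own): for $X$ uniform on $(0,1)$, $F$ is the identity there, so $F^{-1}(U)=U$ has the same distribution as $X$ yet essentially never equals it.

```python
import random

random.seed(1)
n = 10_000
# Independent draws of (X, U), both uniform on (0, 1).
pairs = [(random.random(), random.random()) for _ in range(n)]
# Here F_inv(U) = U, which matches X in distribution only:
coincidences = sum(1 for x, u in pairs if x == u)
assert coincidences == 0  # P(X = U) = 0 for independent uniforms
```

So equality in distribution, $F^{-1}(U)\sim X$, is all one can claim for an arbitrary uniform $U$; the almost-sure identity requires feeding $F(X)$ itself back into $F^{-1}$.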