
Exercise 2.3.12 from Grimmett and Stirzaker's *Probability and Random Processes* asks the following. I'd appreciate it if you could help verify my solution.

Let $X$ be a random variable and $g:\mathbb{R} \to \mathbb{R}$ be continuous and strictly increasing. Show that $Y = g(X)$ is a random variable.

My Solution.

As $g$ is strictly increasing, it is injective (one-to-one): if $x_1 < x_2$, then $g(x_1) < g(x_2)$, so $x_1 \ne x_2 \implies g(x_1) \ne g(x_2)$.

I am not sure how to deduce that $g$ is surjective (onto).

If $g$ is bijective, the inverse function $g^{-1}$ exists and is well-defined.

Hence, the set

\begin{align*} &\{ \omega : g(X(\omega)) \le x \}\\ =&\{ \omega : X(\omega) \le g^{-1}(x) \} \in \mathcal{F} \end{align*}

since $X$ is a random variable. Consequently, $g(X)$ is a random variable.

Quasar
    Your proof in the case where $g$ is bijective is correct. However, $g$ is not always bijective (consider $x\mapsto e^x$ from $\mathbb{R}$ to $\mathbb{R}$). One way to do things is to notice that $g(\mathbb{R})$ is an interval $(a,b)$ with $a,b$ possibly infinite, and to distinguish the cases $x<a$, $x>b$ and $x\in(a,b)$. – charlus Dec 17 '20 at 18:00
  • Related? https://math.stackexchange.com/questions/3944284/prove-that-for-independent-random-variables-x-i-we-have-f-ix-i-are-indepe – BCLC Dec 18 '20 at 00:11

1 Answer


The continuity and the strict monotonicity of $g$ are irrelevant. What is required is that $g$ be a Borel function. Note that either condition alone, "$g$ is continuous" or "$g$ is monotonically increasing", implies that $g$ is a Borel function.

Suppose that $g$ is a Borel function. Let $A\in\mathcal{B}(\mathbb{R})$. Observe that $g(X)^{-1}(A) = X^{-1}(g^{-1}(A))\in\mathcal{F}$ because $g^{-1}(A)$ is a Borel set. Hence $g(X)$ is $\mathcal{F}/\mathcal{B}(\mathbb{R})$-measurable, i.e., a random variable.
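For the pre-measure-theoretic setting of the book, the bijective argument from the question can be patched along the lines of the case analysis suggested in the comments. A sketch, assuming $g$ is continuous and strictly increasing, so that $g(\mathbb{R})$ is an open interval $(a,b)$ with $a$, $b$ possibly infinite, and $g^{-1} : (a,b) \to \mathbb{R}$ exists:

```latex
% Sketch: g continuous and strictly increasing, so g(R) = (a,b)
% with a, b possibly infinite, and g^{-1} well-defined on (a,b).
\[
\{\omega : g(X(\omega)) \le x\} =
\begin{cases}
\varnothing, & x \le a,\\[2pt]
\{\omega : X(\omega) \le g^{-1}(x)\}, & a < x < b,\\[2pt]
\Omega, & x \ge b.
\end{cases}
\]
% \varnothing and \Omega lie in \mathcal{F} trivially; the middle
% case lies in \mathcal{F} because X is a random variable.
```

In every case the event lies in $\mathcal{F}$, so $g(X)$ is a random variable without assuming $g$ is surjective.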

  • The book (that states this as a question) is pre-measure-theoretic. I googled parts of your answer, so I understand that if we can prove that $g$ is $\mathcal{F}$-measurable (whatever that means), it will map any measurable set (a collection of subsets closed under countable unions and intersections) to a measurable set, and we should be good. – Quasar Dec 17 '20 at 18:12
  • Related? I think this answer is not so good because the book is pre-measure-theoretic probability. https://math.stackexchange.com/questions/3944284/prove-that-for-independent-random-variables-x-i-we-have-f-ix-i-are-indepe @Quasar I suggest you uncheck this answer in this regard. I think the first half of this answer is good in pointing out that continuity and monotonicity are irrelevant, but I think the required condition on $g$ at this point in probability should be that the domain of $g$ contains the image of $X$ and that $E[g(X)]$ exists, i.e. $E[|g(X)|] < \infty$, EVEN THOUGH... – BCLC Dec 18 '20 at 00:11
  • ...EVEN THOUGH this can later be relaxed to something like "$g$ is measurable". But then again this is kind of weird, since we don't even necessarily expect $E[X]$ to exist (i.e. $E[|X|] < \infty$), or any higher moment $E[X^n]$ for that matter. – BCLC Dec 18 '20 at 00:14
  • @BCLC In general, a random variable need not be integrable. – Danny Pak-Keung Chan Dec 18 '20 at 00:51
  • @DannyPak-KeungChan I already said in my second comment that we don't expect $X$ to be integrable. Anyway, when talking about elementary probability, i.e. pre-measure-theoretic probability, I think we need certain restrictions on the random variables discussed. At the point of Exercise 2.3.12 in the textbook, is measure theory assumed? – BCLC Dec 19 '20 at 09:24
  • @Quasar Correct understanding, and good indirect understanding! Even though you don't yet know what $\mathcal F$-measurable means, you know that once you understand it, you will understand how the proof is complete. – BCLC Dec 19 '20 at 09:25