Elementary probability:
They don't teach this in elementary probability, but random variables have an explicit representation known as the Skorokhod representation.
Basically, we never really know the formulas for a lot of the $X$'s; we know the $X$'s mainly through the $F_X(x)$'s. It's kinda like talking about the functions $f(x)=x^2+c$ through their common derivative $f'(x)=2x$: When is $f$ increasing? When $f' > 0$. We also know that $f$ is not unique given $f'(x)$: we can see that through integration, or just construct explicit examples like $f(x)=x^2+5$ and $f(x)=x^2+4$.
How do we do something similar in probability?
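As a quick numerical sketch of that analogy (plain Python with central finite differences; the helper name `deriv` is just for illustration): the two explicit examples differ as functions, yet have the same derivative everywhere.

```python
def deriv(f, x, h=1e-6):
    # central finite-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

f1 = lambda x: x**2 + 5
f2 = lambda x: x**2 + 4

# f1 != f2 as functions, yet both have derivative 2x
for x in (-1.0, 0.0, 2.5):
    print(round(deriv(f1, x), 4), round(deriv(f2, x), 4), 2 * x)
```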
For example, consider $X \sim Be(p)$ where $P(X=0):=p$ and $P(X=1):=1-p$ (usually, textbooks use $p$ for $P(X=1)$).
If both of the following $X_i$'s satisfy $X_i \sim Be(p)$, then we've given explicit Bernoulli random variables that are not the same random variable, i.e. $X \sim Be(p)$ doesn't have a unique Skorokhod representation.
$$X_1(\omega) := 1_{(0,1-p)}(\omega) := 1_{A_1}(\omega)$$
$$X_2(\omega) := 1_{(p,1)}(\omega) := 1_{A_2}(\omega)$$
If $\omega=\frac{\min(p,\,1-p)}{2}$, then $\omega \in (0,1-p)$ but $\omega \le \frac{p}{2} < p$, so $X_1(\omega)=1$ while $X_2(\omega)=0$. (The choice $\omega=\frac{1-p}{2}$ only works for $p \ge \frac13$, since for smaller $p$ it lands in both intervals.)
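We can see both claims numerically. A minimal sketch (plain Python; the concrete value $p=0.4$ and the names `X1`, `X2` are just for illustration): the two indicators disagree at a point, yet simulation suggests each takes the value $1$ with probability $1-p$.

```python
import random

p = 0.4  # hypothetical choice of the parameter

X1 = lambda w: 1 if 0 < w < 1 - p else 0  # indicator of A1 = (0, 1-p)
X2 = lambda w: 1 if p < w < 1 else 0      # indicator of A2 = (p, 1)

# A point where they disagree (works for any p in (0,1)):
omega = min(p, 1 - p) / 2
print(X1(omega), X2(omega))  # 1 0

# Yet both look Bernoulli with P(X_i = 1) = 1 - p:
random.seed(0)
draws = [random.random() for _ in range(100_000)]  # omega ~ Unif(0,1)
for X in (X1, X2):
    print(round(sum(X(w) for w in draws) / len(draws), 2))  # ≈ 1 - p
```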
Let us try to compute the CDF of $X_i$:
$P(X_i(\omega) \le x)$ is 0 for $x<0$ and 1 for $x \ge 1$.
As for $0 \le x < 1$, we compute
$$P(X_i(\omega) \le x) = P(X_i(\omega) = 0) = P(1_{A_i}(\omega) = 0) = P(\omega \notin A_i) = 1 - P(\omega \in A_i)$$
We have our result if $P(\omega \in A_1) = P(\omega \in A_2) = 1-p$. But is that the case?
Okay so here, we need to make some kind of assumption to say not only that the interval $(p,1)$ is as probable as $(0,1-p)$, but also that the probability of each interval is $1-p$. Clearly, the intervals have the same length, but does that mean they have the same probability? Furthermore, if they do, is that probability equal to $1-p$? That depends on how we define probabilities here. One such assumption is:
A uniformly distributed random variable $U$ on $(0,1)$ has Skorokhod representation $U(\omega) = \omega \sim Unif(0,1)$.
Hopefully this isn't circular, otherwise this half of the answer is nonsense.
Then $P(\omega \in A_1) = \frac{(1-p)-(0)}{1-0}$ and $P(\omega \in A_2) = \frac{(1)-(p)}{1-0}$, so in both cases
$$P(\omega \in A_i) = \frac{1-p}{1-0} = 1-p$$
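The whole CDF computation can be checked by simulation too (a sketch under the same assumption that $\omega$ is drawn uniformly from $(0,1)$; the helper `cdf_est` and the value $p=0.3$ are just for illustration): the empirical CDF of each $X_i$ is $0$ below $0$, roughly $p$ on $[0,1)$, and $1$ from $1$ on.

```python
import random

p = 0.3  # hypothetical parameter
X1 = lambda w: 1 if 0 < w < 1 - p else 0  # indicator of A1 = (0, 1-p)
X2 = lambda w: 1 if p < w < 1 else 0      # indicator of A2 = (p, 1)

random.seed(1)
draws = [random.random() for _ in range(200_000)]  # omega ~ Unif(0,1)

def cdf_est(X, x):
    """Empirical estimate of P(X <= x)."""
    return sum(X(w) <= x for w in draws) / len(draws)

for X in (X1, X2):
    # F(x) = 0 for x < 0, F(x) ≈ p for 0 <= x < 1, F(x) = 1 for x >= 1
    print(cdf_est(X, -0.5), round(cdf_est(X, 0.5), 2), cdf_est(X, 1.0))
```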
Advanced probability:
It can be shown that $$U(\omega) = \omega \sim Unif(0,1)$$ for $\omega$ in $((0,1),\mathscr B(0,1),\mu)$, where $\mu$ is Lebesgue measure.
Hence,
$$P(\omega \in A_i) = \mu(A_i) = l(A_i) = 1-p$$
where $l$ is length.