
Let $A,B,C$ be iid Unif(0,1). Let $X,Y$ be random variables:

  • $X=(A-B)1_{A-B>0}+(1+(A-B))1_{A-B<0}$

  • $Y=(C-B)1_{C-B>0}+(1+(C-B))1_{C-B<0}$

I am able to show that $X,Y$ are each Unif(0,1), i.e. identically distributed. My problem is showing they are independent (i.e. I'm missing the first 'i' in iid).

(I'm not allowed to use measure theory here, but I actually don't see how I would anyway since both $X$ and $Y$ have a '$B$' in the formula.)

Okay, so elementary probability stuff only. Let's compute the joint cdf and hope it's that of the uniform distribution on the unit square. This is

$$P(X \le x, Y \le y) = 1_{x,y > 1} + x1_{0 < x < 1, y > 1} + y1_{0 < y < 1, x > 1} + xy1_{0 < x,y < 1}$$

I believe I get everything except the $xy1_{0 < x,y < 1}$ part.
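(Aside: before the case analysis, one can at least sanity-check the target formula numerically. A quick Monte Carlo sketch, with arbitrary sample size, test point, and tolerance; it uses the observation that the piecewise definitions above are just $X=(A-B) \bmod 1$ and $Y=(C-B) \bmod 1$, the boundary event $A=B$ having probability zero.)

```python
# Monte Carlo sanity check (not a proof) that P(X <= x, Y <= y) = x*y
# on the unit square, i.e. that (X, Y) looks like a pair of independent
# Unif(0,1) variables. Sample size and test point are arbitrary choices.
import random

random.seed(0)
n = 200_000
x, y = 0.3, 0.7   # arbitrary test point in (0,1)^2
hits = 0
for _ in range(n):
    a, b, c = random.random(), random.random(), random.random()
    # X = (A-B) mod 1 and Y = (C-B) mod 1 match the piecewise definitions
    if (a - b) % 1.0 <= x and (c - b) % 1.0 <= y:
        hits += 1

print(hits / n)  # expect roughly x*y = 0.21
```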

It seems we have to take cases

  • Case 1: $X=A-B, Y=C-B$
  • Case 2: $X=(A-B)+1, Y=C-B$
  • Case 3: $X=A-B, Y=(C-B)+1$
  • Case 4: $X=(A-B)+1, Y=(C-B)+1$

Okay let's try Case 1. (Update: The bounds are wrong, but the questions in re the tags are still valid, I believe.)

$$P(0 < X = A - B \le x, 0 < Y = C - B \le y)$$

What I think is needed is conditional probability, but of 2 random variables conditioned on 1. Instead of the usual $$P(z_1 < Z < z_2 \mid B=b) := \int_{z_1}^{z_2} f_{Z|B=b}(z)\, dz,$$ with $f_{Z|B=b}(z)=\frac{f_{Z,B}(z,b)}{f_B(b)}$, it looks like we'll have something like $$P(z_1 < Z < z_2,\ u_1 < U < u_2 \mid B=b) := \int_{u_1}^{u_2} \int_{z_1}^{z_2} f_{(Z,U)|B=b}(z,u)\, dz\, du,$$ with, I think, $f_{(Z,U)|B=b}(z,u) = \frac{f_{Z,U,B}(z,u,b)}{f_B(b)}$.

  • Note: if any of the definitions ':=' are in fact not definitions, then you'll have to explain conditioning on an event of probability zero to me please.

So here's what I think is next:

$$P(0 < X = A - B \le x, 0 < Y = C - B \le y) = P(B < A \le x+B, B < C \le y+B)$$

$$ = \int_{b=0}^{b=1} P(B < A \le x+B, B < C \le y+B | B=b) f_B(b) db \tag{1?}$$

$$ = \int_{b=0}^{b=1} P(b < A \le x+b, b < C \le y+b | B=b) f_B(b) db \tag{2????}$$

$$ = \int_{b=0}^{b=1} \int_{b}^{x+b} \int_{b}^{y+b} f_{(A,C)|B=b}(a,c) dc da f_B(b) db \tag{3 part 1?}$$

$$ = \int_{b=0}^{b=1} \int_{b}^{x+b} \int_{b}^{y+b} \frac{f_{(A,C,B)}(a,c,b)}{f_B(b)} dc da f_B(b) db \tag{3 part 2?}$$

$$ \text{[details omitted because actually the bounds are wrong]}$$

$$ = xy $$

And then assuming all of the above is correct and all the question mark parts are justified, repeat for the other 3 cases and it looks like we have $xy$ in each. Are these cases supposed to be added up and so I'm missing $\frac14$? Or what?

  • Case 2: $b-1<A<x+b-1, b<C<y+b$, so again just $xy$

  • Case 3: $b<A<x+b, b-1<C<y+b-1$, so again just $xy$

  • Case 4: $b-1<A<x+b-1, b-1<C<y+b-1$, so again just $xy$
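Update 2: per the comments, the per-case values in the bullets above are not actually $xy$, because the integrand $f_{(A,C,B)}$ vanishes outside $[0,1]^3$. Respecting that support, the inner integrals in Case 1 have lengths $\min(x,1-b)$ and $\min(y,1-b)$, so (sketch, for $x \le y$; I have not double-checked every step)

$$P(0 < A-B \le x,\ 0 < C-B \le y) = \int_0^1 \min(x,1-b)\,\min(y,1-b)\, db = xy(1-y) + \frac{x(y^2-x^2)}{2} + \frac{x^3}{3} \ne xy,$$

so the four cases are indeed added up, and only their sum is $xy$. A quick Monte Carlo sketch (arbitrary test point and sample size) is consistent with this:

```python
# Monte Carlo estimate (not a proof) of each case's contribution to
# P(X <= x, Y <= y) at the arbitrary test point (x, y) = (0.4, 0.6).
# The four contributions differ from each other and from x*y; only
# their sum is x*y.
import random

random.seed(2)
x, y = 0.4, 0.6
n = 400_000
cases = [0, 0, 0, 0]  # counts for Cases 1..4 in the numbering above
for _ in range(n):
    a, b, c = random.random(), random.random(), random.random()
    if (a - b) % 1.0 <= x and (c - b) % 1.0 <= y:
        # Case 1: A>B, C>B;  Case 2: A<B, C>B;
        # Case 3: A>B, C<B;  Case 4: A<B, C<B.
        i = (a < b) + 2 * (c < b)
        cases[i] += 1

probs = [k / n for k in cases]
print(probs, sum(probs))  # the sum should be close to x*y = 0.24
```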

About the question marks:

  • For $(1?)$, I think the rule is like for an event $E$ and continuous random variable $B$, we have $P(E)=\int_{\mathbb R} P(E|B=b) f_B(b) db$. Is this correct?

    • Oh wait a minute wiki says we can't quite do this. What I understand is that we can't do it for arbitrary $E$, but we can do it when (but not only when I guess) $E=\{Y \in \ \text{some interval or Borel set I guess}\}$, for some continuous random variable $Y$ s.t. the joint pdf $f_{X,Y}$ is well-defined? (I forgot if any 2 continuous random variables necessarily have a well-defined joint pdf.)
  • For $(2????)$, I think we're doing something like for events $E$, $H$ and $G$ and continuous random variable $B$: we have $P(E|H)=P(E \cap H|H)$, but $P(G|H)$ is defined only for $P(H)>0$. What is being done here when technically $P(H)=0$? I mean of course in the 1st place when we say like '$P(E|B=b)$', this is notational, we're not really conditioning on the $P$-null event $\{B=b\}$. But I still don't get exactly what's being done here.

  • For $(3 \ \text{parts 1 and 2})$, I'm actually just guessing here, what's the definition of conditional joint cdf of 2 random variables given a 3rd? And please provide a reference.

    • Wiki just says $F_{(X,Y)|Z=z}(x,y):=P(X \le x, Y \le y|Z=z)$, but it doesn't quite define $P(X \le x, Y \le y|Z=z)$.

    • For just 1 continuous random variable conditioned on 1 continuous random variable, it's $P(X \le x \mid Z=z) := \int_{-\infty}^{x} f_{X|Z=z}(t)\, dt$, where $f_{X|Z=z}(t) := \frac{f_{(X,Z)}(t,z)}{f_{Z}(z)}$.

    • For 2 continuous random variables conditioned on 1 continuous random variable, I think it's $P(X \le x, Y \le y|Z=z) := \int_{-\infty}^{x} \int_{-\infty}^{y} f_{(X,Y)|Z=z}(t,u) du dt$, but then...

    • What's $f_{(X,Y)|Z=z}(t,u)$? (I guess we do the elementary probability way of thinking: define the pdf before the cdf...) According to this site (see problems 1 and 16), it's $f_{(X,Y)|Z=z}(t,u): = \frac{f_{(X,Y,Z)}(t,u,z)}{f_Z(z)}$. So, I guess I'm right about joint cdf/pdf stuff. I'm just hoping for a reference please.

BCLC
  • I think what you're missing here is that when you separate the four cases and start integrating, by the time you have written the integral as a triple integral you have two inner integrals with bounds outside the support of your random variables. For example, you write $\int_b^{y+b} f_{A,C,B}(a,c,b) \, dc$ with $f_{A,C,B}(a,c,b)=1$ over the entire integral, but in fact $f_{A,C,B}(a,c,b) = 0$ whenever $1<c\leq y+b,$ which happens a non-trivial amount of the time in your triple integral. – David K Mar 22 '21 at 22:51
  • The error described in the previous comment affects Cases 1 and 2. In Cases 3 and 4, where the innermost integral integrates over $c$ from $b-1$ to $y+b-1,$ the problem is that $f_{A,C,B}(a,c,b) = 0$ whenever $b - 1 \leq c < 0.$ And there are also errors of these kinds in the second-innermost integral, but involving $a$ and $x$ rather than $c$ and $y$. – David K Mar 22 '21 at 23:02
  • @DavidK THANK YOU SO SO MUCH. I FIGURED OUT THE BOUNDS. Post your comments as answer? I'll award you bounty. – BCLC Mar 24 '21 at 04:53
  • @DavidK oh wait there are also those mini-?'s in the computation. what are the justifications for each please? – BCLC Mar 24 '21 at 04:58
  • @DavidK Updated question. now asking about joint pdf/cdf and conditional stuff mainly. – BCLC Mar 24 '21 at 05:41
  • Cross-post: https://stats.stackexchange.com/q/514443/119261. – StubbornAtom Mar 25 '21 at 14:54
  • @DavidK: This is what I have been trying to explain in my https://stats.stackexchange.com/a/514451/7224 answer, but it apparently did not make it through... – Xi'an ні війні Mar 26 '21 at 08:02

1 Answer


Note that $X = A - B \pmod 1$ and $Y = C - B \pmod 1$.

Now let $\alpha, \beta \in \mathbb R$ be some fixed numbers. Let $A_{\alpha} = A + \alpha$ and $C_{\beta} = C + \beta$. With these, form $X_{\alpha} = A_{\alpha} - B \pmod 1$ and $Y_{\beta} = C_{\beta} - B \pmod 1$.

Then on the one hand clearly the joint distribution of $(X_{\alpha}, Y_{\beta})$ is equal to that of $(X, Y)$ shifted cyclically $\pmod 1$ over the offset $(\alpha, \beta)$.

On the other hand $A_{\alpha} \pmod 1$, $B$, $C_{\beta} \pmod 1$ are independent and uniform on $[0,1]$. Therefore $(X_{\alpha}, Y_{\beta})$ and $(X, Y)$ have the same distribution. Both observations combined show that this joint distribution is invariant under shifts $\pmod 1$ and is therefore uniform.
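(A numerical illustration of this argument, not a proof: a sketch with arbitrary fixed shifts $\alpha, \beta$ and an arbitrary test point, comparing the empirical joint distribution of the shifted pair with that of the original pair.)

```python
# Numerical illustration (not a proof) of the shift-invariance argument:
# (X_alpha, Y_beta) built from the shifted A and C should have the same
# joint distribution as (X, Y). Shifts and test point are arbitrary.
import random

random.seed(1)
alpha, beta = 0.37, 0.81  # arbitrary fixed shifts
n = 200_000
orig_hits = shifted_hits = 0
for _ in range(n):
    a, b, c = random.random(), random.random(), random.random()
    x, y = (a - b) % 1.0, (c - b) % 1.0                      # (X, Y)
    xs, ys = (a + alpha - b) % 1.0, (c + beta - b) % 1.0     # (X_a, Y_b)
    orig_hits += (x <= 0.5 and y <= 0.5)
    shifted_hits += (xs <= 0.5 and ys <= 0.5)

# Both empirical probabilities should be close to 0.5 * 0.5 = 0.25.
print(orig_hits / n, shifted_hits / n)
```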

WimC
  • thanks. so there's apparently this rule that [$(X,Y)$ is (joint) uniform(0,1)]-(if and only if)-[$(X,Y)$ and $(X+\alpha,Y+\beta)$ have the same (joint) distribution, for all $\alpha, \beta \in \mathbb R$] ? i think there's some $\mod$ i'm missing. what exactly is the rule please? – BCLC Mar 24 '21 at 04:57
  • also, WimC , i updated question...but well i guess you can ignore the updates if you can explain and justify this invariant if and only if uniform thing – BCLC Mar 24 '21 at 05:47
  • Let $\mu$ be a shift invariant probability measure on $[0,1]$. Let $p/q$ be a rational number in $[0,1]$. Then by dividing in $p$ equal parts $$\mu([0,p/q]) = p \mu([0,1/q]).$$ The interval $[0,1/q]$ fits exactly $q$ times in $[0,1]$ so $$\mu([0,1/q]) = 1/q.$$ Therefore $\mu([0,p/q]) = p/q$ and by $\sigma$-additivity $\mu([0,\alpha]) = \alpha$ for all $\alpha \in [0,1]$. – WimC Mar 24 '21 at 06:07
  • thanks, but this is for tutoring an engineering undergrad. no measure theory please. is it possible? – BCLC Mar 24 '21 at 06:12