-1

This question about pairwise vs. mutual independence is related to some extant questions: here and here.

Kobayashi, Mark & Turin's *Probability, Random Processes and Statistical Analysis* (2012) states without proof:

three events $A, B, C$ are mutually independent when:

$$P[A,B]=P[A]P[B],\quad P[B,C]=P[B]P[C],\quad P[A,C]=P[A]P[C],\quad P[A,B,C]=P[A]P[B]P[C]$$

No three of these relations necessarily imply the fourth. [my italics]
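The italicized claim can be illustrated concretely. A minimal sketch (my own, not from the book) uses the classic two-fair-coins construction: $A$ = first coin is heads, $B$ = second coin is heads, $C$ = the two coins match. The first three identities hold but the fourth fails, since $P[A,B,C]=\frac14\neq\frac18$:

```python
from itertools import product
from fractions import Fraction

# Sample space: two fair coin flips, each outcome has probability 1/4.
outcomes = list(product([0, 1], repeat=2))
p = Fraction(1, 4)

def prob(event):
    """Exact probability of an event, given as a predicate on outcomes."""
    return sum(p for w in outcomes if event(w))

A = lambda w: w[0] == 1          # first coin heads
B = lambda w: w[1] == 1          # second coin heads
C = lambda w: w[0] == w[1]       # the two coins match

# The three pairwise identities hold...
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B)
assert prob(lambda w: A(w) and C(w)) == prob(A) * prob(C)
assert prob(lambda w: B(w) and C(w)) == prob(B) * prob(C)
# ...but the triple identity fails: P[A,B,C] = 1/4, not 1/8.
assert prob(lambda w: A(w) and B(w) and C(w)) != prob(A) * prob(B) * prob(C)
```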

However, Wikipedia and others generally agree that mutual independence implies pairwise independence, but likewise without a demonstration.

What is the simplest proof that mutual independence implies pairwise independence?

Note: GC Rota wrote that probability can be understood by focusing on random variables or focusing on distributions. However, the two views should be equivalent, correct?

alancalvitti
  • 3,440

2 Answers

3

Mutual independence means all four identities you copied hold; pairwise independence means the first three of them hold. Ergo.

Did
  • 284,245
2

For an example where $P[A,B,C]=P[A]P[B]P[C]$ does not imply $P[A,B]=P[A]P[B], P[B,C]=P[B]P[C], P[A,C]=P[A]P[C]$ consider the distribution with indicators:

A  B  C  Prob
1  1  1   1/8
1  1  0   3/8
0  0  1   3/8
0  0  0   1/8

So $P[A]=P[B]=P[C]=\frac12$, hence each pairwise product equals $\frac14$, and $P[A]P[B]P[C]=\frac18$. Here $P[A,B,C]=\frac18$ as required, but $P[A,B]=\frac12$, $P[B,C]=\frac18$, $P[A,C]=\frac18$, so all three pairwise identities fail.

In such a case, $A,B,C$ are not mutually independent.
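The table above can be checked mechanically; a short sketch (mine, not part of the original answer) verifies it with exact fractions:

```python
from fractions import Fraction

# Henry's distribution: (indicator of A, indicator of B, indicator of C) -> probability
dist = {
    (1, 1, 1): Fraction(1, 8),
    (1, 1, 0): Fraction(3, 8),
    (0, 0, 1): Fraction(3, 8),
    (0, 0, 0): Fraction(1, 8),
}

def prob(pred):
    """Exact probability of the event described by the predicate."""
    return sum(q for w, q in dist.items() if pred(w))

pA = prob(lambda w: w[0] == 1)  # 1/2
pB = prob(lambda w: w[1] == 1)  # 1/2
pC = prob(lambda w: w[2] == 1)  # 1/2

# The triple identity holds: P[A,B,C] = 1/8 = (1/2)^3 ...
assert prob(lambda w: w == (1, 1, 1)) == pA * pB * pC
# ...but every pairwise identity fails (each product should be 1/4):
assert prob(lambda w: w[0] == 1 and w[1] == 1) == Fraction(1, 2)
assert prob(lambda w: w[1] == 1 and w[2] == 1) == Fraction(1, 8)
assert prob(lambda w: w[0] == 1 and w[2] == 1) == Fraction(1, 8)
```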

Henry
  • 169,616