
Currently doing some work in formal epistemology and have come across the following equation between two ratios of probability. (It would take a long time to explain how I got there, so I'll just jump straight to it...)

Where P is a probability distribution, **P(A&B)/P(B) = P(A&B&C)/P(B&C)**.

I'm a little new to working with probability, so I can't tell if this equation tells me anything of importance about the relationship between A, B and C with respect to P.

EDIT: to be clear, my question is if the equality (in bold) is true, is it interesting for any reason? Or does it look to you like just a random equality?

I would be very interested if anyone thought that any P which satisfies this equation also satisfies P(B implies C)=1.

Sorry for the vague question. Many thanks for any (even vague) answers, tips, or advice.

  • Not sure what you are asking. That equality is not generally true. Suppose, say, that $C=A^c$. Then $P(A\cap B\cap C)=0$, though the left side is not forced to be $0$. What did you mean? – lulu Jan 07 '20 at 15:15
  • Thanks. My question is if the equality is true, is it interesting for any reason? Or does it look to you like just a random equality? (interesting with respect to what it may say about A, B and C.) – Joshua Pearson Jan 07 '20 at 15:17
  • Well, the first is the conditional probability $P(A\mid B)$. The second is the conditional probability $P(A\mid B\cap C)$, so it is certainly true if $B\implies C$. – lulu Jan 07 '20 at 15:22
  • Thank you. Yes the equation is true whenever B implies C. But does B imply C whenever the equation is true? – Joshua Pearson Jan 07 '20 at 15:24
  • To see that we need not have $B\implies C$, suppose you have three fair coins, a penny, a nickel, and a dime. You toss all three. Let $A$ be the event "the penny comes up $H$", $B$ the same for the nickel, $C$ the same for the dime. Then both sides equal $\frac 12$ but $B$ does not imply $ C$. – lulu Jan 07 '20 at 15:25
  • Amazing, thanks for the case! – Joshua Pearson Jan 07 '20 at 15:26
  • Can anybody come up with another counterexample, like lulu's, but where we supposed also that P(B)P(C) was not equal to P(B&C) (i.e. suppose B and C are probabilistically dependent)? – Joshua Pearson Jan 07 '20 at 15:32
  • It's also true whenever $P(A \cap C)=0$ and $P(B \cap C) \ne 0$. – Robert Israel Jan 07 '20 at 16:04
  • @JoshuaPearson: For more on conditional independence, you may want to take a look at the answers to Could someone explain conditional independence? on this site. – joriki Jan 07 '20 at 16:38
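Lulu's three-coin counterexample can be checked by brute-force enumeration, and the same code answers the follow-up question about a dependent $B$ and $C$: a sketch in Python, where the weighted distribution in part 2 is an assumed example (A independent of the correlated pair (B, C)), not one given in the thread.

```python
from itertools import product

# Outcomes are triples (a, b, c) of 0/1 indicators for events A, B, C.
outcomes = list(product((0, 1), repeat=3))

def prob(weights, event):
    """P(event) under a joint distribution given as a dict outcome -> weight."""
    return sum(p for w, p in weights.items() if event(w))

A = lambda w: w[0] == 1   # penny comes up heads
B = lambda w: w[1] == 1   # nickel comes up heads
C = lambda w: w[2] == 1   # dime comes up heads

def check(weights):
    """Return (P(A&B)/P(B), P(A&B&C)/P(B&C)) under the given distribution."""
    lhs = prob(weights, lambda w: A(w) and B(w)) / prob(weights, B)
    rhs = (prob(weights, lambda w: A(w) and B(w) and C(w))
           / prob(weights, lambda w: B(w) and C(w)))
    return lhs, rhs

# 1) lulu's three fair coins: uniform weight 1/8 on each outcome.
fair = {w: 1 / 8 for w in outcomes}
print(check(fair))                              # (0.5, 0.5): the equality holds
print(prob(fair, lambda w: B(w) and not C(w)))  # 0.25 > 0: B does not imply C

# 2) An assumed variant where B and C are dependent but A is independent
#    of the pair (B, C): the equality still holds.
q = {(1, 1): 0.4, (1, 0): 0.1, (0, 1): 0.1, (0, 0): 0.4}  # correlated (B, C)
dep = {(a, b, c): 0.5 * q[(b, c)] for (a, b, c) in outcomes}
print(check(dep))                               # both sides ≈ 0.5
# P(B)P(C) = 0.25 but P(B&C) = 0.4, so B and C are dependent:
print(prob(dep, B) * prob(dep, C), prob(dep, lambda w: B(w) and C(w)))
```

Part 2 addresses the comment asking for a counterexample with $P(B)P(C) \ne P(B\cap C)$: making $A$ independent of the (correlated) pair $(B, C)$ keeps both sides equal to $P(A)$ while $B$ and $C$ are dependent.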

1 Answer


$\frac{P(A,B)}{P(B)}$ is usually referred to as the conditional probability of $A$ given $B$ (https://en.wikipedia.org/wiki/Conditional_probability), often written $P(A|B)$. So what you are asking is: under which conditions does the following hold? $$P(A | B) = P(A | B,C)$$ This is true exactly when $A$ and $C$ are conditionally independent given $B$, i.e. when $P(A,C | B) = P(A|B)P(C|B)$.

Let's prove it. First assume conditional independence; then $$P(A |B,C) = \frac{P(A,B,C)}{P(B,C)}=\frac{P(A,C|B)P(B)}{P(C|B)P(B)}=\frac{P(A|B)P(C|B)}{P(C|B)}=P(A|B).$$ Conversely, assume $P(A | B) = P(A | B,C)$; then $$P(A,C|B)=\frac{P(A,B,C)}{P(B)}=\frac{P(A|B,C)P(C|B)P(B)}{P(B)}=P(A|B)P(C|B).$$

For more information on conditional independence, see https://en.wikipedia.org/wiki/Conditional_independence.