Questions tagged [independence]

For questions involving the notion of independence of events, of independence of collections of events, or of independence of random variables. Use this tag along with the tags (probability), (probability-theory) or (statistics). Do not use for linear independence of vectors and such.

For events: Two events $A$ and $B$ are independent if $$P(A\cap B)=P(A)P(B)$$ More generally, a family $\mathscr F$ of events is independent if, for every finite number of distinct events $A_1$, $A_2$, $\ldots$, $A_n$ in $\mathscr F$, $$P\left(\bigcap_{i=1}^nA_i\right) =\prod_{i=1}^nP(A_i)$$
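The defining identity can be checked concretely on a finite sample space. A minimal Python sketch, using exact arithmetic and a die-roll example of my own choosing (not from the tag wiki):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die, uniform measure.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}     # "roll is even",      P(A) = 1/2
B = {1, 2, 3, 4}  # "roll is at most 4", P(B) = 2/3

# A and B are independent: P(A ∩ B) = 1/3 = (1/2)(2/3) = P(A) P(B).
assert prob(A & B) == prob(A) * prob(B)
```

Exact fractions avoid the floating-point round-off that can obscure whether the product identity holds exactly.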

Two collections of events (for example, two $\sigma$-algebras) $\mathscr F$ and $\mathscr G$ are mutually independent (or simply, independent) if every $A$ in $\mathscr F$ and every $B$ in $\mathscr G$ are independent.

More generally, collections $\mathscr F_i$ of events, indexed by some finite or infinite set $I$, are mutually independent (or simply, independent) if, for every finite subset $\{i_1,i_2,\ldots,i_n\}$ of $I$ and every choice of events $A_k$ in $\mathscr F_{i_k}$ for $k=1,\ldots,n$, the family $\{A_1,\ldots,A_n\}$ is independent.

For random variables: Two random variables $X$ and $Y$ (defined on the same probability space) are independent if their $\sigma$-algebras $\sigma(X)$ and $\sigma(Y)$ are (mutually) independent.

In particular, two events $A$ and $B$ are independent if and only if the indicator random variables $1_A$ and $1_B$ are independent.

More generally, a family $\mathscr X$ of random variables (defined on the same probability space) is independent if, for every finite sub-family $\{X_1,X_2,\ldots,X_n\}$ of $\mathscr X$, the $\sigma$-algebras $\sigma(X_{1})$, $\sigma(X_{2})$, $\dots$, $\sigma(X_{n})$ are (mutually) independent.

3052 questions
131
votes
4 answers

Could someone explain conditional independence?

My understanding right now is that an example of conditional independence would be: If two people live in the same city, the probability that person A gets home in time for dinner, and the probability that person B gets home in time for dinner are…
Ryan
  • 1,721
42
votes
6 answers

Why does zero correlation not imply independence?

Although independence implies zero correlation, zero correlation does not necessarily imply independence. While I understand the concept, I can't imagine a real world situation with zero correlation that did not also have independence. Can someone…
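The standard counterexample behind this question is $Y = X^2$ with $X$ symmetric about zero: the correlation vanishes by symmetry, yet $Y$ is a function of $X$. A short exact computation in Python (the three-point distribution is my illustrative choice, not from the question):

```python
from fractions import Fraction

# X uniform on {-1, 0, 1} and Y = X^2: uncorrelated but clearly dependent.
support = [-1, 0, 1]
p = Fraction(1, 3)  # each value has probability 1/3

E_X  = sum(p * x         for x in support)  # E[X]   = 0
E_Y  = sum(p * x * x     for x in support)  # E[X^2] = 2/3
E_XY = sum(p * x * x * x for x in support)  # E[X^3] = 0

cov = E_XY - E_X * E_Y
assert cov == 0  # zero covariance, hence zero correlation

# ... but not independent: P(X=0, Y=0) != P(X=0) P(Y=0).
P_X0_Y0 = p  # the joint event {X=0, Y=0} is just {X=0}
P_X0, P_Y0 = p, p  # Y = 0 exactly when X = 0
assert P_X0_Y0 != P_X0 * P_Y0  # 1/3 != 1/9
```

Correlation only measures the linear part of the relationship; here the dependence is purely quadratic, so correlation misses it entirely.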
user86403
  • 421
37
votes
3 answers

Existence of independent and identically distributed random variables.

I often see the sentence "let $X_1, X_2, \ldots$ be a sequence of i.i.d. random variables with a certain distribution". But given a random variable $X$ on a probability space $\Omega$, how do I know that there is a sequence of INDEPENDENT random…
27
votes
6 answers

Example of Pairwise Independent but not Jointly Independent Random Variables?

I am asked to: Find a joint probability distribution $P(X_1,\dots, X_n)$ such that $X_i , \, X_j$ are independent for all $i \neq j$, but $(X_1, \dots , X_n)$ are not jointly independent. I have no idea where to start, please help.
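For $n = 3$ the classical construction (often attributed to Bernstein) takes two fair coin flips and their XOR; any two of the three are independent, but the third is determined by the other two. A self-contained exact check in Python (the coin/XOR setup is the standard example, not quoted from the question):

```python
from fractions import Fraction
from itertools import product

# X1, X2 fair coin flips in {0, 1}; X3 = X1 XOR X2.
outcomes = [(x1, x2, x1 ^ x2) for x1, x2 in product([0, 1], repeat=2)]
p = Fraction(1, 4)  # four equally likely outcomes

def prob(pred):
    """Probability that an outcome triple satisfies the predicate."""
    return sum((p for w in outcomes if pred(w)), Fraction(0))

# Every pair (Xi, Xj) is independent ...
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        joint = prob(lambda w: w[i] == a and w[j] == b)
        assert joint == prob(lambda w: w[i] == a) * prob(lambda w: w[j] == b)

# ... but the triple is not: P(X1=0, X2=0, X3=0) = 1/4, not (1/2)^3 = 1/8.
triple = prob(lambda w: w == (0, 0, 0))
assert triple != Fraction(1, 2) ** 3
```

The same idea extends to larger $n$ by XOR-ing subsets of independent bits.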
user2262504
  • 994
18
votes
2 answers

Independence of disjoint events with strictly positive probability

I'm taking a class in Probability Theory, and I was asked this question in class today: Given disjoint events $A$ and $B$ for which $P(A)>0$ and $P(B)>0$, can $A$ and $B$ be independent? My answer was: $A$ and $B$ are disjoint, so $P(A\cap…
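The arithmetic behind the expected answer ($P(A\cap B)=0$ while $P(A)P(B)>0$, so independence fails) can be verified on a concrete example. A small Python sketch with a die-roll sample space of my own choosing:

```python
from fractions import Fraction

# Disjoint events on one roll of a fair die: A = {1, 2}, B = {5, 6}.
omega = {1, 2, 3, 4, 5, 6}
A, B = {1, 2}, {5, 6}

def prob(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event & omega), len(omega))

assert A & B == set()                      # disjoint, so P(A ∩ B) = 0
assert prob(A) * prob(B) > 0               # but P(A) P(B) = 1/9 > 0
assert prob(A & B) != prob(A) * prob(B)    # hence not independent
```

Intuitively, disjointness is extreme *dependence*: knowing $A$ occurred forces $B$ not to occur.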
aaazalea
  • 916
17
votes
1 answer

Uniform distribution on a simplex via i.i.d. random variables

For which $N \in \mathbb{N}$ is there a probability distribution such that $\frac{1}{\sum_i X_i} (X_1, \cdots, X_{N+1})$ is uniformly distributed over the $N$-simplex? (Where $X_1, \cdots, X_{N+1}$ are i.i.d. random variables with that distribution.)
17
votes
2 answers

Factoring $1+x+\dots +x^n$ into a product of polynomials with positive coefficients

Can the polynomial $1+x+x^2+\dots +x^n$ be factored, for some $n\ge 1$, into a product of two non-constant polynomials with positive coefficients? Thoughts It is easy to factor it into polynomials with non-negative coefficients e.g. $$ 1+x+x^2+x^3…
16
votes
2 answers

A criterion for independence based on Characteristic function

Let $X$ and $Y$ be real-valued random variables defined on the same space. Let's use $\phi_X$ to denote the characteristic function of $X$. If $\phi_{X+Y}=\phi_X\phi_Y$ then must $X$ and $Y$ be independent?
16
votes
1 answer

Independence and conditional expectation

So, it's pretty clear that for independent $X,Y\in L_1(P)$ (with $E(X|Y)=E(X|\sigma(Y))$), we have $E(X|Y)=E(X)$. It is also quite easy to construct an example (for instance, $X=Y=1$) which shows that $E(X|Y)=E(X)$, does not imply independence of…
16
votes
5 answers

Two tails in a row - what's the probability that the game started with a head?

We're tossing a coin until two heads or two tails in a row occur. The game ended with a tail. What's the probability that it started with a head? Let's say we denote the game as a sequence of heads and tails, e.g. $(T_1, H_2, T_3, H_5, H_6)$ is a…
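The conditional probability asked for here can be computed by enumerating the finitely many game paths of each length: a valid game is a sequence whose first run of two equal flips occurs exactly at the end, which forces the prefix to alternate. A brute-force Python sketch of that enumeration (my own illustrative computation, not taken from an answer):

```python
from fractions import Fraction
from itertools import product

def games(max_len):
    """Yield (sequence, probability) for games of length <= max_len:
    sequences whose first and only adjacent double (HH or TT) is at the end."""
    for n in range(2, max_len + 1):
        for seq in product("HT", repeat=n):
            doubles = [i for i in range(n - 1) if seq[i] == seq[i + 1]]
            if doubles == [n - 2]:
                yield seq, Fraction(1, 2 ** n)

ends_TT = start_H_and_ends_TT = Fraction(0)
for seq, p in games(16):  # truncation: games longer than 16 flips contribute < 2^-16
    if seq[-1] == "T":    # game ended with TT
        ends_TT += p
        if seq[0] == "H":
            start_H_and_ends_TT += p

answer = start_H_and_ends_TT / ends_TT
# The truncated ratio is within ~1e-5 of the exact value 1/3.
assert abs(answer - Fraction(1, 3)) < Fraction(1, 10 ** 4)
```

The exact geometric series ($\sum_k 2^{-(2k+3)} = 1/6$ for games starting with H, over $P(\text{ends TT}) = 1/2$) gives $1/3$; the enumeration is just a sanity check of that computation.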
Angie
  • 1,110
16
votes
4 answers

Showing that $n$ exponential functions are linearly independent.

I have $n$ lambdas, which are all different real and positive numbers, where: $\lambda_1 < \lambda_2 < \cdots < \lambda_n$. I then have to show that these functions are linearly independent: $$e^{\lambda_1 t}, e^{\lambda_2 t}, \ldots, e^{\lambda_n…
some_name
  • 357
15
votes
0 answers

Distribution of the sum of absolute values of T-distributed random variables

Let $X$ be a r.v. following a symmetric $T$ distribution with $0$ mean and tail parameter $\alpha$. I am looking for the distribution of the sum of $n$ independent variables $\sum_{1 \leq i \leq n}|X_i|$. $Y=|X|$ has for PDF $\frac{2…
15
votes
5 answers

Examples of pairwise independent but not independent continuous random variables

By considering the set $\{1,2,3,4\}$, one can easily come up with an example (attributed to S. Bernstein) of pairwise independent but not independent random variables. Could anybody give an example with continuous random variables?
14
votes
1 answer

Conditional expectation of product of conditionally independent random variables

I would like to show the following statement using the general definition of conditional expectation. I believe it is true as it was also pointed out in other posts. Let $X,Y$ be conditionally independent random variables w.r.t a sigma algebra…