
Consider a multinomial distribution as in [1] with event probabilities $p_{1},\ldots ,p_{k}$ satisfying $\sum_{i=1}^{k} p_{i}=1$, and support $X_i \in \{0,\dots,n\}$ such that $\sum_{i=1}^{k} X_{i}=n$.

Clearly, $\textstyle {\mathrm{Cov}}(X_i,X_j) = - n \, p_i \, p_j$ for $i\neq j$.

Given real-valued constants $a$ and $b$ with $0 \leq a, b \leq 1$, what is

$$\textstyle {\mathrm{Cov}}(a^{X_i}, b^{X_j}) = \ ?$$

[1] https://en.wikipedia.org/wiki/Multinomial_distribution

Michael Levy

1 Answer


Each $X_i$ has a marginal distribution of $\text{Binomial}(n, p_i)$. Therefore, $$ \begin{align} E[a^{X_i}] &= \sum_{x=0}^n a^x \binom {n} {x} p_i^{x} (1 - p_i)^{n-x} \\ &= \sum_{x=0}^n \binom {n} {x} (ap_i)^{x} (1 - p_i)^{n-x} \\ &= (1 - p_i + ap_i)^n\end{align}$$ and similarly $E[b^{X_j}] = (1 - p_j + bp_j)^n$
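As a sanity check, the closed form $E[a^{X_i}] = (1 - p_i + ap_i)^n$ can be verified against a direct summation of the binomial pmf. This is a minimal sketch; the parameter values are arbitrary examples.

```python
from math import comb, isclose

def expect_a_pow_x(a, n, p):
    """E[a^X] for X ~ Binomial(n, p), computed by direct summation of the pmf."""
    return sum(a**x * comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))

# Arbitrary example parameters
n, p, a = 10, 0.3, 0.6
direct = expect_a_pow_x(a, n, p)
closed = (1 - p + a * p) ** n   # the closed form derived above
assert isclose(direct, closed)
```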

The cross moment is similar: $$ \begin{align} E[a^{X_i}b^{X_j}] &= \sum_{\substack{x_i, x_j \geq 0 \\ x_i + x_j \leq n}} a^{x_i}b^{x_j}\frac {n!} {x_i!\,x_j!\,(n-x_i-x_j)!}\,p_i^{x_i}p_j^{x_j} (1 - p_i - p_j)^{n - x_i - x_j} \\ &= \sum_{\substack{x_i, x_j \geq 0 \\ x_i + x_j \leq n}} \frac {n!} {x_i!\,x_j!\,(n-x_i-x_j)!}\,(ap_i)^{x_i}(bp_j)^{x_j} (1 - p_i - p_j)^{n - x_i - x_j} \\ & = (1 - p_i - p_j + ap_i + bp_j)^n \end{align}$$ where we just applied the multinomial theorem (the binomial theorem being the special case). We assume $i \neq j$ here; if $i = j$, then just use the result of the first part to get $E[(ab)^{X_i}] = (1 - p_i + abp_i)^n$.

Finally, with all the pieces in hand, you just need to use $\mathrm{Cov}[X, Y] = E[XY] - E[X]E[Y]$ to arrive at the desired result: $$\mathrm{Cov}(a^{X_i}, b^{X_j}) = (1 - p_i - p_j + ap_i + bp_j)^n - (1 - p_i + ap_i)^n (1 - p_j + bp_j)^n.$$
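The resulting covariance, $(1 - p_i - p_j + ap_i + bp_j)^n - (1 - p_i + ap_i)^n (1 - p_j + bp_j)^n$, can be checked numerically by enumerating the joint pmf of $(X_i, X_j)$, which is trinomial over $\{(x_i, x_j) : x_i + x_j \leq n\}$. A minimal sketch with arbitrary example parameters:

```python
from math import factorial, isclose

def cov_exact(a, b, n, pi, pj):
    """Cov(a^{X_i}, b^{X_j}) by enumerating the trinomial joint pmf of (X_i, X_j)."""
    E_ab = E_a = E_b = 0.0
    pr = 1.0 - pi - pj  # probability of any outcome other than i or j
    for xi in range(n + 1):
        for xj in range(n - xi + 1):
            coef = factorial(n) // (factorial(xi) * factorial(xj) * factorial(n - xi - xj))
            pmf = coef * pi**xi * pj**xj * pr**(n - xi - xj)
            E_ab += a**xi * b**xj * pmf
            E_a += a**xi * pmf
            E_b += b**xj * pmf
    return E_ab - E_a * E_b

# Arbitrary example parameters
n, pi, pj, a, b = 8, 0.2, 0.5, 0.4, 0.7
closed = ((1 - pi - pj + a * pi + b * pj)**n
          - (1 - pi + a * pi)**n * (1 - pj + b * pj)**n)
assert isclose(cov_exact(a, b, n, pi, pj), closed)
```

Note that for $a = b = 1$ both terms equal $1$, so the covariance vanishes, as it must since $a^{X_i}$ and $b^{X_j}$ are then constants.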

BGM