Questions about covariance, a measure of (linear) association between two random variables.
Covariance measures how strongly two random variables vary together (their linear association). If the variables are independent the covariance is zero, and a larger magnitude indicates a stronger linear relationship; note, however, that zero covariance does not by itself imply independence. The following formula gives a useful way to think about it:
The covariance of the random variables $X$ and $Y$ is the difference between the expected value of their product, $E(XY)$, and the product of their expected values, $E(X)E(Y)$:
\begin{align*} \sigma(X,Y) = E(XY)-E(X)E(Y) \end{align*}
If $X$ and $Y$ are independent, then $E(XY)=E(X)E(Y)$ and the covariance is zero; the stronger their linear dependence, the larger this difference becomes in magnitude.
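As a quick check of this intuition, here is a minimal sketch (assuming NumPy; the distributions and sample sizes are purely illustrative) that estimates $E(XY)-E(X)E(Y)$ from samples, once for an independent pair and once for a clearly dependent one:

```python
# Minimal sketch (NumPy assumed): estimate Cov(X, Y) = E(XY) - E(X)E(Y)
# from samples, for independent and for dependent variables.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent case: the covariance should be approximately zero.
x = rng.normal(size=n)
y_indep = rng.normal(size=n)
cov_indep = np.mean(x * y_indep) - np.mean(x) * np.mean(y_indep)

# Dependent case: Y = 2X + noise, so the covariance is clearly nonzero.
y_dep = 2 * x + rng.normal(scale=0.1, size=n)
cov_dep = np.mean(x * y_dep) - np.mean(x) * np.mean(y_dep)

print(cov_indep)  # ~ 0
print(cov_dep)    # ~ 2, since Cov(X, 2X + noise) = 2 Var(X) = 2
```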
Although covariance is usually defined by \begin{align*} \sigma(X,Y) = E \left[ \left(X-E(X)\right) \left(Y-E(Y)\right) \right] \end{align*}
we can expand it, using linearity of expectation, into the formula above (whenever the expectations exist):
\begin{align*} \sigma(X,Y) &= E \left[ \left(X-E(X)\right) \left(Y-E(Y)\right) \right] \\ &= E \left[ X Y - X E(Y) - E(X) Y + E(X) E(Y) \right] \\ &= E (X Y) - E(X) E(Y) - E(X) E(Y) + E(X) E(Y) \\ &= E (X Y) - E(X) E(Y). \end{align*}
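The two expressions also agree numerically; a small sketch (again assuming NumPy, with an illustrative dependent pair) checks that they give the same value on the same sample:

```python
# Sketch (NumPy assumed): verify that E[(X - E X)(Y - E Y)]
# equals E[XY] - E[X]E[Y] on the same sample.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(size=100_000)
y = x ** 2 + rng.normal(scale=0.05, size=x.size)

centered = np.mean((x - x.mean()) * (y - y.mean()))  # definition
shortcut = np.mean(x * y) - x.mean() * y.mean()      # E(XY) - E(X)E(Y)

print(np.isclose(centered, shortcut))  # True: both expressions coincide
```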
Similarly, for two random vectors $\mathbb{X}$ and $\mathbb{Y}$, the (cross-)covariance matrix is defined as the matrix whose $(i,j)$ entry is the covariance of $X_i$ and $Y_j$, that is, $E\left[\left(\mathbb{X}-E(\mathbb{X})\right)\left(\mathbb{Y}-E(\mathbb{Y})\right)^T\right]$.
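A short sketch of that definition (NumPy assumed; the particular vectors are made up for illustration) estimates $E\left[\left(\mathbb{X}-E(\mathbb{X})\right)\left(\mathbb{Y}-E(\mathbb{Y})\right)^T\right]$ from samples:

```python
# Sketch (NumPy assumed): cross-covariance matrix of two random vectors,
# with entry (i, j) equal to Cov(X_i, Y_j).
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
X = rng.normal(size=(n, 2))                   # samples of a 2-dim vector X
Y = np.column_stack([X[:, 0] + X[:, 1],       # Y_1 depends on both X_i
                     rng.normal(size=n)])     # Y_2 is independent of X

# E[(X - E X)(Y - E Y)^T], estimated from the sample.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
cross_cov = Xc.T @ Yc / n

print(cross_cov)  # first column ~ [1, 1]^T, second column ~ [0, 0]^T
```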