Questions tagged [variance]

For questions regarding the variance of a random variable in probability, as well as the variance of a list of data values in statistics.

In probability and statistics, variance is a measure of spread among the possible values of a random variable or a list of values.


2603 questions
50
votes
4 answers

Determining variance from sum of two random correlated variables

I understand that the variance of the sum of two independent normally distributed random variables is the sum of the variances, but how does this change when the two random variables are correlated?
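For correlated variables the covariance enters: $\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X,Y)$. A minimal Monte Carlo sketch of this identity (the covariance matrix below is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
# Draw correlated bivariate normals with a known covariance matrix.
cov = np.array([[2.0, 0.8],
                [0.8, 1.5]])
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

lhs = np.var(x + y)                           # empirical Var(X + Y)
rhs = cov[0, 0] + cov[1, 1] + 2 * cov[0, 1]   # 2.0 + 1.5 + 2*0.8 = 5.1
print(lhs, rhs)  # both close to 5.1
```

With independent variables the covariance term vanishes and the familiar sum-of-variances rule is recovered.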
32
votes
2 answers

What is the Difference between Variance and MSE?

I know that the variance measures the dispersion of an estimator around its mean i.e. $\sigma^2 = E[(X - \mu)^2]$ (the second central moment about the mean). But I'm not getting the meaning of the definition below: The mean squared error measures…
26
votes
2 answers

Difference between variance and 2nd moment

I understand that $\operatorname{Var}(X) = E(X^2) - E(X)^2$, and that the second moment, the variance, is $E(X^2)$. How is variance simultaneously $E(X^2)$ and $E(X^2) - E(X)^2$?
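The resolution is that "second moment" can mean the second *raw* moment $E[X^2]$ or the second *central* moment $E[(X-EX)^2]$, and only the latter is the variance. A small numerical sketch using a fair die:

```python
import numpy as np

# A fair six-sided die: "second moment" can mean two different things.
x = np.array([1, 2, 3, 4, 5, 6])
raw_second_moment = np.mean(x**2)                    # E[X^2] = 91/6
central_second_moment = np.mean((x - x.mean())**2)   # Var(X) = E[(X - EX)^2]

# Var(X) = E[X^2] - (E[X])^2 ties the two together.
print(raw_second_moment - x.mean()**2)  # 35/12, equal to the central moment
```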
21
votes
4 answers

Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?

Is the variance of the mean of a set of possibly dependent random variables less than or equal to the average of their respective variances? Mathematically, given random variables $X_1, X_2, ..., X_n$ that may be dependent: Let $\bar{X} =…
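The answer is yes, even with dependence: $\operatorname{Var}(\bar X) = \frac{1}{n^2}\sum_{i,j}\operatorname{Cov}(X_i,X_j)$, which Cauchy–Schwarz bounds by the average of the variances. A sketch that checks both sides exactly on a randomly generated (positive semidefinite) covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
# Build a valid (PSD) covariance matrix for n dependent variables.
n = 5
a = rng.standard_normal((n, n))
sigma = a @ a.T  # covariance matrix of X_1, ..., X_n

var_of_mean = sigma.sum() / n**2    # Var(X̄) = (1/n²) Σ_ij Cov(X_i, X_j)
avg_of_vars = np.trace(sigma) / n   # average of the Var(X_i)
print(var_of_mean <= avg_of_vars)   # True (Cauchy–Schwarz)
```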
20
votes
1 answer

What is the variance of a constant matrix times a random vector?

$\newcommand{\Var}{\operatorname{Var}}$In this video it is claimed that if the equation of errors in OLS is given by: $$u=y - X\beta$$ Then in the presence of heteroscedasticity the variance of $u$ will not be constant, $\sigma^2 \times I$, where $I$…
Mario GS
  • 313
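The general rule behind this question: for a constant matrix $A$ and a random vector $u$ with covariance $\Sigma$, $\operatorname{Var}(Au) = A\Sigma A^\top$. A Monte Carlo sketch (the particular $A$ and $\Sigma$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])        # constant matrix
sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])    # Cov(u)

u = rng.multivariate_normal([0.0, 0.0], sigma, size=300_000)
empirical = np.cov((u @ A.T).T)   # sample covariance of A u
exact = A @ sigma @ A.T           # the identity Var(Au) = A Σ Aᵀ
print(np.round(empirical, 2))
print(exact)
```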
19
votes
4 answers

Why does $ \operatorname{Var}(X) = E[X^2] - (E[X])^2 $

$ \operatorname{Var}(X) = E[X^2] - (E[X])^2 $ I have seen and understand (mathematically) the proof for this. What I want to understand is: intuitively, why is this true? What does this formula tell us? From the formula, we see that if we subtract…
WorldGov
  • 1,037
18
votes
3 answers

Proving $\operatorname{Var}(X) = E[X^2] - (E[X])^2$

I want to understand something about the derivation of $\text{Var}(X) = E[X^2] - (E[X])^2$ Variance is defined as the expected squared difference between a random variable and the mean (expected value): $\text{Var}(X) = E[(X -…
17
votes
5 answers

Variance of sine and cosine of a random variable

Suppose $X$ is a random variable drawn from a normal distribution with mean $E$ and variance $V$. How could I calculate the variance of $\sin(X)$ and $\cos(X)$? (I thought the question was simple and tried to do a search, but did not find any good…
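Closed forms follow from the normal characteristic function $E[e^{itX}] = e^{it\mu - t^2 V/2}$: for example $E[\sin X] = e^{-V/2}\sin\mu$ and $E[\cos 2X] = e^{-2V}\cos 2\mu$, giving $\operatorname{Var}(\sin X) = \tfrac{1}{2}\bigl(1 - e^{-2V}\cos 2\mu\bigr) - e^{-V}\sin^2\mu$. A Monte Carlo check of that formula (the values of $\mu$ and $V$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
mu, v = 0.7, 0.4  # mean and variance of X ~ N(mu, v)
x = rng.normal(mu, np.sqrt(v), size=1_000_000)

# Closed form derived from E[e^{itX}] = exp(it*mu - t^2*v/2):
# Var(sin X) = (1 - e^{-2v} cos 2mu)/2 - e^{-v} sin(mu)^2
closed = (1 - np.exp(-2 * v) * np.cos(2 * mu)) / 2 - np.exp(-v) * np.sin(mu)**2
print(np.var(np.sin(x)), closed)  # the two values agree closely
```

The analogous formula for $\cos X$ swaps the sign on the $\cos 2\mu$ term and uses $\cos^2\mu$.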
17
votes
1 answer

Why are the eigenvalues of a covariance matrix equal to the variance of its eigenvectors?

This assertion came up in a Deep Learning course I am taking. I understand intuitively that the eigenvector with the largest eigenvalue will be the direction in which the most variance occurs. I understand why we use the covariance matrix's…
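The key identity is that for a unit eigenvector $v$ of the covariance matrix $\Sigma$ with eigenvalue $\lambda$, the variance of the data projected onto $v$ is $v^\top \Sigma v = \lambda$. A numerical sketch on synthetic 2-D data:

```python
import numpy as np

rng = np.random.default_rng(4)
# Correlated 2-D data with a clear principal direction.
data = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 1.0]], size=100_000)

cov = np.cov(data.T)
eigvals, eigvecs = np.linalg.eigh(cov)

# The variance of the projections onto each (unit) eigenvector
# equals the corresponding eigenvalue: Var(data @ v) = vᵀ Σ v = λ.
for lam, v in zip(eigvals, eigvecs.T):
    print(lam, np.var(data @ v, ddof=1))  # each pair agrees
```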
16
votes
6 answers

Intuition behind binomial variance

Suppose that I perform a stochastic task $n$ times (like tossing a coin) and that $p$ is the probability that one of the possible outcomes occurs. If $K$ is the stochastic variable that measures how many times this outcome occurred during the whole…
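The intuition: $K$ is a sum of $n$ independent Bernoulli indicators, each with variance $p(1-p)$, and independence lets variances add to $np(1-p)$. A quick simulation check (the values of $n$ and $p$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 20, 0.3
# K counts successes in n independent trials; each Bernoulli trial
# contributes variance p(1-p), and independence lets variances add.
k = rng.binomial(n, p, size=500_000)
print(np.var(k), n * p * (1 - p))  # both ≈ 4.2
```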
14
votes
2 answers

Why square a constant when determining variance of a random variable?

If I want to calculate the sample variance such as below: Which becomes: $\left(\frac{1}{n}\right)^2 \cdot n(\sigma^2)= \frac{\sigma^2}{n} $... My question is WHY does it become $$\left(\frac{1}{n}\right)^2?$$ In other words, why does the $(1/n)$…
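The rule at work here is $\operatorname{Var}(cX) = c^2\operatorname{Var}(X)$ for a constant $c$: the constant is squared because variance is defined through squared deviations. Applied to $\bar X = \frac{1}{n}\sum X_i$ with i.i.d. terms it gives $\sigma^2/n$. A simulation sketch (the choices of $\sigma^2$ and $n$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
sigma2, n = 4.0, 25
# Many replications of the sample mean of n i.i.d. draws.
samples = rng.normal(0.0, np.sqrt(sigma2), size=(200_000, n))
means = samples.mean(axis=1)

# Var(X̄) = Var((1/n) Σ X_i) = (1/n)² · n·σ² = σ²/n
print(np.var(means), sigma2 / n)  # both ≈ 0.16
```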
13
votes
4 answers

Distribution and moments of $\frac{X_iX_j}{\sum_{i=1}^n X_i^2}$ when $X_i$'s are i.i.d $N(0,\sigma^2)$

Suppose $X_1,X_2,\ldots,X_n$ are independent $N(0,\sigma^2)$ random variables. For $i,j\in \{1,2,\ldots,n\}$, consider $$U=\frac{X_iX_j}{\sum_{i=1}^n X_i^2}$$ Provided $n>1$, we know that $U$ has a Beta distribution when $i=j$…
13
votes
0 answers

Trimmed mean: Take $n$ i.i.d. Gaussians and remove largest $m$ and smallest $m$ points. What is the variance of the mean of the remaining points?

Let $n,m\in\mathbb{Z}$ with $0 \le 2m < n$. Let $X_1, \cdots, X_n$ be i.i.d. standard Gaussians and let $X_{(1)} \le X_{(2)} \le \cdots \le X_{(n)}$ denote their order statistics (i.e., $\{X_1, X_2, \cdots, X_n\} = \{X_{(1)}, X_{(2)}, \cdots,…
Thomas Steinke
  • 945
  • 5
  • 23
12
votes
1 answer

Variance of the Euclidean norm under finite moment assumptions

Let $X = (X_1,X_2, \cdots, X_n)$ be a random vector in $\mathbb{R}^n$ with independent coordinates $X_i$ that satisfy $E[X_i^2]=1$ and $E[X_i^4] \leq K^4$. Then show that $$\operatorname{Var}(\| X\|_2) \leq CK^4$$ where $C$ is an absolute constant and $\| \…
11
votes
2 answers

Maximum of the Variance Function for a Given Set of Bounded Numbers

Let $ \boldsymbol{x} $ be a vector of $n$ numbers in the range $ \left[0, c \right] $, where $ c $ is a positive real number. What is the maximum of the variance of these $n$ numbers? Maximum in the sense of what spread of the numbers will…
Royi
  • 10,050
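Since the (population) variance is convex in each coordinate, its maximum over the box $[0,c]^n$ is attained at a vertex, i.e. with every entry equal to $0$ or $c$. A brute-force sketch over those vertices, compared against the candidate closed form $c^2\,\lfloor n/2\rfloor\,\lceil n/2\rceil / n^2$ (splitting the points as evenly as possible between the endpoints); `max_variance` is an illustrative helper:

```python
import itertools
import numpy as np

def max_variance(n, c):
    """Brute-force max of the population variance over the vertices {0, c}^n.

    Variance is convex in each coordinate, so its maximum over the
    box [0, c]^n is attained at a vertex where every entry is 0 or c.
    """
    return max(np.var(np.array(v, dtype=float))
               for v in itertools.product([0.0, c], repeat=n))

# Compare with c^2 * floor(n/2) * ceil(n/2) / n^2: the maximum splits
# the points as evenly as possible between the two endpoints 0 and c.
for n in range(2, 8):
    print(n, max_variance(n, 1.0), (n // 2) * ((n + 1) // 2) / n**2)
```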