First, this is not homework; it's actually for work. It's been a couple of years since I've done any stats, so I need some help! I've googled this problem but was unable to find any resources that could help answer my question.
I have 25 values:
11.5
11.6
11.9
12.2
12.4
12.4
12.5
12.5
12.5
12.8
12.8
12.9
13.1
13.3
13.5
13.5
13.7
13.7
13.8
13.9
13.9
14
14.3
14.5
15
From here, I calculate the mean, which comes out to 13.128, and from that the variance and then the standard deviation.
The variance formula I'm using (the population variance, dividing by n) and my calculation:
$$ \sigma^{2} =\frac{\sum_{i=1}^{n}(x_{i}-\mu )^{2}}{n}=\frac{\sum_{i=1}^{25}(x_{i}-13.128)^{2}}{25}=0.7996159999999999 $$
Of course, the standard deviation is simply the square root of the variance:
$$ \sigma =\sqrt{0.7996159999999999}=0.8942125027083886 $$
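In case it helps spot where I'm going wrong, here's a minimal Python sketch of the same calculation (just the standard-library `statistics` module); it should reproduce the numbers above:

```python
# Recompute the mean, population variance, and population standard deviation.
import statistics

values = [11.5, 11.6, 11.9, 12.2, 12.4, 12.4, 12.5, 12.5, 12.5, 12.8, 12.8, 12.9,
          13.1, 13.3, 13.5, 13.5, 13.7, 13.7, 13.8, 13.9, 13.9, 14, 14.3, 14.5, 15]

mean = statistics.fmean(values)          # arithmetic mean, expected ≈ 13.128
variance = statistics.pvariance(values)  # population variance (divides by n), expected ≈ 0.799616
std_dev = statistics.pstdev(values)      # population standard deviation, expected ≈ 0.894213

print(mean, variance, std_dev)
```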
Here's where I feel like I'm messing up:
One standard deviation less than the mean:
$$ \mu - \sigma = 13.128 - 0.8942125027083886 = 12.2337874972916114 $$
Two standard deviations less than the mean:
$$ \mu - 2\sigma = 13.128 - 2 \times 0.8942125027083886 = 11.3395749945832228 $$
This value, 11.3395749945832228, falls below the smallest value in the array, 11.5.
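For what it's worth, a quick sketch of that comparison (again just the standard-library `statistics` module, purely to double-check my arithmetic) shows the same thing:

```python
# Where do the one- and two-standard-deviation lower bounds land relative to the data?
import statistics

values = [11.5, 11.6, 11.9, 12.2, 12.4, 12.4, 12.5, 12.5, 12.5, 12.8, 12.8, 12.9,
          13.1, 13.3, 13.5, 13.5, 13.7, 13.7, 13.8, 13.9, 13.9, 14, 14.3, 14.5, 15]

mean = statistics.fmean(values)
std_dev = statistics.pstdev(values)

print(mean - std_dev)      # ≈ 12.2338, still inside the range of the data
print(mean - 2 * std_dev)  # ≈ 11.3396, below the smallest value
print(min(values))         # 11.5
```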
How is this possible? Where am I messing up my calculations? Thank you for any and all help! I really appreciate it.