I am puzzled by the following statement:
"To increase the standard deviation of a set of numbers, you must add a value that is more than one standard deviation away from the mean."
What is the proof of this? Of course I know how the standard deviation is defined, but I seem to be missing this part. Any comments?
standard-deviation
JohnK
Answers:
For any numbers $y_1, y_2, \ldots, y_N$ with mean $\bar y = \frac{1}{N}\sum_{i=1}^N y_i$, the variance is given by
$$\sigma^2 = \frac{1}{N}\sum_{i=1}^N (y_i - \bar y)^2. \tag{1}$$
Applying $(1)$ to the given set of $n$ numbers $x_1, x_2, \ldots, x_n$, which for convenience of exposition we take to have mean $\bar x = 0$, we have
$$\sigma^2 = \frac{1}{n}\sum_{i=1}^n x_i^2.$$
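As a quick numerical sanity check of the variance formula (a sketch in Python using only the standard library; the data are made up and chosen to have mean zero, matching the convention above):

```python
import statistics

# Made-up data with mean 0, matching the zero-mean convention.
x = [-3.0, -1.0, 0.0, 1.0, 3.0]
n = len(x)
assert statistics.mean(x) == 0.0

# Population variance: the average of the squared deviations from the mean.
# With mean 0 this reduces to the average of the squared values.
variance = sum(xi ** 2 for xi in x) / n
print(variance)                 # 4.0
print(statistics.pvariance(x))  # 4.0, agrees with the library
```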
The puzzling statement gives a necessary but insufficient condition for the standard deviation to increase. If the old sample size is $n$, the old mean is $m$, the old standard deviation is $s$, and a new point $x$ is added to the data, then the new standard deviation will be less than, equal to, or greater than $s$ according as $|x - m|$ is less than, equal to, or greater than $s\sqrt{1 + 1/n}$.
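That cutoff can be checked numerically (a sketch in Python with made-up data; note I use the population standard deviation, `statistics.pstdev`, which divides by $n$ — that is the convention under which the $s\sqrt{1+1/n}$ threshold is exact):

```python
import math
import statistics

data = [1, 2, 3, 4, 5]             # made-up sample
n = len(data)
m = statistics.mean(data)          # 3
s = statistics.pstdev(data)        # sqrt(2)

threshold = s * math.sqrt(1 + 1 / n)

# A new point exactly at the threshold leaves the sd unchanged...
same = statistics.pstdev(data + [m + threshold])
# ...a point inside it shrinks the sd, a point beyond it grows the sd.
smaller = statistics.pstdev(data + [m + 0.5 * threshold])
bigger = statistics.pstdev(data + [m + 2.0 * threshold])

print(math.isclose(same, s))       # True
print(smaller < s < bigger)        # True
```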
Leaving aside the algebra (which also works out), think about it this way: the standard deviation is the square root of the variance, and the variance is the average of the squared distances from the mean. If we add a value whose squared distance from the mean is smaller than that average, the variance will shrink; if we add a value whose squared distance is larger, it will grow. (Strictly speaking, the new point also shifts the mean slightly, which is why the exact threshold is $s\sqrt{1+1/n}$ rather than $s$.)
This is true of any average of non-negative values: if you add a value that is higher than the current mean, the mean increases, and if you add a value that is lower, it decreases.
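That one-line fact about averages can be demonstrated directly (a minimal Python sketch with invented numbers):

```python
import statistics

values = [2.0, 4.0, 6.0]   # made-up data, mean 4.0
m = statistics.mean(values)

# A new value above the current mean pulls the mean up,
# and a new value below it pulls the mean down.
print(statistics.mean(values + [10.0]) > m)  # True
print(statistics.mean(values + [1.0]) < m)   # True
```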
I'll get you started on the algebra, but won't take it quite all of the way. First, standardize your data by subtracting the mean and dividing by the standard deviation:
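The standardization step being described might look like this in Python (a sketch; the data are invented, and I use population statistics so the standardized values have mean 0 and standard deviation exactly 1):

```python
import statistics

x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # made-up data
m = statistics.mean(x)                         # 5.0
s = statistics.pstdev(x)                       # 2.0

# Subtract the mean and divide by the standard deviation.
z = [(xi - m) / s for xi in x]

print(statistics.mean(z))    # 0.0
print(statistics.pstdev(z))  # 1.0
```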