Standard deviation compared to variance
If I understand correctly, the standard deviation shows how close the data are to the mean, while variance only helps with finding whether the data has outliers?
Hello,
Standard deviation is a measure of dispersion. It shows you how far from the mean you would expect observations to be, on average.
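Here is a minimal sketch in Python of what that means concretely (the sample values are made up for illustration):

```python
import statistics

# Hypothetical sample data (made-up values for illustration)
data = [4, 8, 6, 5, 3, 7]

mean = statistics.mean(data)   # 5.5
sd = statistics.stdev(data)    # sample standard deviation, ~1.87

# The standard deviation summarizes the typical distance of an
# observation from the mean, in the same units as the data itself.
print(f"mean = {mean}, standard deviation = {sd:.2f}")
```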
Hope this helps!
Best,
Ned
I don't understand how that is different from variance. In other words, to me, it sounds like variance also "shows you how far from the mean would you expect observations to be on average." Am I right or wrong?
3 points from the video:
1-Variance and standard deviation both measure the dispersion of a set of data points around its mean value.
2-Variance is large and hard to compare because its unit of measurement is squared.
3-Standard deviation will be much more meaningful than variance as the main measure of variability.
I understand those points separately, although with point 2: if you had the variances of more than one dataset, the biggest variance would indicate the largest dispersion, right? Of course, it's hard to know how large the dispersion is from variance alone because the unit of measurement is squared, but if you had two variances, couldn't you see which is the bigger number and say that one has the larger dispersion? Knowing which dispersion is larger or smaller isn't worth much on its own, but it does lead you in the right direction. With standard deviation being more useful, why even use variance?
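For what it's worth, a small sketch (Python, made-up numbers) of the comparison you describe: ranking datasets by variance and by standard deviation always agrees, because the square root is an increasing function, so variance alone is indeed enough to say which dataset is more dispersed even if its magnitude is hard to interpret.

```python
import statistics

# Two hypothetical datasets (made-up values for illustration)
a = [10, 12, 11, 13, 10, 12]
b = [5, 20, 2, 18, 7, 14]

var_a, var_b = statistics.variance(a), statistics.variance(b)
sd_a, sd_b = statistics.stdev(a), statistics.stdev(b)

# The ordering by variance matches the ordering by standard
# deviation, since sqrt() preserves order for non-negative numbers.
print(f"variances: a = {var_a:.2f}, b = {var_b:.2f}")
print(f"std devs:  a = {sd_a:.2f}, b = {sd_b:.2f}")
print("b more dispersed by variance:", var_b > var_a)  # True
print("b more dispersed by std dev: ", sd_b > sd_a)    # True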
@dylan
To me, both sound almost the same, with variance just being squared. It makes more sense to use STDEV since it is in the same units as the data being compared. For example, if the data were in inches, the variance would be in inches squared while the standard deviation would be in inches. This makes it easier to compare.
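A quick sketch of that units point (Python, with made-up heights in inches):

```python
import statistics

# Hypothetical heights in inches (made-up values for illustration)
heights_in = [60, 64, 62, 68, 66, 70]

var_sq_in = statistics.variance(heights_in)  # units: inches squared
sd_in = statistics.stdev(heights_in)         # units: inches

# The variance (14 sq in) has no direct physical reading; the
# standard deviation is back in inches, so it can be read as
# "observations are typically about 3.7 inches from the mean".
print(f"variance = {var_sq_in:.2f} sq in, std dev = {sd_in:.2f} in")
```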
Please, more explanation is needed on how the coefficient of variation is involved.