

Sigma MathNet

Why do we use (n - 1) in the denominator of the standard deviation formula?

In short, let x1, x2, ..., xN be independent and identically distributed random variables. The standard deviation of the entire population is given by the formula:

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i - \mu)^2},$$

where $\mu$ is the population mean.

However, if x1, x2, ..., xn (n < N) is not the set of variables of the entire population, but only a sample of the population, then we use the standard deviation formula:

$$s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2},$$

where $\bar{x}$ is the sample mean.

Why do we do this?

Not so technical: we use (n - 1) because we just like to make the "spread" (or deviation) a little larger to reflect the fact that, since we are using a sample, not the entire population, we have more uncertainty.
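This intuition can be checked numerically. Below is a minimal Python sketch using the standard-library `statistics` module on a made-up data set; on the same data, the (n - 1) formula always gives a slightly larger spread than the n formula:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical data set, mean = 5

# Population formula: divide the sum of squared deviations by n
pop_sd = statistics.pstdev(data)   # = 2.0 on this data

# Sample formula: divide by (n - 1), which inflates the spread slightly
samp_sd = statistics.stdev(data)   # ≈ 2.14 on this data

print(pop_sd, samp_sd)
```

The gap between the two shrinks as n grows, since n/(n - 1) approaches 1.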

Degrees of freedom
Degrees of freedom are a measure of how much precision an estimate of variation has. A general rule is that the degrees of freedom decrease when we have to estimate more parameters. Before we can compute the standard deviation, we first have to estimate the mean. This causes us to lose one degree of freedom, so we should divide by (n - 1) rather than n. In more complex situations, like Analysis of Variance and Multiple Linear Regression, we usually have to estimate more than one parameter, and measures of variation from these procedures have even smaller degrees of freedom.

Unbiased estimator
A more formal way to clarify the situation is to say that s² (the square of the sample standard deviation) is an unbiased estimator of σ², the population variance, when the denominator uses (n - 1). Suppose we are trying to estimate a parameter θ using an estimator θ̂ (that is, some function of the observed data). Then the bias of θ̂ is defined to be E(θ̂) - θ, in short, "the expected value of the estimator minus the true value θ."




If E(θ̂) - θ = 0, then the estimator θ̂ is unbiased.
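To see the bias definition in action, here is a Monte Carlo sketch in Python (the Gaussian population, sample size, trial count, and seed are illustrative assumptions, not from the original article). It estimates E(estimator) - true value for the variance formulas with denominators n and (n - 1):

```python
import random
import statistics

# Draw many small samples from a population whose variance we know
# (standard normal, so sigma^2 = 1), and estimate the bias
# E(estimator) - true value for the two variance formulas.
random.seed(0)
sigma2 = 1.0   # true population variance
n = 5          # sample size (assumed, deliberately small)
trials = 20000

sum_n = 0.0          # running total for the denominator-n formula
sum_n_minus_1 = 0.0  # running total for the denominator-(n-1) formula
for _ in range(trials):
    sample = [random.gauss(0, 1) for _ in range(n)]
    sum_n += statistics.pvariance(sample)       # divides by n
    sum_n_minus_1 += statistics.variance(sample)  # divides by n - 1

bias_n = sum_n / trials - sigma2                  # ≈ -sigma2/n = -0.2
bias_n_minus_1 = sum_n_minus_1 / trials - sigma2  # ≈ 0

print(bias_n, bias_n_minus_1)
```

The n-denominator formula comes out biased low by about σ²/n, while the (n - 1) formula shows no systematic bias.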

Variance formula using n instead of (n - 1) is biased

Suppose we use the variance formula (square of the standard deviation) with n in the denominator:

$$s_n^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2.$$

We are going to show that

$$E(s_n^2) = \frac{n-1}{n}\,\sigma^2. \qquad (2)$$

Since E(s_n²) ≠ σ², s_n² is a biased estimator of σ² if we use (n) instead of (n - 1).

We would like to clarify some points before we prove (2). Firstly, if X and Y are independent random variables and k is a constant, we have:

$$\mathrm{Var}(kX) = k^2\,\mathrm{Var}(X), \qquad (3)$$

$$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y). \qquad (4)$$

The proofs of (3) and (4) are left to the reader. Secondly, if $\bar{x}$ is the sample mean and $\mu$ is the population mean, then

$$\mathrm{Var}(\bar{x}) = \mathrm{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n}x_i\right) = \frac{1}{n^2}\sum_{i=1}^{n}\mathrm{Var}(x_i) = \frac{\sigma^2}{n},$$

since x1, x2, ..., xn are independent and identically distributed, each having the same variance σ².
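The fact that Var(x̄) = σ²/n can also be checked by simulation. The following Python sketch (with an assumed σ = 2, n = 8, trial count, and seed) estimates the variance of the sample mean over many repeated samples:

```python
import random
import statistics

# Simulation check of Var(xbar) = sigma^2 / n for the mean of
# n i.i.d. Gaussian draws. All parameters here are assumptions.
random.seed(2)
sigma = 2.0
n = 8
trials = 20000

means = []
for _ in range(trials):
    sample = [random.gauss(0, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

var_of_mean = statistics.pvariance(means)
print(var_of_mean)  # ≈ sigma**2 / n = 0.5
```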

Proof of (2)




First expand each squared deviation about the population mean:

$$(x_i - \bar{x})^2 = \big((x_i - \mu) - (\bar{x} - \mu)\big)^2 = (x_i - \mu)^2 - 2(x_i - \mu)(\bar{x} - \mu) + (\bar{x} - \mu)^2.$$

Taking the summation, and using $\sum_{i=1}^{n}(x_i - \mu) = n(\bar{x} - \mu)$, we have

$$\sum_{i=1}^{n}(x_i - \bar{x})^2 = \sum_{i=1}^{n}(x_i - \mu)^2 - n(\bar{x} - \mu)^2. \qquad (8)$$

Taking the expectation of (8), and using $E[(\bar{x} - \mu)^2] = \mathrm{Var}(\bar{x}) = \sigma^2/n$,

$$E\left[\sum_{i=1}^{n}(x_i - \bar{x})^2\right] = n\sigma^2 - n\cdot\frac{\sigma^2}{n} = (n-1)\,\sigma^2.$$

Dividing by n gives $E(s_n^2) = \frac{n-1}{n}\sigma^2$, which proves (2); dividing by (n - 1) instead gives $E(s^2) = \sigma^2$, so the (n - 1) formula is unbiased.
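The algebraic identity summed in (8) can be verified numerically. This Python sketch (with made-up data and an assumed population mean μ) checks that both sides agree up to floating-point rounding:

```python
import random

# Numeric check of the identity behind (8), on hypothetical data
# with an assumed population mean mu:
#   sum (x_i - xbar)^2 = sum (x_i - mu)^2 - n * (xbar - mu)^2
random.seed(1)
mu = 3.0
xs = [random.gauss(mu, 2) for _ in range(10)]
n = len(xs)
xbar = sum(xs) / n

lhs = sum((x - xbar) ** 2 for x in xs)
rhs = sum((x - mu) ** 2 for x in xs) - n * (xbar - mu) ** 2
print(lhs, rhs)  # equal up to floating-point rounding
```

Note that the identity holds for any value of mu, not just the true population mean; the expectation step is where mu being the population mean matters.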

http# www$%c$e&u$hk math '&vance&(2!)eve* Stan&a+&,&eviation$htm

21 !2 2!1"