Components of the Mean Square Error of an Estimator

If you have a parameter \(\theta\) and an estimator \(\hat\theta\) of that parameter, then the mean square error of the estimator is defined as:

$$
MSE[\hat\theta] = E\Big[(\hat\theta - \theta)^2\Big]
$$
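As a quick numerical illustration of this definition, here is a minimal Monte Carlo sketch, assuming NumPy is available. The choice of the sample mean of a normal sample as the estimator is just for illustration; any estimator would do.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0          # true parameter: the mean of the distribution
n, trials = 20, 100_000

# Draw many independent samples and compute the estimator on each one.
theta_hat = rng.normal(loc=theta, scale=2.0, size=(trials, n)).mean(axis=1)

# MSE[theta_hat] = E[(theta_hat - theta)^2], approximated by averaging
# the squared error over the Monte Carlo trials.
mse = np.mean((theta_hat - theta) ** 2)
print(mse)           # close to sigma^2 / n = 4 / 20 = 0.2
```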

The MSE can be broken into two components: the variance of the estimator, which measures its precision, and the bias of the estimator, which measures its accuracy:

\begin{align}
MSE[\hat\theta] &= E\Big[(\hat\theta - \theta)^2\Big] \\
&= E\Big[(\hat\theta - \theta)(\hat\theta - \theta)\Big] \\
&= E\Big[\hat\theta^2 - 2\hat\theta\theta + \theta^2\Big] \\
&= E\Big[\hat\theta^2\Big] - E[2\hat\theta\theta] + E\Big[\theta^2\Big]
\end{align}

Since \(\theta\) is a fixed parameter rather than a random variable, it can be pulled out of the expectations: \(E[\theta] = \theta\) and \(E[\theta^2] = \theta^2\). So:

\begin{align}
MSE[\hat\theta] &= E\Big[\hat\theta^2\Big] - E[2\hat\theta\theta] + E\Big[\theta^2\Big] \\
&= E\Big[\hat\theta^2\Big] - 2\theta E[\hat\theta] + \theta^2
\end{align}

By the definition of variance we know that:

\begin{align}
Var[\hat\theta] &= E\Big[\hat\theta^2\Big] - (E[\hat\theta])^2 \\
E\Big[\hat\theta^2\Big] &= Var[\hat\theta] + (E[\hat\theta])^2
\end{align}

so we can substitute for \(E\Big[\hat\theta^2\Big]\) in the equation for the MSE:

\begin{align}
MSE[\hat\theta] &= E\Big[\hat\theta^2\Big] - 2\theta E[\hat\theta] + \theta^2 \\
&= Var[\hat\theta] + (E[\hat\theta])^2 - 2\theta E[\hat\theta] + \theta^2 \\
&= Var[\hat\theta] + (E[\hat\theta] - \theta)^2 \\
&= Var[\hat\theta] + (B[\hat\theta])^2
\end{align}

\(B[\hat\theta] = E[\hat\theta] - \theta\) is the bias of the estimator.
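The decomposition can also be checked numerically. Below is a minimal sketch, again assuming NumPy; the estimator is the biased (divide-by-\(n\)) sample variance of a normal sample, chosen only because its bias is nonzero, so both terms of the decomposition contribute.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0                       # true parameter: the variance
n, trials = 10, 200_000

samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(trials, n))
theta_hat = samples.var(axis=1)    # ddof=0 by default: the biased MLE of the variance

mse = np.mean((theta_hat - sigma2) ** 2)   # E[(theta_hat - theta)^2]
var = np.var(theta_hat)                    # Var[theta_hat]
bias = np.mean(theta_hat) - sigma2         # B[theta_hat] = E[theta_hat] - theta

# The two quantities agree up to Monte Carlo error.
print(mse, var + bias ** 2)
```

For this example the bias is \((n-1)/n \cdot \sigma^2 - \sigma^2 = -\sigma^2/n = -0.4\), so the squared bias contributes \(0.16\) to the MSE on top of the variance term.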
