Definition [1]: The mean squared error (MSE) of an estimator \(W\) of a parameter \(\theta\) is \(E_{\theta}(W-\theta)^2\).

It is quite useful because it admits the famous bias-variance decomposition. The subscript \(\theta\) indicates that the expectation is taken with respect to the distribution indexed by \(\theta\), i.e., \(E_{\theta}(W-\theta)^2=\int (w-\theta)^2\,dP_{\theta}(w)\).
Below is a derivation of the bias-variance decomposition.
\[\begin{aligned}
E_{\theta}(W-\theta)^2&=E_{\theta}\bigl(W-E_{\theta}W+E_{\theta}W-\theta\bigr)^2\\
&=E_{\theta}(W-E_{\theta}W)^2+E_{\theta}(E_{\theta}W-\theta)^2+2E_{\theta}\bigl[(W-E_{\theta}W)(E_{\theta}W-\theta)\bigr]\\
&=\mathrm{Var}_{\theta}(W)+(E_{\theta}W-\theta)^2+2(E_{\theta}W-\theta)E_{\theta}(W-E_{\theta}W)\\
&=\mathrm{Var}_{\theta}(W)+(E_{\theta}W-\theta)^2\\
&=\mathrm{Variance}+\mathrm{Bias}^2
\end{aligned}\]
The cross term vanishes because \(E_{\theta}(W-E_{\theta}W)=E_{\theta}W-E_{\theta}W=0\), and since \(E_{\theta}W-\theta\) is a constant, \(E_{\theta}(E_{\theta}W-\theta)^2=(E_{\theta}W-\theta)^2\).
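As a quick sanity check on the identity above, here is a minimal Monte Carlo sketch in Python (NumPy assumed). The estimator \(W=\sum_i X_i/(n+1)\) and all numerical values are hypothetical choices for illustration, picked so that both the bias and the variance terms are nonzero.

```python
import numpy as np

# Monte Carlo check of MSE = Variance + Bias^2 for a deliberately biased
# estimator W = sum(X_i) / (n + 1) of theta, with X_i ~ N(theta, sigma^2).
# All values below are hypothetical choices for illustration.
rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 1.0, 10, 200_000

X = rng.normal(theta, sigma, size=(reps, n))   # reps independent samples of size n
W = X.sum(axis=1) / (n + 1)                    # one estimate per replication

mse = np.mean((W - theta) ** 2)                # E_theta (W - theta)^2
var_plus_bias_sq = np.var(W) + (np.mean(W) - theta) ** 2

print(mse)                # simulated MSE
print(var_plus_bias_sq)   # Variance + Bias^2; agrees up to Monte Carlo error
```

Both printed quantities agree, up to Monte Carlo error, with the closed-form value \(n\sigma^2/(n+1)^2+\theta^2/(n+1)^2\) for this estimator.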
Reference
[1] Casella, G., & Berger, R. (2024). Statistical inference. Chapman and Hall/CRC.