Needs:
Real Numbers
Estimators
Needed by:
None.

Unbiased Estimators

Definition

Consider a random variable $x: \Omega \to \R^n$. The error of an estimate $\xi \in \R^n$ is the random variable $e: \Omega \to \R^n$ defined by $e(\omega) = x(\omega) - \xi$. The bias of the estimate is the expected value of its error, and the estimate is unbiased if its bias is zero.
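As a quick check of the definition, write $\operatorname{E}$ for expectation (a symbol not fixed elsewhere on this sheet) and suppose $x$ has mean $\mu = \operatorname{E}[x]$. Then for the constant estimate $\xi$,

\[ \operatorname{E}[e] = \operatorname{E}[x - \xi] = \operatorname{E}[x] - \xi = \mu - \xi, \]

so $\xi$ is unbiased exactly when $\xi = \mu$.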

Likewise, if $y: \Omega \to \R^m$ is another random variable, then the error of an estimator $f: \R^m \to \R^n$ of $x$ given $y$ is the random variable $e: \Omega \to \R^n$ defined by $e(\omega) = x(\omega) - f(y(\omega))$. As before, the bias of the estimator is the expected value of its error, and the estimator is unbiased if its bias is zero. A numerical sketch of this case follows below.
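The following Monte Carlo sketch makes the estimator case concrete. The scalar model ($n = m = 1$, $y = x + v$ with Gaussian noise $v$) and the two estimators are illustrative choices, not part of this sheet; the sample mean of the error approximates the bias $\operatorname{E}[e]$.

    # Monte Carlo check of estimator bias, following the definitions above.
    # Assumed model: x ~ N(0, 1), observed as y = x + v with v ~ N(0, 0.5^2).
    import numpy as np

    rng = np.random.default_rng(0)
    N = 1_000_000

    x = rng.normal(0.0, 1.0, size=N)   # latent variable x(omega)
    v = rng.normal(0.0, 0.5, size=N)   # measurement noise
    y = x + v                          # observed variable y(omega)

    # Estimator 1: f(y) = y.  Error e = x - f(y) = -v, so E[e] = 0 (unbiased).
    e_unbiased = x - y

    # Estimator 2: f(y) = y + 0.3.  Error e = -v - 0.3, so E[e] = -0.3 (biased).
    e_biased = x - (y + 0.3)

    print("empirical bias, f(y) = y       :", e_unbiased.mean())  # close to 0
    print("empirical bias, f(y) = y + 0.3 :", e_biased.mean())    # close to -0.3

Running the script prints an empirical bias near zero for the first estimator and near $-0.3$ for the second, matching the definition.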

Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc