There is a natural predictor corresponding to a normal linear model.
Let $(x: \Omega \to \R ^d, A \in \R ^{n \times d}, e: \Omega \to \R ^n)$ be a normal linear model over the probability space $(\Omega , \mathcal{A} , \mathbfsf{P} )$.
We are modeling $h_\omega : \R ^d \to \R $ by $h_\omega (a) = \transpose{x(\omega )}a$. The predictive density for a dataset $c^1, \dots , c^m \in \R ^d$ is the conditional density of the random vector $(h_{(\cdot )}(c^1), \dots , h_{(\cdot )}(c^m))$ given $y = \gamma $. Writing $C \in \R ^{m \times d}$ for the matrix whose $i$th row is $\transpose{(c^i)}$, this density is normal with mean
\[ C\Sigma _{x}\transpose{A}\inv{(A\Sigma _{x}\transpose{A} + \Sigma _e)}\gamma \]
and covariance
\[ C\Sigma _{x}\transpose{C} - C\Sigma _{x}\transpose{A}\inv{(A\Sigma _{x}\transpose{A} + \Sigma _e)}A\Sigma _{x}\transpose{C}. \]
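For concreteness, the following is a minimal numpy sketch of these two formulas. It assumes, as the expressions above suggest, that $x$ and $e$ are independent with mean zero and covariances $\Sigma _{x}$ and $\Sigma _e$; the function and variable names are illustrative, not part of the model.

\begin{verbatim}
import numpy as np

def predictive_params(C, A, Sigma_x, Sigma_e, gamma):
    # S = A Sigma_x A^T + Sigma_e, assumed invertible
    S = A @ Sigma_x @ A.T + Sigma_e
    # K = C Sigma_x A^T, the cross-covariance between predictions and y
    K = C @ Sigma_x @ A.T
    mean = K @ np.linalg.solve(S, gamma)                   # K S^{-1} gamma
    cov = C @ Sigma_x @ C.T - K @ np.linalg.solve(S, K.T)  # K S^{-1} K^T
    return mean, cov
\end{verbatim}

Solving the linear system with \texttt{np.linalg.solve} avoids forming the inverse of $A\Sigma _{x}\transpose{A} + \Sigma _e$ explicitly, which is both cheaper and numerically more stable.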
The normal linear model predictor or normal linear model regressor for the normal linear model $(x, A, e)$ is the predictor which assigns to a new point $a \in \R ^d$ the mean of the predictive density at $a$. That is, the predictor $g: \R ^d \to \R $ defined by
\[ g(a) = \transpose{a}\Sigma _{x}\transpose{A}\inv{(A\Sigma _{x}\transpose{A} + \Sigma _e)}\gamma . \]
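In code, this amounts to taking $C = \transpose{a}$ in the sketch above. Since the formula is linear in $a$, the weight vector $\Sigma _{x}\transpose{A}\inv{(A\Sigma _{x}\transpose{A} + \Sigma _e)}\gamma $ can be computed once and reused; a sketch under the same assumptions as before:

\begin{verbatim}
import numpy as np

def predictor(A, Sigma_x, Sigma_e, gamma):
    # Precompute w = Sigma_x A^T (A Sigma_x A^T + Sigma_e)^{-1} gamma.
    S = A @ Sigma_x @ A.T + Sigma_e
    w = Sigma_x @ A.T @ np.linalg.solve(S, gamma)
    # g(a) = a^T w: each prediction is then a single inner product.
    return lambda a: a @ w
\end{verbatim}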
The use of a normal linear model predictor is often called Bayesian linear regression. The word Bayesian refers to treating the parameter vector of the function, $x$, as a random variable.