We use a normal random function model to construct a regressor.
Let $F: \Omega \to (A \to \R )$ be a normal random function with mean function $m: A \to \R $ and covariance function $k: A \times A \to \R $ over the probability space $(\Omega , \mathcal{A} , \mathbfsf{P} )$. Let $f: A \to (\Omega \to \R )$ be the family of random variables (the stochastic process) associated with $F$, so that $f(a)(\omega ) = F(\omega )(a)$.
Let $e$ be a normal random vector with mean
zero and covariance $\Sigma _{e}$.
Let $a^1, \dots , a^n \in A$.
We sometimes call the sequence $a^1, \dots ,
a^n$ the design.
Define $y: \Omega \to \R ^n$ by
\[
y_i = f(a^i) + e_i, \qquad i = 1, \dots , n.
\]
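As a numerical sketch of this observation model: the kernel $k$, design, and noise covariance below are illustrative assumptions (a squared-exponential kernel, zero mean function, and isotropic noise), not choices made in the text.

```python
import numpy as np

# Assumed squared-exponential covariance k(x, x') = exp(-(x - x')^2 / 2);
# the text leaves k abstract, so this particular kernel is an assumption.
def k(x, xp):
    return np.exp(-0.5 * (x - xp) ** 2)

rng = np.random.default_rng(0)

a = np.array([0.0, 0.5, 1.0, 1.5])      # design a^1, ..., a^n (illustrative)
n = len(a)
m_a = np.zeros(n)                        # mean function m = 0 (assumption)
K_a = k(a[:, None], a[None, :])          # (K_a)_{ij} = k(a^i, a^j)
Sigma_e = 0.1 * np.eye(n)                # noise covariance Sigma_e (assumption)

# y = f(a) + e is jointly normal with mean m_a and covariance K_a + Sigma_e,
# so one draw of the observation vector is:
y = rng.multivariate_normal(m_a, K_a + Sigma_e)
```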
Let $b^1, \dots , b^m \in A$. Define $z: \Omega \to \R ^m$ by $z_i = f(b^i)$ for $i = 1, \dots , m$. So $z_i$ is the random variable corresponding to the family at index $b^i \in A$. Then $(y, z)$ is normal. We call the conditional density of $z$ given $y$ the predictive density for $b$ given $a$.
Define $\Sigma _{a} \in \R ^{n \times n}$ by\[ \pmat{ k(a^1, a^1) & \cdots & k(a^1, a^n) \\ \vdots & \ddots & \vdots \\ k(a^n, a^1) & \cdots & k(a^n, a^n) \\ } \]
and define $\Sigma _{ba} \in \R ^{m \times n}$ by\[ \pmat{ k(b^1, a^1) & \cdots & k(b^1, a^n) \\ \vdots & \ddots & \vdots \\ k(b^m, a^1) & \cdots & k(b^m, a^n) \\ }. \] Set $\Sigma _{ab} = \Sigma _{ba}^{\top}$, and define $\Sigma _{b} \in \R ^{m \times m}$ by $(\Sigma _{b})_{ij} = k(b^i, b^j)$.
The predictive density $g_{z \mid y}(\cdot , \gamma ): \R ^m \to \R $ of $b^1, \dots , b^m$ for design $a^1, \dots , a^n$ is normal with mean\[ m_b + \Sigma _{ba}\inv{(\Sigma _{a} + \Sigma _{e})}(\gamma - m_a) \]
and covariance\[ \Sigma _{b} - \Sigma _{ba}\inv{(\Sigma _{a} + \Sigma _{e})}\Sigma _{ab}, \]
where $m_a = (m(a^1), \dots , m(a^n))$ and $m_b = (m(b^1), \dots , m(b^m))$.
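The predictive mean and covariance can be computed directly from these formulas. The sketch below again assumes a squared-exponential kernel, zero mean function, and illustrative design, prediction points, and observed values $\gamma $; it applies the inverse via a linear solve rather than forming it explicitly.

```python
import numpy as np

# Assumed kernel, as before; not specified by the text.
def k(x, xp):
    return np.exp(-0.5 * (x - xp) ** 2)

a = np.array([0.0, 0.5, 1.0, 1.5])       # design a^1, ..., a^n (illustrative)
b = np.array([0.25, 0.75])               # prediction points b^1, ..., b^m
gamma = np.array([0.1, -0.2, 0.3, 0.0])  # observed value of y (illustrative)

Sigma_a  = k(a[:, None], a[None, :])     # n x n
Sigma_ba = k(b[:, None], a[None, :])     # m x n
Sigma_b  = k(b[:, None], b[None, :])     # m x m
Sigma_e  = 0.1 * np.eye(len(a))          # noise covariance (assumption)
m_a = np.zeros(len(a))                   # zero mean function (assumption)
m_b = np.zeros(len(b))

# Predictive mean: m_b + Sigma_ba (Sigma_a + Sigma_e)^{-1} (gamma - m_a)
A = Sigma_a + Sigma_e
mean = m_b + Sigma_ba @ np.linalg.solve(A, gamma - m_a)

# Predictive covariance: Sigma_b - Sigma_ba (Sigma_a + Sigma_e)^{-1} Sigma_ab
cov = Sigma_b - Sigma_ba @ np.linalg.solve(A, Sigma_ba.T)
```

Using `np.linalg.solve` instead of inverting $\Sigma _{a} + \Sigma _{e}$ is numerically preferable; for larger designs a Cholesky factorization of the same matrix would be reused for both the mean and the covariance.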