Needs:
Input Designs
Linear Transformations
Matrix Transpose
Real Function Approximators
Needed by:
Least Squares Linear Regressors

Linear Predictors

Why

When the input and output sets are vector spaces, one of the simplest classes of predictors is the class of linear predictors.

Definition

A linear predictor (also called a linear model or deterministic linear model) is a predictor that is a linear function of its input.
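Here linearity is in the usual sense of linear transformations: for a predictor $f: X \to Y$ between vector spaces over $\R$,

\[ f(\alpha x + \beta x') = \alpha f(x) + \beta f(x') \quad \text{for all } x, x' \in X \text{ and all } \alpha, \beta \in \R. \]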

Such a model is simple to implement and easy to interpret, at the cost of flexibility.

$\R ^d$ Example

Let $X = \R^d$ be the set of inputs and $Y = \R$ the set of outputs. The linear functions on $\R^d$ are in one-to-one correspondence with vectors in $\R^d$.

A linear function $f: \R^d \to \R$ on the vector space $(\R^d, \R)$ is determined by a vector of parameters $w \in \R^d$ such that

\[ f(x) = \sum_{i=1}^{d} w_i x_i = w^\top x. \]

The parameters of a linear predictor on $\R ^d$ are often called weights.
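For concreteness, here is a minimal sketch of such a predictor in Python with NumPy; the function name and the numeric values are illustrative, not taken from the source.

import numpy as np

# Minimal sketch: a linear predictor on R^d is determined by its weight
# vector w, and prediction is the inner product w^T x.
def linear_predictor(w, x):
    return float(np.dot(w, x))

w = np.array([0.5, -1.0, 2.0])  # weights (parameters), d = 3
x = np.array([1.0, 2.0, 3.0])   # input in R^3
linear_predictor(w, x)          # 0.5*1 + (-1)*2 + 2*3 = 4.5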
