\(\DeclarePairedDelimiterX{\Set}[2]{\{}{\}}{#1 \nonscript\;\delimsize\vert\nonscript\; #2}\) \( \DeclarePairedDelimiter{\set}{\{}{\}}\) \( \DeclarePairedDelimiter{\parens}{\left(}{\right)}\) \(\DeclarePairedDelimiterX{\innerproduct}[1]{\langle}{\rangle}{#1}\) \(\newcommand{\ip}[1]{\innerproduct{#1}}\) \(\newcommand{\bmat}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\barray}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\mat}[1]{\begin{matrix}#1\end{matrix}}\) \(\newcommand{\pmat}[1]{\begin{pmatrix}#1\end{pmatrix}}\) \(\newcommand{\mathword}[1]{\mathop{\textup{#1}}}\)
Needs:
Polynomials
Feature Maps
Needed by:
Polynomial Fit Models
Links:
Sheet PDF
Graph PDF

Polynomial Regressors

Why

A simple example of an embedding.1

Definition

Fix $d \in \N $. A polynomial feature map of degree $d$ is a function $\phi : \R \to \R ^{d+1}$ with

\[ \phi (x) = \pmat{1 & x & x^2 & \cdots & x^d}^\top . \]

For $x \in \R $, we call $\phi (x)$ the polynomial embedding of $x$.
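For example, if $d = 3$, then

\[ \phi (2) = \pmat{1 & 2 & 2^2 & 2^3}^\top = \pmat{1 & 2 & 4 & 8}^\top . \]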

A polynomial regressor is a least squares linear predictor composed with a polynomial feature map (of some degree, which must be specified to be precise). The task of constructing such a predictor is often referred to as polynomial regression.

Given a dataset of paired records $(x^1, y^1), \dots , (x^n, y^n) \in \R ^2$, one can construct a predictor $g: \R \to \R $ for $y$ by embedding the inputs to obtain $(\phi (x^1), \dots , \phi (x^n))$ and finding the least squares linear regressor $f: \R ^{d+1} \to \R $ for $y$ on the embedded data. One then defines the predictor $g: \R \to \R $ by $g(x) = f(\phi (x))$.
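A minimal sketch of this construction, assuming NumPy; the names `polynomial_regressor` and `g` are ours, not the sheet's. The rows of the design matrix are the embeddings $\phi (x^i)^\top $, and the least squares weights are found with `np.linalg.lstsq`.

```python
import numpy as np

def polynomial_regressor(xs: np.ndarray, ys: np.ndarray, degree: int):
    """Fit g(x) = f(phi(x)) by least squares on the polynomial embeddings of xs."""
    # Design matrix: row i is phi(x^i) = (1, x^i, ..., (x^i)^degree).
    Phi = np.vander(xs, N=degree + 1, increasing=True)
    # Weights w minimizing ||Phi w - ys||^2; f(z) = w . z is the linear regressor.
    w, *_ = np.linalg.lstsq(Phi, ys, rcond=None)

    def g(x: float) -> float:
        # g(x) = f(phi(x)) = w . (1, x, ..., x^degree).
        return float(w @ np.array([x ** k for k in range(degree + 1)]))

    return g

# Example: recover y = 1 + 2x - x^2 from noiseless samples.
xs = np.linspace(-1.0, 1.0, 10)
ys = 1 + 2 * xs - xs ** 2
g = polynomial_regressor(xs, ys, degree=2)
print(g(0.5))  # approximately 1.75
```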


  1. Future editions will expand, or perhaps collapse this sheet. ↩︎
Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc