Author: Eiko
Time: 2025-02-24 11:01:21 - 2025-02-24 11:01:21 (UTC)
Equivalent Definitions of RKHS
Given $k : X \times X \to \mathbb{R}$ positive definite, the RKHS $\mathcal{H}_k$ is given by the following two properties:

- $k(\cdot, x) \in \mathcal{H}_k$ for all $x \in X$;
- (reproducing property) $\langle f, k(\cdot, x) \rangle_{\mathcal{H}_k} = f(x)$ for all $f \in \mathcal{H}_k$ and $x \in X$.
An RKHS is a Hilbert space $\mathcal{H}$ of functions on $X$, such that all evaluation maps $\mathrm{ev}_x : f \mapsto f(x)$ at points $x \in X$ are bounded (equivalently, continuous) linear functionals.
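The two definitions connect through the bound $|f(x)| = |\langle f, k(\cdot, x)\rangle| \le \|f\|_{\mathcal{H}} \sqrt{k(x,x)}$, which follows from the reproducing property and Cauchy–Schwarz. A minimal numerical sketch of this bound, using a Gaussian kernel as an illustrative choice (the kernel, centers, and coefficients below are not from the note):

```python
import numpy as np

# Check bounded point evaluation: for f = sum_i alpha_i k(., x_i),
# the reproducing property plus Cauchy-Schwarz gives
#   |f(x)| = |<f, k(., x)>| <= ||f||_H * sqrt(k(x, x)).
def k(a, b, gamma=1.0):
    # Gaussian kernel, an illustrative positive definite kernel
    return np.exp(-gamma * (a - b) ** 2)

rng = np.random.default_rng(0)
xs = rng.normal(size=5)       # centers x_i
alpha = rng.normal(size=5)    # coefficients alpha_i

K = k(xs[:, None], xs[None, :])       # Gram matrix K_ij = k(x_i, x_j)
norm_f = np.sqrt(alpha @ K @ alpha)   # ||f||_H^2 = alpha^T K alpha

def f(t):
    return k(xs, t) @ alpha           # f(t) = sum_i alpha_i k(x_i, t)

# The bound holds at every point (up to floating-point tolerance).
for x in np.linspace(-3.0, 3.0, 50):
    assert abs(f(x)) <= norm_f * np.sqrt(k(x, x)) + 1e-12
```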
Relation With Sobolev Spaces
Let

$$H^s(\mathbb{R}^d) = \{ f \in L^2(\mathbb{R}^d) : \|f\|_{H^s} < \infty \},$$

where

$$\|f\|_{H^s}^2 = \int_{\mathbb{R}^d} (1 + |\xi|^2)^s \, |\hat{f}(\xi)|^2 \, d\xi.$$
Remarks
In one dimension, $H^1(\mathbb{R})$ already has continuous point evaluations, making it an RKHS. But in higher dimensions, controlling only first-order derivatives might not be enough anymore.
$H^s(\mathbb{R}^d)$ is an RKHS iff $s > d/2$ (ref: Sobolev embedding theorem).
If we define some general version of derivatives (weak derivatives, fractional derivatives, etc.), then every RKHS can be viewed as some Sobolev-type space.
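As a concrete instance of the one-dimensional remark (assuming the norm $\|f\|_{H^1}^2 = \int_{\mathbb{R}} f^2 + (f')^2 \, dt$, a normalization choice not fixed by the note), the reproducing kernel of $H^1(\mathbb{R})$ can be written explicitly:

$$k(x, y) = \tfrac{1}{2} e^{-|x - y|},$$

since $k(\cdot, x)$ is the Green's function of $1 - \partial_t^2$, i.e. $(1 - \partial_t^2)\, k(t, x) = \delta_x(t)$, so integration by parts gives

$$\langle f, k(\cdot, x) \rangle_{H^1} = \int_{\mathbb{R}} f(t)\,(1 - \partial_t^2)\, k(t, x)\, dt = f(x).$$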
Kernel Ridge Regression
Given some data $\{(x_i, y_i)\}_{i=1}^n$ as a finite subset of $X \times \mathbb{R}$, we want to find a function fitting the data in the space $\mathcal{H}_k$.
Problem. Fix $\lambda > 0$, minimize

$$J(f) = \sum_{i=1}^n \big( f(x_i) - y_i \big)^2 + \lambda \|f\|_{\mathcal{H}_k}^2,$$

where $\lambda$ is the regularization parameter.
Representer Theorem
If $f^*$ minimizes the above problem for $\lambda > 0$, then $f^*$ must be a finite linear combination of the kernel functions $k(\cdot, x_i)$ for $x_i$ in the data set.
To prove this theorem we use an orthogonal decomposition $f = f_V + f_\perp$, where $V$ is the finite-dimensional subspace of $\mathcal{H}_k$ spanned by the $k(\cdot, x_i)$ for $x_i$ in the data set and $f_\perp \in V^\perp$. The reproducing property gives $f_\perp(x_i) = \langle f_\perp, k(\cdot, x_i) \rangle = 0$, so if we put $f = f_V + f_\perp$ in the minimization goal, we get

$$J(f) = \sum_{i=1}^n \big( f_V(x_i) - y_i \big)^2 + \lambda \big( \|f_V\|^2 + \|f_\perp\|^2 \big).$$

We see that minimizing it yields $f_\perp = 0$ (unless $\lambda = 0$).
Now, writing $f = \sum_i \alpha_i k(\cdot, x_i)$, we are trying to minimize

$$J(\alpha) = \sum_{j=1}^n \Big( \sum_{i=1}^n \alpha_i k(x_j, x_i) - y_j \Big)^2 + \lambda \sum_{i,j} \alpha_i \alpha_j k(x_i, x_j).$$
If we differentiate with respect to $\alpha_l$,

$$\frac{\partial J}{\partial \alpha_l} = 2 \sum_{j} k(x_l, x_j) \Big( \sum_i \alpha_i k(x_j, x_i) - y_j \Big) + 2 \lambda \sum_j k(x_l, x_j)\, \alpha_j,$$

and set it to zero for all $l$, we get

$$\sum_j k(x_l, x_j) \Big( \sum_i \alpha_i k(x_j, x_i) - y_j + \lambda \alpha_j \Big) = 0.$$
In terms of matrices this is

$$K \big( (K + \lambda I)\alpha - y \big) = 0,$$

which is solved by $(K + \lambda I)\alpha = y$ (the unique solution when $K$ is invertible), where $K_{ij} = k(x_i, x_j)$ is the kernel (Gram) matrix, $\alpha = (\alpha_1, \dots, \alpha_n)^T$, and $y = (y_1, \dots, y_n)^T$.