Kernels

Kernel Functions

A kernel is a function \kappa that for all \mathbf{t}, \mathbf{x} \in X satisfies \kappa(\mathbf{t},
\mathbf{x}) = \langle\Phi(\mathbf{t}),\Phi(\mathbf{x})\rangle, where \Phi is a mapping from X to an (inner product) feature space F, \Phi : \mathbf{t} \longmapsto
\Phi(\mathbf{t}) \in F.

The following functions take two 2d array-like objects, t of shape (M, P) and x of shape (N, P), and compute the (M, N) matrix \mathbf{K^t} with entries

\mathbf{K^t}_{ij} = \kappa(\mathbf{t}_i, \mathbf{x}_j).
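As a minimal sketch of this convention (plain NumPy, independent of mlpy): for the linear kernel, \Phi is the identity map, so \mathbf{K^t} reduces to the matrix product of t with the transpose of x.

```python
import numpy as np

# t has shape (M, P) = (2, 4); x has shape (N, P) = (3, 4)
t = np.array([[8, 1, 5, 1], [7, 1, 11, 4]])
x = np.array([[5, 1, 3, 1], [7, 1, 11, 4], [0, 4, 2, 9]])

# Linear kernel: kappa(t_i, x_j) = <t_i, x_j>, so K^t = t x^T, shape (M, N)
Kt = t @ x.T
```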

Kernel Classes

class mlpy.Kernel

Base class for kernels.

class mlpy.KernelLinear

Linear kernel, t_i' x_j.

class mlpy.KernelPolynomial(gamma=1.0, b=1.0, d=2.0)

Polynomial kernel, (gamma t_i' x_j + b)^d.

class mlpy.KernelGaussian(sigma=1.0)

Gaussian kernel, exp(-||t_i - x_j||^2 / (2 * sigma^2)).

class mlpy.KernelExponential(sigma=1.0)

Exponential kernel, exp(-||t_i - x_j|| / (2 * sigma^2)).

class mlpy.KernelSigmoid(gamma=1.0, b=1.0)

Sigmoid kernel, tanh(gamma t_i' x_j + b).

Functions

mlpy.kernel_linear(t, x)

Linear kernel, t_i' x_j.

mlpy.kernel_polynomial(t, x, gamma=1.0, b=1.0, d=2.0)

Polynomial kernel, (gamma t_i' x_j + b)^d.

mlpy.kernel_gaussian(t, x, sigma=1.0)

Gaussian kernel, exp(-||t_i - x_j||^2 / (2 * sigma^2)).

mlpy.kernel_exponential(t, x, sigma=1.0)

Exponential kernel, exp(-||t_i - x_j|| / (2 * sigma^2)).

mlpy.kernel_sigmoid(t, x, gamma=1.0, b=1.0)

Sigmoid kernel, tanh(gamma t_i' x_j + b).
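The formulas above can be reproduced with plain NumPy. The sketch below is not part of mlpy (the `*_np` function names are ours); it shows the Gaussian and polynomial kernels, and the remaining kernels follow the same pattern.

```python
import numpy as np

def kernel_gaussian_np(t, x, sigma=1.0):
    """Gaussian kernel, exp(-||t_i - x_j||^2 / (2 * sigma^2))."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    # (M, N) matrix of squared Euclidean distances via broadcasting
    d2 = ((t[:, None, :] - x[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_polynomial_np(t, x, gamma=1.0, b=1.0, d=2.0):
    """Polynomial kernel, (gamma * t_i' x_j + b)^d."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    return (gamma * t @ x.T + b) ** d
```

With the points x from the example below, `kernel_gaussian_np(x, x, sigma=10)` reproduces the kernel matrix K shown there up to floating-point rounding.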

Example:

>>> import mlpy
>>> x = [[5, 1, 3, 1], [7, 1, 11, 4], [0, 4, 2, 9]] # three training points
>>> K = mlpy.kernel_gaussian(x, x, sigma=10) # compute the kernel matrix K_ij = k(x_i, x_j)
>>> K
array([[ 1.        ,  0.68045064,  0.60957091],
       [ 0.68045064,  1.        ,  0.44043165],
       [ 0.60957091,  0.44043165,  1.        ]])
>>> t = [[8, 1, 5, 1], [7, 1, 11, 4]] # two test points
>>> Kt = mlpy.kernel_gaussian(t, x, sigma=10) # compute the test kernel matrix Kt_ij = <Phi(t_i), Phi(x_j)> = k(t_i, x_j)
>>> Kt
array([[ 0.93706746,  0.7945336 ,  0.48190899],
       [ 0.68045064,  1.        ,  0.44043165]])

Centering in Feature Space

The centered kernel matrix \mathbf{\tilde{K}^t} [Scolkopf98] is computed by:

\mathbf{\tilde{K}^t}_{ij} = \left\langle
\Phi(\mathbf{t}_i) -
\frac{1}{N} \sum_{m=1}^N{\Phi(\mathbf{x}_m)},
\Phi(\mathbf{x}_j) -
\frac{1}{N} \sum_{n=1}^N{\Phi(\mathbf{x}_n)}
\right\rangle.

We can express \mathbf{\tilde{K}^t} in terms of \mathbf{K^t} and \mathbf{K}:

\mathbf{\tilde{K}^t} = \mathbf{K^t} - \mathbf{1}'_N \mathbf{K}
- \mathbf{K^t} \mathbf{1}_N + \mathbf{1}'_N \mathbf{K} \mathbf{1}_N

where \mathbf{1}_N is the N \times N matrix and \mathbf{1}'_N the M \times N matrix with all entries equal to 1/N, and \mathbf{K} is the training kernel matrix, \mathbf{K}_{ij} =
\kappa(\mathbf{x}_i, \mathbf{x}_j).
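As a sketch of this computation in plain NumPy (the helper name `kernel_center_np` is ours, not an mlpy function): with Kt of shape (M, N) and K of shape (N, N), centering amounts to four matrix products.

```python
import numpy as np

def kernel_center_np(Kt, K):
    """Center the (M, N) test kernel matrix Kt against the (N, N)
    training kernel matrix K in feature space."""
    Kt = np.asarray(Kt, dtype=float)
    K = np.asarray(K, dtype=float)
    N = K.shape[0]
    one_N = np.full((N, N), 1.0 / N)             # N x N, all entries 1/N
    one_MN = np.full((Kt.shape[0], N), 1.0 / N)  # M x N, all entries 1/N
    return Kt - one_MN @ K - Kt @ one_N + one_MN @ K @ one_N
```

Passing Kt = K centers the training kernel matrix itself; every row and column of that result sums to zero, since centering subtracts the feature-space mean.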

mlpy.kernel_center(Kt, K)

Centers the test kernel matrix Kt with respect to the training kernel matrix K. If Kt = K (i.e., kernel_center(K, K), where K_ij = k(x_i, x_j)), the function centers the kernel matrix K itself.

Parameters :
Kt : 2d array_like object (M, N)

test kernel matrix Kt_ij = k(t_i, x_j). If Kt = K the function centers the kernel matrix K

K : 2d array_like object (N, N)

training kernel matrix K_ij = k(x_i, x_j)

Returns :
Ktcentered : 2d numpy array (M, N)

centered version of Kt

Example:

>>> Kcentered = mlpy.kernel_center(K, K) # center K
>>> Kcentered
array([[ 0.19119746, -0.07197215, -0.11922531],
       [-0.07197215,  0.30395696, -0.23198481],
       [-0.11922531, -0.23198481,  0.35121011]])
>>> Ktcentered = mlpy.kernel_center(Kt, K) # center the test kernel matrix Kt with respect to K
>>> Ktcentered
array([[ 0.15376875,  0.06761464, -0.22138339],
       [-0.07197215,  0.30395696, -0.23198481]])

Make a Custom Kernel

TODO

[Scolkopf98] Bernhard Schölkopf, Alexander Smola, and Klaus-Robert Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5):1299–1319, July 1998.
