Non-Linear Methods for Regression

Kernel Ridge Regression

class mlpy.KernelRidge(lmb=1.0)

Kernel Ridge Regression (dual).

Initialization.

Parameters:
lmb : float (>= 0.0)
regularization parameter; larger values give a smoother, more heavily regularized fit

alpha()

Return the dual coefficients alpha (computed by learn()).

b()

Return the bias term b (computed by learn()).
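
For orientation, the textbook dual formulation of ridge regression relates lmb, alpha and b as follows. This is a background sketch, not a statement of mlpy's exact computation; in particular, treating b as an intercept estimated separately is an assumption:

    \alpha = (K + \lambda I_N)^{-1} (y - b\,\mathbf{1}), \qquad
    \hat{y}(x_*) = \sum_{i=1}^{N} \alpha_i \, k(x_*, x_i) + b

where \lambda is lmb, K is the N x N training kernel matrix, k(\cdot,\cdot) is the kernel function and \mathbf{1} is the all-ones vector. With b = 0 this reduces to the classical solution \alpha = (K + \lambda I_N)^{-1} y; larger \lambda shrinks alpha and gives a smoother fit.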

learn(K, y)

Compute the regression coefficients.

Parameters:
K : 2d array_like object (N, N)
precomputed training kernel matrix
y : 1d array_like object (N)
target values

pred(Kt)

Compute the predicted response.

Parameters:
Kt : 1d or 2d array_like object ([M], N)
test kernel matrix: precomputed inner products (in feature space) between the M test points and the N training points

Returns:
p : float or 1d numpy ndarray
predicted response
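
A minimal NumPy sketch of the textbook computation illustrates these shapes on toy data (an illustration under the assumptions noted above, in particular the handling of the intercept b; it is not mlpy's implementation):

>>> import numpy as np
>>> K = np.array([[1.0, 0.5], [0.5, 1.0]])                # toy 2x2 training kernel matrix
>>> y = np.array([0.0, 1.0])                              # toy target values
>>> lmb = 1.0
>>> b = y.mean()                                          # assumed intercept
>>> alpha = np.linalg.solve(K + lmb * np.eye(2), y - b)   # dual coefficients
>>> Kt = np.array([0.8, 0.9])                             # 1d test kernel row: one test point vs. 2 training points
>>> p = float(np.dot(Kt, alpha) + b)                      # predicted response (a float for 1d Kt)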

Example:

>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> import mlpy
>>> np.random.seed(0)
>>> x = np.arange(0, 2, 0.05).reshape(-1, 1) # training points
>>> y = np.ravel(np.exp(x)) + np.random.normal(1, 0.2, x.shape[0]) # target values
>>> xt = np.arange(0, 2, 0.01).reshape(-1, 1) # testing points
>>> K = mlpy.kernel_gaussian(x, x, sigma=1) # training kernel matrix
>>> Kt = mlpy.kernel_gaussian(xt, x, sigma=1) # testing kernel matrix
>>> krr = mlpy.KernelRidge(lmb=0.01)
>>> krr.learn(K, y)
>>> yt = krr.pred(Kt)
>>> fig = plt.figure(1)
>>> plot1 = plt.plot(x[:, 0], y, 'o')
>>> plot2 = plt.plot(xt[:, 0], yt)
>>> plt.show()
[Figure: kernel_ridge.png, training points (circles) and the fitted regression curve]
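
As a follow-up, and assuming that pred() evaluates Kt·alpha() + b() (an assumption, since this page does not spell out the formula), the prediction can be reconstructed from the accessors and the fit compared against the underlying curve:

>>> p_manual = np.dot(Kt, krr.alpha()) + krr.b()          # assumed form of the prediction
>>> ok = np.allclose(p_manual, yt)                        # expected True only if the assumption holds
>>> mse = np.mean((yt - (np.exp(xt).ravel() + 1)) ** 2)   # deviation from the mean curve exp(x) + 1 (the noise has mean 1)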

Support Vector Regression

See Support Vector Machines (SVMs)
