Kernel Ridge Regression with Kernel Laplacian Regularization

Vivien Cabannes

Laplacian regularization is a popular smoothing technique in machine learning. It is particularly useful when ambiguity in the data calls for a criterion to disambiguate between candidate functions explaining the data, be it in spectral clustering or semi-supervised learning. While Laplacian regularization is usually approached through diffusion on neighborhood graphs, we present an approach through kernel methods, based on derivative evaluation maps. We derive an analytical solution of the empirical risk minimization problem with kernel Laplacian regularization. We prove strong consistency of our estimate as the number of data points goes to infinity. Moreover, we show that, under regularity assumptions, our kernel method bypasses the curse of dimensionality, hence providing a strong alternative to neighborhood graph methods, which do not avoid it.
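
To make the analytical solution concrete, here is a minimal numerical sketch, under assumptions not taken from the paper: it uses a Gaussian kernel and restricts the estimator to the span of kernel functions at the data points, which is a simplification of the derivative evaluation maps the paper works with. The objective minimized is (1/n) Σᵢ (f(xᵢ) − yᵢ)² + λ‖f‖²_H + (μ/n) Σᵢ ‖∇f(xᵢ)‖², whose first-order condition in the restricted span yields a linear system in the coefficients. All function names (`gaussian_kernel`, `fit_laplacian_krr`) and the small numerical jitter are illustrative.

```python
import numpy as np

def gaussian_kernel(X, Xp, sigma):
    # k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Xp[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def kernel_gradient(X, Xp, sigma):
    # Gradient of k(x_i, x_j') with respect to the first argument:
    # grad_x k(x, x') = -(x - x') / sigma^2 * k(x, x').
    K = gaussian_kernel(X, Xp, sigma)          # (n, m)
    diff = X[:, None, :] - Xp[None, :, :]      # (n, m, d)
    return -diff / sigma**2 * K[:, :, None]    # (n, m, d)

def fit_laplacian_krr(X, y, lam, mu, sigma):
    """Minimize (1/n)||K a - y||^2 + lam a'Ka + (mu/n)||Z a||^2
    over coefficients a, where f = sum_j a_j k(x_j, .) and the
    rows of Z stack the gradients grad f(x_i) coordinate-wise."""
    n = len(X)
    K = gaussian_kernel(X, X, sigma)           # (n, n)
    G = kernel_gradient(X, X, sigma)           # (n, n, d)
    Z = G.transpose(0, 2, 1).reshape(-1, n)    # (n*d, n)
    # First-order condition: (K^2/n + lam K + mu Z'Z / n) a = K y / n.
    A = K @ K / n + lam * K + mu / n * (Z.T @ Z)
    alpha = np.linalg.solve(A + 1e-10 * np.eye(n), K @ y / n)
    return alpha

def predict(alpha, X_train, X_test, sigma):
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (50, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=50)
alpha = fit_laplacian_krr(X, y, lam=1e-3, mu=1e-2, sigma=0.5)
y_hat = predict(alpha, X, X, sigma=0.5)
```

Setting μ = 0 recovers plain kernel ridge regression; the extra term Z'Z penalizes the empirical norm of the gradient at the samples, which is the kernelized analogue of the graph Laplacian penalty.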
