Sparse representation in kernel machines.

Hongwei Sun, Qiang Wu

Abstract

We study the properties of least squares kernel regression with l1 coefficient regularization. The kernel can be flexibly chosen to be either positive definite or indefinite. Asymptotic learning rates are deduced under a smoothness condition on the kernel, and the sparsity of the solution is characterized theoretically. Empirical simulations and real-data applications indicate that good learning performance and a sparse representation can be achieved simultaneously.
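The scheme the abstract describes can be illustrated with a small sketch: given samples (x_i, y_i) and a kernel K, one seeks coefficients c minimizing the least squares fit of the kernel expansion f(x) = Σ_j c_j K(x, x_j) plus an l1 penalty on c, which drives many coefficients to exactly zero. The Gaussian kernel, the bandwidth, the regularization strength, and the ISTA solver below are illustrative choices, not the paper's exact construction.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=0.2):
    """Gram matrix K[i, j] = exp(-|X[i] - Z[j]|^2 / (2 sigma^2))."""
    d2 = (X[:, None] - Z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def l1_kernel_regression(K, y, lam=0.01, n_iter=2000):
    """Minimize (1/2n)||y - K c||^2 + lam * ||c||_1 via ISTA
    (proximal gradient with soft-thresholding)."""
    n = len(y)
    L = np.linalg.norm(K, 2) ** 2 / n   # Lipschitz constant of the gradient
    c = np.zeros(n)
    for _ in range(n_iter):
        grad = K.T @ (K @ c - y) / n
        z = c - grad / L
        # soft-thresholding: exact zeros give the sparse representation
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return c

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 60))
y = np.sin(2.0 * np.pi * x) + 0.05 * rng.standard_normal(60)

K = gaussian_kernel(x, x)
c = l1_kernel_regression(K, y, lam=0.01)
y_hat = K @ c
print("nonzero coefficients:", np.count_nonzero(c), "of", len(c))
print("training MSE:", np.mean((y - y_hat) ** 2))
```

Note that nothing in the solver requires K to be positive semidefinite, which mirrors the paper's point that the kernel may be indefinite: the l1 penalty on the coefficients, not the kernel's definiteness, is what yields the sparse expansion.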

