
A Compositional Kernel Model for Feature Learning

Abstract

We study a compositional variant of kernel ridge regression in which the predictor is applied to a coordinate-wise reweighting of the inputs. Formulated as a variational problem, this model provides a simple testbed for feature learning in compositional architectures. From the perspective of variable selection, we show how relevant variables are recovered while noise variables are eliminated. We establish guarantees showing that both global minimizers and stationary points discard noise coordinates when the noise variables are Gaussian-distributed. A central finding is that $\ell_1$-type kernels, such as the Laplace kernel, succeed in recovering features contributing to nonlinear effects at stationary points, whereas Gaussian kernels recover only linear ones.

Key Contributions

This paper introduces a compositional variant of kernel ridge regression as a testbed for feature learning in compositional architectures. It formulates the model as a variational problem and proves guarantees that relevant variables are recovered while noise variables are eliminated; in particular, it shows that $\ell_1$-type kernels (such as the Laplace kernel) can recover features contributing to nonlinear effects, whereas Gaussian kernels recover only linear ones.
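The compositional model described above can be illustrated with a small sketch: kernel ridge regression is applied to inputs rescaled coordinate-wise by a weight vector $w$, and setting the weights of noise coordinates to zero amounts to variable selection. This is a minimal illustration, not the paper's implementation; the kernel bandwidths, the regularization constant, the toy target function, and the hand-picked weight vectors are all assumptions made for the example.

```python
import numpy as np

def laplace_kernel(X, Y, gamma=1.0):
    # l1-type kernel: K(x, y) = exp(-gamma * ||x - y||_1)
    d = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=-1)
    return np.exp(-gamma * d)

def krr_fit_predict(kernel, w, X_tr, y_tr, X_te, lam=1e-3):
    # Compositional predictor: kernel ridge regression on the
    # coordinate-wise reweighted inputs w * x.
    Xw_tr, Xw_te = X_tr * w, X_te * w
    K = kernel(Xw_tr, Xw_tr)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), y_tr)
    return kernel(Xw_te, Xw_tr) @ alpha

# Toy data (assumed for illustration): y depends nonlinearly on the first
# coordinate only; the remaining three coordinates are Gaussian noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = np.sin(2 * X[:, 0])
X_te = rng.normal(size=(50, 4))
y_te = np.sin(2 * X_te[:, 0])

# A reweighting that zeroes the noise coordinates (as the paper's guarantees
# suggest stationary points do) versus a uniform weighting over all coordinates.
w_selected = np.array([1.0, 0.0, 0.0, 0.0])
w_uniform = np.ones(4)
err_sel = np.mean((krr_fit_predict(laplace_kernel, w_selected, X, y, X_te) - y_te) ** 2)
err_all = np.mean((krr_fit_predict(laplace_kernel, w_uniform, X, y, X_te) - y_te) ** 2)
print("test MSE, noise coordinates discarded:", err_sel)
print("test MSE, uniform weights:", err_all)
```

On this toy problem the predictor that discards the noise coordinates achieves a much lower test error, since the effective regression problem becomes one-dimensional.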

Business Value

Enables the development of more interpretable and robust machine learning models by automatically identifying relevant features and discarding noise variables, which can improve predictive performance and reduce model complexity in data-driven applications.