Conditional mean embeddings and optimal feature selection via positive definite kernels

05/14/2023
by Palle E. T. Jorgensen, et al.

Motivated by applications, we consider here new operator-theoretic approaches to conditional mean embeddings (CMEs). Our present results combine a spectral-analysis-based optimization scheme with the use of kernels, stochastic processes, and constructive learning algorithms. For initially given non-linear data, we consider optimization-based feature selections. This entails the use of convex sets of positive definite (p.d.) kernels in a construction of optimal feature selection via regression algorithms from learning models. Thus, with initial inputs of training data (for a suitable learning algorithm), each choice of p.d. kernel K in turn yields a variety of Hilbert spaces and realizations of features. A novel idea here is that we allow an optimization over selected sets of kernels K drawn from a convex set C of positive definite kernels. Hence our “optimal” choices of feature representations will depend on a secondary optimization over p.d. kernels K within the specified convex set C.
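The following is a minimal numerical sketch, not the paper's algorithm, of the two ideas in the abstract: an empirical conditional mean embedding estimate of the standard kernel-ridge form, and a secondary optimization over a convex set C of p.d. kernels, here taken (as an illustrative assumption) to be convex combinations w*K_gauss + (1-w)*K_laplace with the weight w chosen by held-out prediction error. The function names, kernels, and grid search over w are all illustrative choices, not from the source.

```python
# Sketch: empirical CME regression plus a secondary optimization over a
# convex family of p.d. kernels (assumed form, for illustration only).
import numpy as np

def gauss_kernel(A, B, s=1.0):
    # Gaussian (RBF) kernel between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s ** 2))

def laplace_kernel(A, B, s=1.0):
    # Laplace kernel between rows of A and B.
    d1 = np.abs(A[:, None, :] - B[None, :, :]).sum(-1)
    return np.exp(-d1 / s)

def cme_predict(K_train, K_cross, Y_train, lam=1e-2):
    # Empirical conditional-mean estimate: (K + n*lam*I)^{-1} applied to Y,
    # then evaluated at test points through the cross-kernel matrix.
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + n * lam * np.eye(n), Y_train)
    return K_cross.T @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 1))
Y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)
Xtr, Ytr, Xval, Yval = X[:150], Y[:150], X[150:], Y[150:]

# Secondary optimization over the convex set C = {w*K1 + (1-w)*K2 : w in [0,1]}.
# A convex combination of p.d. kernels is again p.d., so every candidate is valid.
best = None
for w in np.linspace(0.0, 1.0, 11):
    K_tr = w * gauss_kernel(Xtr, Xtr) + (1 - w) * laplace_kernel(Xtr, Xtr)
    K_cx = w * gauss_kernel(Xtr, Xval) + (1 - w) * laplace_kernel(Xtr, Xval)
    err = np.mean((cme_predict(K_tr, K_cx, Ytr) - Yval) ** 2)
    if best is None or err < best[0]:
        best = (err, w)

print("selected kernel weight w =", best[1], "validation MSE =", best[0])
```

Here the grid over w stands in for whatever optimization over C the paper develops; the point is only that the feature representation (and hence the fitted regressor) changes with the chosen kernel, and a second, outer criterion selects among the kernels in C.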
