NIPS 97 Workshop on Support Vector Machines

Title: Reproducing Kernel Hilbert Spaces, Smoothing Spline ANOVA Spaces, Support Vector Machines, and all that.

Talk by Grace Wahba at the NIPS Support Vector Learning Machines Workshop, December 6, 1997, at Breckenridge CO. Think snow.


In this (mostly) review talk, we look at reproducing kernel Hilbert spaces (rkhs) as a natural home for studying certain support vector machines and their relations with other multivariate function estimation paradigms. We examine Lemma 6.1 of Kimeldorf and Wahba (J. Math. Anal. Applic. 1971, p 92) concerning linear inequality constraints in rkhs, and note how it applies in an arbitrary rkhs. We will particularly note the potential use of the reproducing kernels (rk's) associated with smoothing spline ANOVA spaces, for applications with heterogeneous predictor variables, and possibly as a tool in variable selection as well as exemplar selection. Questions about the possible internal `tuning' of SVM's (as compared to the use of test sets) and the `degrees of freedom for signal' will be raised, although probably not answered.
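To make the RKHS setting concrete, here is a minimal numerical sketch of a regularized fit using an ANOVA-style kernel built from per-coordinate kernels. The particular choice of one-dimensional Gaussian kernels, the gamma value, and the truncation at second-order interactions are illustrative assumptions, not the construction from the talk; the point is only that main effects and interactions each contribute their own rk, and that the penalized least squares estimate has the familiar finite-dimensional form.

```python
import numpy as np

def rbf_1d(u, v, gamma=1.0):
    # One-dimensional Gaussian kernel for a single coordinate
    # (an illustrative choice of per-coordinate rk).
    return np.exp(-gamma * (u - v) ** 2)

def ssanova_gram(X, Y, gamma=1.0):
    """ANOVA-style kernel truncated at second order:
    K = sum_i k_i + sum_{i<j} k_i * k_j,
    where k_i acts on coordinate i only. Sums and products of
    positive definite kernels are again positive definite."""
    d = X.shape[1]
    # Per-coordinate Gram blocks, shape (d, n, m).
    Ks = np.stack([rbf_1d(X[:, i][:, None], Y[:, i][None, :], gamma)
                   for i in range(d)])
    main = Ks.sum(axis=0)            # main-effect terms
    pair = np.zeros_like(main)
    for i in range(d):
        for j in range(i + 1, d):
            pair += Ks[i] * Ks[j]    # two-factor interaction terms
    return main + pair

# Penalized least squares in the RKHS:
#   minimize sum_k (y_k - f(x_k))^2 + lam * ||f||^2_H.
# The minimizer is f(x) = sum_k c_k K(x, x_k) with
#   (K + lam I) c = y  (representer-theorem form).
rng = np.random.default_rng(0)
X = rng.uniform(size=(30, 3))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1]
K = ssanova_gram(X, X, gamma=5.0)
lam = 1e-4
c = np.linalg.solve(K + lam * np.eye(len(y)), y)
yhat = K @ c   # fitted values at the data points
```

The same Gram-matrix construction could be handed to an SVM solver in place of the least squares fit; only the loss changes, not the kernel machinery.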

Key words: support vector machines, reproducing kernel Hilbert spaces, penalized likelihood, smoothing spline ANOVA, radial basis functions, Gaussian processes and Bayes estimates, sigmoidal basis functions, greedy algorithms, regularization, generalized cross validation, generalized approximate cross validation
Click here for G. Wahba, Support Vector Machines, Reproducing Kernel Hilbert Spaces and the Randomized GACV, University of Wisconsin-Madison Statistics Department TR 984rr, a revised basis for part of this SVM Workshop talk.

Click here for recent Technical Reports by Grace Wahba, students, and colleagues.

The home page for the Support Vector Learning Machines Workshop is here.

Here is an announcement for `Advances in Kernel Methods: Support Vector Learning', a book based on the workshop talks.

Click here for Grace Wahba's publications.