Abstract
An aspect of kernel classifiers that complicates variable selection is the implicit use of the transformation function Φ. This function maps the space in which the data cases reside, the so-called input space (\(\mathcal{X}\)), to a higher-dimensional feature space (\(\mathcal{F}\)). Variable selection in \(\mathcal{F}\) is a difficult problem, while variable selection in \(\mathcal{X}\) is mostly inadequate. We propose an intermediate kernel variable selection approach that is implemented in \(\mathcal{X}\) while also accounting for the fact that kernel classifiers operate in \(\mathcal{F}\).
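The chapter's actual selection criterion is not reproduced in this abstract. As a rough, non-authoritative illustration of the general idea of selecting variables in \(\mathcal{X}\) while judging their effect in \(\mathcal{F}\), the sketch below scores each input variable by the change in kernel-target alignment when that variable is removed, assuming a Gaussian (RBF) kernel. The functions rbf_kernel, alignment and rank_variables, and the gamma parameter, are illustrative choices made here, not part of the chapter.

import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def alignment(K, y):
    # Kernel-target alignment between K and the "ideal" kernel y y^T,
    # where y holds class labels in {-1, +1}.
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K, 'fro') * np.linalg.norm(Y, 'fro'))

def rank_variables(X, y, gamma=1.0):
    # Score each input variable by the drop in alignment when it is removed.
    # A large drop suggests the variable contributes to the feature-space
    # geometry induced by the kernel; indices are returned best-first.
    base = alignment(rbf_kernel(X, gamma), y)
    drops = []
    for j in range(X.shape[1]):
        X_minus_j = np.delete(X, j, axis=1)
        drops.append(base - alignment(rbf_kernel(X_minus_j, gamma), y))
    return np.argsort(drops)[::-1]

if __name__ == "__main__":
    # Toy data: only the first two of five variables carry signal.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=100))
    print(rank_variables(X, y, gamma=0.5))

The computation works entirely with the kernel matrix built from input-space variables, which is the sense in which the selection is carried out in \(\mathcal{X}\) while the criterion reflects behaviour in \(\mathcal{F}\).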