Abstract
Recent studies have shown that the random subspace method can be used to create multiple independent tree classifiers that can be combined to improve accuracy. We apply the procedure to k-nearest-neighbor classifiers and show that it can achieve similar results. We examine the effects of several parameters of the method through experiments on data from a digit recognition problem. We show that the combined accuracy tends to increase with the number of component classifiers, and that with an appropriate subspace dimensionality, the method can be superior to simple k-nearest-neighbor classification. The method's superiority is maintained when fewer training prototypes are available, i.e., when conventional k-NN classifiers suffer most heavily from the curse of dimensionality.
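The procedure the abstract describes can be illustrated in code. The following is a minimal sketch, not the authors' implementation: each component classifier is a k-NN classifier restricted to a randomly chosen subset of the feature dimensions, and the component decisions are combined by majority vote. The parameter names (`n_components`, `subspace_dim`, `k`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_subspace_knn(X_train, y_train, X_test,
                        n_components=20, subspace_dim=4, k=3):
    """Sketch of the random subspace method applied to k-NN.

    Each of the n_components classifiers sees only subspace_dim
    randomly selected feature dimensions; their per-sample class
    votes are combined by simple majority.
    """
    n_features = X_train.shape[1]
    votes = np.zeros((X_test.shape[0], n_components), dtype=int)
    for c in range(n_components):
        # Select a random subspace of the feature dimensions.
        dims = rng.choice(n_features, size=subspace_dim, replace=False)
        # Euclidean distances computed only in the projected subspace.
        d = np.linalg.norm(X_test[:, None, dims] - X_train[None, :, dims],
                           axis=2)
        nearest = np.argsort(d, axis=1)[:, :k]
        for i, idx in enumerate(nearest):
            labels, counts = np.unique(y_train[idx], return_counts=True)
            votes[i, c] = labels[np.argmax(counts)]
    # Combine the component classifiers' decisions by majority vote.
    return np.array([np.bincount(row).argmax() for row in votes])
```

Because each component works in a low-dimensional projection, the ensemble is less affected by sparse training data in the full feature space, which is the regime where, as the abstract notes, a conventional k-NN classifier suffers most.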
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Ho, T.K. (1998). Nearest neighbors in random subspaces. In: Amin, A., Dori, D., Pudil, P., Freeman, H. (eds) Advances in Pattern Recognition. SSPR /SPR 1998. Lecture Notes in Computer Science, vol 1451. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0033288
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-64858-1
Online ISBN: 978-3-540-68526-5