
Information and Media Technologies
Online ISSN : 1881-0896
ISSN-L : 1881-0896
Computing
Numerosity Reduction for Resource Constrained Learning
Khamisi Kalegele, Hideyuki Takahashi, Johan Sveholm, Kazuto Sasai, Gen Kitagata, Tetsuo Kinoshita (Research Institute of Electrical Communication, Tohoku University)
JOURNAL FREE ACCESS

2013 Volume 8 Issue 2 Pages 360-372

Abstract

When coupling data mining (DM) and learning agents, one of the crucial challenges is the need for the Knowledge Extraction (KE) process to be lightweight enough that even resource-constrained agents (e.g., limited memory or CPU) are able to extract knowledge. We propose the Stratified Ordered Selection (SOS) method for achieving lightweight KE through dynamic numerosity reduction of training examples. SOS allows agents to retrieve different-sized training subsets based on available resources. The method employs ranking-based subset selection using a novel Level Order (LO) ranking scheme. We show the representativeness of subsets selected using the proposed method, its noise tolerance, and its ability to preserve KE performance across different reduction levels. Compared with subset selection methods of the same category, the proposed method offers the best trade-off among cost, reduction, and the ability to preserve performance.
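To make the idea of ranking-based, stratified subset selection concrete, the sketch below shows one plausible realization in Python. It is an illustrative assumption, not the paper's actual LO ranking: the names `select_subset`, `rank_fn`, and `budget` are hypothetical placeholders, with `rank_fn` standing in for whatever per-stratum ordering a scheme such as LO would define.

```python
# A minimal, hypothetical sketch of ranking-based, stratified subset selection.
# The actual Level Order (LO) ranking used by SOS is defined in the paper;
# `rank_fn` here is only a placeholder for some per-stratum ranking.
from collections import defaultdict
from itertools import zip_longest

def select_subset(examples, labels, budget, rank_fn):
    """Return up to `budget` training examples, drawn in rank order
    from each class stratum in a round-robin fashion."""
    # Group example indices by class label (the strata).
    strata = defaultdict(list)
    for idx, y in enumerate(labels):
        strata[y].append(idx)

    # Order each stratum by the supplied ranking function (best first).
    ordered = [sorted(idxs, key=lambda i: rank_fn(examples[i]))
               for idxs in strata.values()]

    # Interleave the strata so every class stays represented as the
    # subset shrinks, then cut off at the resource budget.
    selected = []
    for round_ in zip_longest(*ordered):
        for idx in round_:
            if idx is not None:
                selected.append(idx)
            if len(selected) >= budget:
                return [examples[i] for i in selected]
    return [examples[i] for i in selected]
```

Because the subset is produced by truncating a fixed ordering, an agent with more resources simply asks for a larger `budget` and receives a superset of what a more constrained agent would select, which matches the abstract's notion of retrieving different-sized training subsets on demand.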

© 2013 Information Processing Society of Japan