Performance and Efficiency: Recent Advances in Supervised Learning

Copyright © 1999 by IEEE. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee, provided that copies are not made or distributed for profit. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee.

This paper reviews recent advances in supervised learning, focusing on two of the most important issues: performance and efficiency. Performance addresses the generalization capability of a learning machine on randomly chosen samples that are not included in a training set. Efficiency deals with the complexity of a learning machine in both space and time. Because these two issues are common to a wide range of learning machines and learning approaches, we focus on a particular type of adaptive learning system with a neural architecture. We discuss four types of learning approaches: training an individual model, combinations of several well-trained models, combinations of many weak models, and evolutionary computation of models. We explore the advantages and weaknesses of each approach and their interrelations, and we pose open questions for possible future research.
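As an illustrative aside (not taken from the paper itself), the "combination of many weak models" approach mentioned above can be sketched as weighted majority voting over weak learners. The sketch below assumes scikit-learn is available; the synthetic dataset, the choice of depth-1 decision trees ("stumps") as weak models, and the accuracy-based weighting heuristic are all assumptions made for illustration only, not the authors' method.

    # Illustrative sketch: combining many weak models by weighted voting.
    # Dataset, weak learner, and weighting scheme are chosen for illustration.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    n_weak = 25
    rng = np.random.default_rng(0)
    models, weights = [], []
    for _ in range(n_weak):
        # Each weak model is trained on a bootstrap sample of the training set.
        idx = rng.integers(0, len(X_train), size=len(X_train))
        stump = DecisionTreeClassifier(max_depth=1).fit(X_train[idx], y_train[idx])
        # Weight each weak model by its training accuracy (a crude heuristic).
        weights.append(stump.score(X_train, y_train))
        models.append(stump)

    # Weighted vote of all weak models on the held-out test set.
    votes = np.zeros((len(X_test), 2))
    for w, m in zip(weights, models):
        votes[np.arange(len(X_test)), m.predict(X_test)] += w
    combined_acc = np.mean(votes.argmax(axis=1) == y_test)
    print(f"weighted-vote ensemble accuracy: {combined_acc:.3f}")

A single stump typically performs only slightly better than chance on such data, while the weighted vote of many stumps usually performs noticeably better, which is the basic intuition behind weak-model combination.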

By: Sheng Ma, Chuanyi Ji

Published in: Proceedings of the IEEE, volume 87, number 9, pages 1519-1535, 1999

Please obtain a copy of this paper from your local library. IBM cannot distribute this paper externally.

Questions about this service can be mailed to reports@us.ibm.com.