A Simple Method for Cost-Sensitive Learning

A folk theorem implies a simple reduction that allows anyone to turn an arbitrary cost-insensitive classification algorithm into a cost-sensitive one. The reduction works via a particular reweighting of the examples, which can be realized either by feeding the weights directly to the classification algorithm (as is often done in boosting) or by resampling. Naive resampling methods often perform drastically poorly because of a strong tendency to overfit. We analyze this problem and introduce a novel sampling method, which we call "wagging", that provably avoids the problem and yields superior performance.
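To make the reduction concrete, the Python sketch below shows one plausible instantiation of reweighting-by-resampling: cost-proportionate rejection sampling, where each example with cost c is kept with probability c/Z (Z an upper bound on the costs), a stock cost-insensitive learner is trained on each subsample, and predictions are aggregated by voting. This is an illustrative sketch of the general idea, not the paper's own code; all function names are hypothetical, and the choice of a decision tree as the base learner is an assumption.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def rejection_sample(X, y, costs, rng):
    """Keep each example with probability cost / Z, where Z bounds the costs.

    High-cost examples survive more often, so a cost-insensitive learner
    trained on the subsample behaves cost-sensitively. `costs` is assumed
    to be a NumPy array of per-example misclassification costs.
    """
    Z = costs.max()  # any upper bound on the costs works here
    keep = rng.random(len(costs)) < costs / Z
    return X[keep], y[keep]

def train_ensemble(X, y, costs, n_models=10, seed=0):
    """Train one base classifier per independently resampled subset."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_models):
        Xs, ys = rejection_sample(X, y, costs, rng)
        if len(ys) == 0:  # a real implementation would handle this case more carefully
            continue
        models.append(DecisionTreeClassifier().fit(Xs, ys))
    return models

def predict_vote(models, X):
    """Aggregate the ensemble by majority vote (binary 0/1 labels assumed)."""
    preds = np.stack([m.predict(X) for m in models])
    return (preds.mean(axis=0) >= 0.5).astype(int)
```

For example, with `X` an array of features, `y` binary 0/1 labels, and `costs` the per-example costs, `predict_vote(train_ensemble(X, y, costs), X_test)` produces cost-sensitive predictions from an off-the-shelf base learner. Aggregating over several independent subsamples is what counters the overfitting that a single naive resampling run tends to exhibit.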

By: Bianca Zadrozny, John Langford, Naoki Abe

Published in: Proceedings of the Third IEEE International Conference on Data Mining (ICDM 2003), Los Alamitos, CA: IEEE Computer Society, 2003, pp. 435-442.

