DOI: 10.1145/1014052.1014056
Article

An iterative method for multi-class cost-sensitive learning

Published: 22 August 2004

Abstract

Cost-sensitive learning addresses the problem of classification in the presence of varying costs associated with different types of misclassification. In this paper, we present a method for solving multi-class cost-sensitive learning problems using any binary classification algorithm. The algorithm is derived from three key ideas: (1) iterative weighting; (2) expanding the data space; and (3) gradient boosting with stochastic ensembles. We establish theoretical guarantees on the performance of this method. In particular, we show that a certain variant possesses the boosting property, given a form of weak learning assumption on the component binary classifier. We also empirically evaluate the proposed method on benchmark data sets and verify that it generally achieves better results than representative methods for cost-sensitive learning, in terms of predictive performance (cost minimization) and, in many cases, computational efficiency.
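The data-space expansion idea mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the function name and the particular weighting scheme below are illustrative assumptions. The idea it shows is the reduction itself: each multi-class example with a per-label cost vector becomes one weighted binary example per label, after which any weighted binary learner can be applied.

```python
def expand_data_space(X, cost_matrix):
    """Reduce multi-class cost-sensitive examples to weighted binary ones.

    Illustrative sketch only: for each example x with per-label costs c,
    emit one binary example ((x, y), label, weight) per label y, where
    label = 1 iff y attains the minimum cost, and the weight reflects how
    far c[y] sits from the average cost (a stand-in for the paper's
    iterative weighting scheme).
    """
    expanded = []
    for x, costs in zip(X, cost_matrix):
        best = min(costs)
        avg = sum(costs) / len(costs)
        for y, c in enumerate(costs):
            label = 1 if c == best else 0   # is y the cheapest prediction?
            weight = abs(avg - c)           # importance of getting y right
            expanded.append(((x, y), label, weight))
    return expanded

# One example with three classes: predicting class 0 is cost-free,
# class 2 is the costliest mistake, so its pair carries the largest weight.
pairs = expand_data_space([(1.0,)], [[0.0, 1.0, 3.0]])
```

Any binary learner that accepts example weights (e.g. via a `sample_weight`-style argument) can then be trained on the expanded set; the method in the paper additionally re-derives the weights iteratively via gradient boosting.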



    Published In

KDD '04: Proceedings of the Tenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
August 2004, 874 pages
ISBN: 1581138881
DOI: 10.1145/1014052


Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. boosting
    2. cost-sensitive learning
    3. multi-class classification

Conference

KDD '04
Overall Acceptance Rate: 1,133 of 8,635 submissions, 13%
