Original article: Learning from Imbalanced Classes

Class imbalance is a classic problem that comes up frequently in data mining, computational advertising, NLP, and similar work. The article summarizes approaches that may be effective and is worth consulting:

  • Do nothing. Sometimes you get lucky and nothing needs to be done. You can train on the so-called natural (or stratified) distribution and sometimes it works without need for modification.
  • Balance the training set in some way (see the resampling sketch after this list):
    • Oversample the minority class.
    • Undersample the majority class.
    • Synthesize new minority classes.
  • Throw away minority examples and switch to an anomaly detection framework (see the anomaly-detection sketch after this list).
  • At the algorithm level, or after it (see the class-weight and threshold sketch after this list):
    • Adjust the class weight (misclassification costs).
    • Adjust the decision threshold.
    • Modify an existing algorithm to be more sensitive to rare classes.
  • Construct an entirely new algorithm to perform well on imbalanced data.
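
Resampling sketch (for the "Balance the training set" item): a minimal illustration of naive over- and undersampling, assuming scikit-learn is available; the `X` and `y` toy data below are hypothetical, not from the article.

```python
# Naive over- and undersampling with scikit-learn's resample utility.
import numpy as np
from sklearn.utils import resample

rng = np.random.RandomState(0)
X = rng.randn(1000, 5)
y = (rng.rand(1000) < 0.05).astype(int)   # ~5% minority class (toy data)

X_min, y_min = X[y == 1], y[y == 1]
X_maj, y_maj = X[y == 0], y[y == 0]

# Oversample the minority class: sample with replacement up to the majority size.
X_min_up, y_min_up = resample(X_min, y_min, replace=True,
                              n_samples=len(y_maj), random_state=0)
X_over = np.vstack([X_maj, X_min_up])
y_over = np.concatenate([y_maj, y_min_up])

# Undersample the majority class: sample without replacement down to the minority size.
X_maj_down, y_maj_down = resample(X_maj, y_maj, replace=False,
                                  n_samples=len(y_min), random_state=0)
X_under = np.vstack([X_maj_down, X_min])
y_under = np.concatenate([y_maj_down, y_min])

print(np.bincount(y_over), np.bincount(y_under))
```

Synthesizing new minority examples (e.g. SMOTE from the imbalanced-learn package) follows the same balancing idea but interpolates between minority-class neighbors instead of duplicating existing rows.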
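
Anomaly-detection sketch: one way to reframe the problem is to fit a detector on majority-class data only and treat outliers as the rare class. A minimal sketch assuming scikit-learn's IsolationForest; the toy arrays are illustrative assumptions.

```python
# Fit an anomaly detector on "normal" (majority) data and flag outliers.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(0)
X_majority = rng.randn(1000, 4)                                # normal examples only
X_new = np.vstack([rng.randn(5, 4), rng.randn(5, 4) + 4.0])    # new data, some shifted far away

detector = IsolationForest(random_state=0).fit(X_majority)
labels = detector.predict(X_new)                               # +1 = inlier, -1 = anomaly
print(labels)
```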
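
Class-weight and threshold sketch (for the algorithm-level items): a minimal sketch assuming scikit-learn, where `class_weight="balanced"` rescales misclassification costs inversely to class frequency, and scoring with `predict_proba` lets you move the decision threshold away from the default 0.5. The toy data and the 0.3 threshold are illustrative assumptions, not values from the article.

```python
# Cost-sensitive training plus decision-threshold tuning.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
X = rng.randn(2000, 5)
y = (X[:, 0] + 0.5 * rng.randn(2000) > 1.8).astype(int)   # rare positive class (toy data)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Adjust class weights: errors on the rare class are penalized more heavily.
clf = LogisticRegression(class_weight="balanced").fit(X_tr, y_tr)

# Adjust the decision threshold: use predicted probabilities instead of the
# default 0.5 cut-off, trading precision for recall on the rare class.
proba = clf.predict_proba(X_te)[:, 1]
threshold = 0.3                       # illustrative; tune on a validation set
y_pred = (proba >= threshold).astype(int)
print(y_pred.sum(), y_te.sum())
```

Threshold tuning is a post-hoc adjustment and does not require retraining; class weights change the training objective itself.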
posted on 2016-08-30 12:52 by 我爱机器学习