Divergence

  • compares two probability distributions, such as a true distribution and an approximate distribution

In statistics and information geometry, a divergence or a contrast function is a function which establishes the "distance" of one probability distribution from another on a statistical manifold. A divergence is a weaker notion than a distance: in particular, a divergence need not be symmetric (that is, in general the divergence from p to q is not equal to the divergence from q to p), and it need not satisfy the triangle inequality.

Two commonly used divergence scores from information theory are the Kullback–Leibler divergence and the Jensen–Shannon divergence.
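Both scores can be computed directly for discrete distributions. A minimal sketch in plain Python (the example distributions p and q are made up for illustration):

```python
import math

def kl_divergence(p, q):
    """KL(P || Q) for discrete distributions given as lists of probabilities.

    Uses log base 2, so the result is in bits. Assumes q[i] > 0
    wherever p[i] > 0; otherwise the divergence is infinite.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """JS(P || Q): a symmetrised, smoothed version of KL divergence.

    Defined via the mixture M = (P + Q) / 2. With log base 2 the
    value lies in [0, 1].
    """
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# A "true" distribution p and an approximation q over three events.
p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]

print(kl_divergence(p, q))  # differs from kl_divergence(q, p): KL is asymmetric
print(js_divergence(p, q))  # equals js_divergence(q, p): JS is symmetric
```

Running the two calls in each direction makes the asymmetry of KL, and the symmetry of JS, concrete.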


posted @ 2020-08-22 19:25  keeps_you_warm