Abstract:
In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also called information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions. Read more
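Since the abstract stops at the definition, a minimal numerical sketch may help. For discrete distributions P and Q given as probability vectors p and q, D_KL(P || Q) = sum_i p_i * log(p_i / q_i); the sketch below assumes NumPy/SciPy and uses scipy.special.rel_entr for the elementwise terms:

import numpy as np
from scipy.special import rel_entr

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_i p_i * log(p_i / q_i) for discrete P, Q.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return rel_entr(p, q).sum()

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
# The two directions differ, illustrating that the measure is non-symmetric.
print(kl_divergence(p, q))
print(kl_divergence(q, p))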
Abstract:
Understanding the Dirichlet distribution in depth. [Figure: probability densities of the Dirichlet distribution for K=3 with various parameter vectors α; clockwise from top left: α=(6,2,2), (3,7,5), (6,2,6), (2,3,4).] In probability and statistics, the Dirichlet distribution is a family of continuous multivariate probability distributions parameterized by a vector α of positive reals. Read more
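A minimal sketch of what the figure illustrates, assuming NumPy: samples from Dir(α) lie on the probability simplex (nonnegative components summing to 1), and the mean of Dir(α) is α / sum(α), so larger components of α pull the density toward the corresponding corner. The parameter vectors below are the ones quoted in the figure caption.

import numpy as np

rng = np.random.default_rng(0)
for alpha in [(6, 2, 2), (3, 7, 5), (6, 2, 6), (2, 3, 4)]:
    # Each row is one point on the 2-simplex: nonnegative, sums to 1.
    samples = rng.dirichlet(alpha, size=10_000)
    # Empirical mean should approximate the theoretical mean alpha / sum(alpha).
    print(alpha, samples.mean(axis=0), np.asarray(alpha) / sum(alpha))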