The Difference Between Probability and Likelihood
Bayes for Beginners: Probability and Likelihood is well worth a careful read; it is very useful.
For the longest time I could not understand the difference between probability and likelihood, or why the two become equal when you swap what is conditioned on.
Definitions:
Probability is the chance of an outcome given fixed parameters. It must lie between 0 and 1, and the probabilities of mutually exclusive outcomes sum to 1. The familiar probability (density) plots of the Poisson, binomial, and normal distributions describe exactly this.
Likelihood fixes the observed outcome and asks how plausible each parameter value is. Likelihood values need not sum to 1, and the parameter values need not be mutually exclusive, so only likelihood ratios are meaningful.
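To make the contrast concrete, here is a minimal Python sketch (my own illustration, not from the original post) using a hand-rolled binomial pmf: with the parameter fixed, probabilities over all outcomes sum to 1; with the data fixed, likelihoods over a grid of parameter values do not, and only their ratios carry meaning.

```python
from math import comb

def binom_pmf(k, n, p):
    """Probability of exactly k heads in n tosses with head-probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability: parameter fixed (p = 0.6), outcomes vary.
# The mutually exclusive outcomes k = 0..n have probabilities summing to 1.
n, p = 10, 0.6
probs = [binom_pmf(k, n, p) for k in range(n + 1)]
print(sum(probs))  # ~1.0

# Likelihood: data fixed (7 heads observed), parameter varies.
# These values need not sum to 1; only ratios are meaningful.
grid = [i / 100 for i in range(101)]
liks = [binom_pmf(7, n, q) for q in grid]
print(liks[70] / liks[50])  # likelihood ratio of p = 0.7 vs p = 0.5
```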
As for why L(θ | x) = P(x | θ): that is simply how likelihood is defined, and the Wikipedia article explains it very clearly.
Consider a simple statistical model of a coin flip, with a single parameter p_H that expresses the "fairness" of the coin. This parameter is the probability that a given coin lands heads up ("H") when tossed. p_H can take on any numeric value within the range 0.0 to 1.0. For a perfectly fair coin, p_H = 0.5.
Imagine flipping a coin twice, and observing the following data: two heads in two tosses ("HH"). Assuming that each successive coin flip is IID, then the probability of observing HH is P(HH | p_H = 0.5) = 0.5^2 = 0.25.
Hence: given the observed data HH, the likelihood that the model parameter p_H equals 0.5 is 0.25. Mathematically, this is written as L(p_H = 0.5 | HH) = P(HH | p_H = 0.5) = 0.25.
This is not the same as saying that the probability that p_H = 0.5, given the observation HH, is 0.25. (For that, we could apply Bayes' theorem, which implies that the posterior probability is proportional to the likelihood times the prior probability.)
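The Bayes' theorem step mentioned above can be sketched numerically. The three candidate parameter values and the uniform prior below are made up purely for illustration; the point is that the resulting posterior, unlike the raw likelihoods, does sum to 1:

```python
# Hypothetical discrete prior over three candidate values of p_H.
candidates = [0.3, 0.5, 0.8]
prior = {p: 1 / 3 for p in candidates}

# Likelihood of each candidate given the data HH: L(p | HH) = p^2.
likelihood = {p: p**2 for p in candidates}

# Bayes' theorem: posterior is prior * likelihood, normalized by the evidence.
evidence = sum(prior[p] * likelihood[p] for p in candidates)
posterior = {p: prior[p] * likelihood[p] / evidence for p in candidates}

print(posterior)                # posterior probability of each candidate
print(sum(posterior.values())) # sums to 1, unlike the likelihoods
```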
Suppose that the coin is not a fair coin, but instead has p_H = 0.3. Then the probability of getting two heads is P(HH | p_H = 0.3) = 0.3^2 = 0.09. Hence L(p_H = 0.3 | HH) = 0.09.
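Such likelihood computations follow directly from L(p_H | HH) = P(HH | p_H) = p_H^2, which a one-line sketch (my own) reproduces:

```python
def likelihood_hh(p):
    """Likelihood of head-probability p given the observed data HH."""
    return p**2  # P(HH | p) = p * p for two IID tosses

print(likelihood_hh(0.5))            # 0.25
print(round(likelihood_hh(0.3), 4))  # 0.09
```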
More generally, for each value of p_H, we can calculate the corresponding likelihood L(p_H | HH) = p_H^2. The result of such calculations is displayed in Figure 1.
In Figure 1, the integral of the likelihood over the interval [0, 1] is 1/3. That illustrates an important aspect of likelihoods: likelihoods do not have to integrate (or sum) to 1, unlike probabilities.
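That 1/3 figure can be checked numerically (my own check, assuming the likelihood curve L(p_H | HH) = p_H^2 from the coin example) by integrating over [0, 1] with a simple midpoint rule:

```python
# Midpoint-rule integral of the likelihood L(p | HH) = p^2 over [0, 1].
n = 100_000
dx = 1.0 / n
total = sum(((i + 0.5) * dx) ** 2 for i in range(n)) * dx
print(total)  # close to 1/3, not 1 -- the likelihood curve is not a density
```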