February 26, 2016

Abstract: A Markov chain is a stochastic process where we transition from one state to another state using a simple sequential procedure. We start a Markov chain …
posted @ 2016-02-26 21:14 chaseblack
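The abstract above describes a Markov chain as a simple sequential procedure that moves from state to state. As a minimal sketch of that idea (not taken from the original post), the snippet below simulates a two-state chain; the state names and the transition matrix `P` are illustrative assumptions.

```python
# Minimal sketch: simulate a discrete-state Markov chain by repeatedly
# sampling the next state from a transition matrix. States and
# probabilities here are illustrative assumptions, not from the post.
import numpy as np

states = ["sunny", "rainy"]
# P[i, j] = probability of moving from state i to state j
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

def run_chain(start_state, n_steps, rng=None):
    """Run the chain for n_steps, returning the visited state indices."""
    rng = np.random.default_rng() if rng is None else rng
    path = [start_state]
    for _ in range(n_steps):
        current = path[-1]
        # Sequential procedure: the next state depends only on the current one.
        next_state = rng.choice(len(states), p=P[current])
        path.append(next_state)
    return path

path = run_chain(start_state=0, n_steps=10)
print([states[i] for i in path])
```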
Abstract: The application of probabilistic models to data often leads to inference problems that require the integration of complex, high-dimensional distributions …
posted @ 2016-02-26 18:19 chaseblack
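The second abstract motivates sampling methods for integrating complex, high-dimensional distributions. As a hedged sketch of one standard approach of that kind, the snippet below implements a basic Metropolis sampler; the target density (a standard normal) and the proposal width are illustrative assumptions, not details from the original post.

```python
# Minimal Metropolis sampler sketch for a one-dimensional target density.
# The target and proposal settings are illustrative assumptions.
import numpy as np

def log_target(x):
    # Unnormalized log-density of a standard normal target.
    return -0.5 * x**2

def metropolis(n_samples, proposal_width=1.0, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = 0.0  # arbitrary starting point
    for i in range(n_samples):
        proposal = x + proposal_width * rng.standard_normal()
        # Accept with probability min(1, target(proposal) / target(x)).
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

draws = metropolis(5000)
print(draws.mean(), draws.std())  # should be close to 0 and 1
```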
