Eat Your Meal One Bite at a Time

Machine Learning Daily (机器学习日报)

http://feeds.memect.com/ml.rss.xml

Most cited deep learning papers (GitHub)

https://github.com/terryum/awesome-deep-learning-papers

Classic Papers

Classic papers (1997–2009) that brought about the deep learning era

  • Learning deep architectures for AI (2009), Y. Bengio. [pdf]
  • Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations (2009), H. Lee et al. [pdf]
  • Greedy layer-wise training of deep networks (2007), Y. Bengio et al. [pdf]
  • Reducing the dimensionality of data with neural networks (2006), G. Hinton and R. Salakhutdinov. [pdf]
  • A fast learning algorithm for deep belief nets (2006), G. Hinton et al. [pdf]
  • Gradient-based learning applied to document recognition (1998), Y. LeCun et al. [pdf]
  • Long short-term memory (1997), S. Hochreiter and J. Schmidhuber. [pdf]

Papers Worth Reading

Newly released papers that do not meet the criteria above but are worth reading

  • Layer Normalization (2016), J. Ba et al. (Hinton) [pdf]
  • Dueling network architectures for deep reinforcement learning (2016), Z. Wang et al. (DeepMind) [pdf]
  • Learning to learn by gradient descent by gradient descent (2016), M. Andrychowicz et al. (DeepMind) [pdf]
  • Identity Mappings in Deep Residual Networks (2016), K. He et al. (Microsoft) [pdf]
  • Adversarially learned inference (2016), V. Dumoulin et al. [web][pdf]
  • Understanding convolutional neural networks (2016), J. Koushik [pdf]
  • SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <1MB model size (2016), F. Iandola et al. [pdf]
  • Learning to compose neural networks for question answering (2016), J. Andreas et al. [pdf]
  • Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection (2016), S. Levine et al. (Google) [pdf]
  • Taking the human out of the loop: A review of bayesian optimization (2016), B. Shahriari et al. [pdf]
  • EIE: Efficient inference engine on compressed deep neural network (2016), S. Han et al. [pdf]
  • Adaptive Computation Time for Recurrent Neural Networks (2016), A. Graves [pdf]

Survey / Review

  • Deep learning (Book, 2016), Goodfellow et al. (Bengio) [html]
  • Deep learning (2015), Y. LeCun, Y. Bengio and G. Hinton [pdf]
  • Deep learning in neural networks: An overview (2015), J. Schmidhuber [pdf]
  • Representation learning: A review and new perspectives (2013), Y. Bengio et al. [pdf]

Theory / Future

  • Distilling the knowledge in a neural network (2015), G. Hinton et al. [pdf]
  • Deep neural networks are easily fooled: High confidence predictions for unrecognizable images (2015), A. Nguyen et al. [pdf]
  • How transferable are features in deep neural networks? (2014), J. Yosinski et al. (Bengio) [pdf]
  • Return of the devil in the details: delving deep into convolutional nets (2014), K. Chatfield et al. [pdf]
  • Why does unsupervised pre-training help deep learning? (2010), D. Erhan et al. (Bengio) [pdf]
  • Understanding the difficulty of training deep feedforward neural networks (2010), X. Glorot and Y. Bengio [pdf]


Representation Learning

http://blog.csdn.net/zouxy09/article/details/10077055


It is well known that the performance of machine learning methods depends heavily on the choice of data representation (or features). For precisely this reason, most of the effort in making a machine learning algorithm work goes into preprocessing and transforming the data. Such feature engineering is important but time-consuming and labor-intensive, and it exposes a weakness of current learning algorithms: they are unable to extract and organize the discriminative information in the data on their own. Feature engineering compensates for that weakness with human ingenuity and prior knowledge. To broaden the reach of machine learning, we need to reduce the dependence of learning algorithms on feature engineering. That would let us build new applications faster and, more importantly, take a major step toward artificial intelligence (AI). The most basic capability of an AI is to understand the world around us, and we believe this can be achieved only when it learns to identify and disentangle the explanatory factors hidden in the low-level sensory data it observes.
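To make the contrast concrete, here is a minimal sketch (an assumed example, not from the original post; scikit-learn, its digits dataset, and the chosen statistics are my own choices): the same classifier is trained once on hand-engineered summary statistics and once on a representation learned from the data itself, with PCA standing in for richer deep representation learners.

```python
# Hand-engineered features vs. a learned representation (a minimal sketch;
# PCA stands in here for deeper representation learners).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def manual_features(X):
    # A human guesses which summary statistics of the pixels matter.
    return np.column_stack([X.mean(axis=1), X.std(axis=1), (X > 8).sum(axis=1)])

clf = LogisticRegression(max_iter=1000)
print("hand-engineered:",
      clf.fit(manual_features(X_train), y_train).score(manual_features(X_test), y_test))

# The representation is learned from the data instead of designed by hand.
pca = PCA(n_components=16).fit(X_train)
print("learned (PCA):",
      clf.fit(pca.transform(X_train), y_train).score(pca.transform(X_test), y_test))
```

On this toy task the learned 16-dimensional representation will typically beat the three hand-picked statistics, which is exactly the point of the paragraph above.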


Representation learning (also known in the community as deep learning or feature learning) has carved out territory of its own in machine learning and become a new favorite of academia. Top conferences such as NIPS and ICML now host regular workshops devoted to it, and this year (2013) a new conference was created specifically for it: ICLR (International Conference on Learning Representations). Depth is a major part of the story, but other priors should not be overlooked: prior knowledge can sometimes lend a hand, making it easier to learn better representations, as the next section discusses in detail. The most striking development around representation learning is the significant empirical success it has achieved in both academia and industry. We briefly highlight a few of these below.
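As a small illustration of a prior guiding representation learning, here is an assumed sketch (scikit-learn again; the dataset and hyperparameters are my own): dictionary learning imposes a sparsity penalty on the codes, so most coefficients of the learned representation come out exactly zero.

```python
# A sparsity prior shaping a learned representation (assumed illustration).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import MiniBatchDictionaryLearning

X, _ = load_digits(return_X_y=True)

# alpha sets the strength of the sparsity penalty on the codes.
dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
codes = dico.fit(X).transform(X)

print("code matrix shape:", codes.shape)                      # (1797, 32)
print("fraction of zero coefficients:", np.mean(codes == 0))  # mostly zeros
```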


Learning Tips

http://ml.memect.com/article/machine-learning-guide.html a very detailed collection of resources


http://www.guokr.com/post/512037/ a graduate-student beginner's first-hand experience

http://ourcoders.com/thread/show/2837/ a graduate-student beginner's first-hand experience

http://www.cnblogs.com/tornadomeet/tag/%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0/default.html?page=1 fellow learner tornadomeet's deep learning study notes

http://www.jiqizhixin.com/article/1731 a must-read for beginners: 14 key deep learning terms explained

http://it.sohu.com/20160301/n438988397.shtml machine learning: how to get started and the learning path (with resources); three areas of competence: mathematical foundations, basic algorithms, and programming fundamentals

http://www.cnblogs.com/subconscious/p/4107357.html starting from machine learning: how to understand what machine learning is

http://blog.csdn.net/joeyon1985/article/details/38640755 machine learning: the most concise getting-started guide

http://machinelearningmastery.com/machine-learning-roadmap-your-self-study-guide-to-machine-learning/ The Missing Roadmap to Self-Study Machine Learning


Chinese

http://www.52ml.net/ I Love Machine Learning (我爱机器学习)

http://www.mitbbs.com/bbsdoc/DataSciences.html MITBBS > Computers & Networking > Data Science board

http://www.guokr.com/group/262/ Guokr > Machine Learning group

http://cos.name/cn/forum/22 Capital of Statistics » Statistics World » Data Mining and Machine Learning

http://bbs.byr.cn/#!board/ML_DM BYR Forum (BUPT) >> Academic & Tech >> Machine Learning and Data Mining

English

https://github.com/josephmisiti/awesome-machine-learning a comprehensive catalog of machine learning resources

http://work.caltech.edu/library/ Caltech's machine learning video tutorial library, one video per topic

http://www.kdnuggets.com/ a well-known data mining site

http://www.datasciencecentral.com/ the Data Science Central site


Building Machine Learning Fundamentals

Courses and Books

  1. Andrew Ng's Machine Learning course on Coursera

    Lecture notes: http://cs229.stanford.edu/materials.html (a sketch of the course's opening algorithm follows below)
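Ng's course famously opens with linear regression fit by batch gradient descent; the minimal NumPy sketch below shows that first algorithm (the synthetic data, learning rate, and iteration count are my own assumptions, not from the course materials).

```python
# Linear regression by batch gradient descent, the opening topic of the course
# (synthetic data; hyperparameters are assumed for illustration).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 4.0 + 2.5 * x + rng.normal(0, 1, 100)   # true line plus noise

X = np.column_stack([np.ones_like(x), x])   # design matrix with bias column
theta = np.zeros(2)
lr = 0.01

for _ in range(5000):
    grad = X.T @ (X @ theta - y) / len(y)   # gradient of (mean squared error) / 2
    theta -= lr * grad

print("learned theta:", theta)              # should approach [4.0, 2.5]
```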

Tools

Java: Apache Mahout

posted @ 2017-01-22 11:36 yizhichun