
Some starting points for deep learning and RNNs

2016-03-28 12:51  GarfieldEr007
Bengio, LeCun, Jordan, Hinton, Schmidhuber, Ng, de Freitas and OpenAI have done Reddit AMAs.  These are nice places to start and to get a sense of the Zeitgeist of the field.
 
Hinton's and Ng's lectures at Coursera, UFLDL, CS224d and CS231n at Stanford, the deep learning course at Udacity, and the summer school at IPAM have excellent tutorials, video lectures and programming exercises that should help you get started.
 
The online book by Nielsen, notes for CS231n, and blogs by Karpathy, Olah and Britz have clear explanations of MLPs, CNNs and RNNs.  The tutorials at UFLDL and deeplearning.net give equations and code. The encyclopaedic book by Goodfellow et al. is a good place to dive into details.  I have a draft book in progress.
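To give a concrete sense of what those notes and tutorials cover, here is a minimal NumPy sketch of a one-hidden-layer MLP forward pass; it is not taken from any of the sources above, and the layer sizes and variable names are purely illustrative:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    np.random.seed(0)
    # Illustrative sizes: 784-dimensional input, 128 hidden units, 10 classes.
    W1, b1 = 0.1 * np.random.randn(784, 128), np.zeros(128)
    W2, b2 = 0.1 * np.random.randn(128, 10), np.zeros(10)

    x = np.random.randn(1, 784)       # a dummy input vector
    h = sigmoid(x.dot(W1) + b1)       # hidden layer activations
    scores = h.dot(W2) + b2           # unnormalized class scores
    probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # softmax

The tutorials linked above derive the corresponding backward pass (backpropagation) and extend the same idea to convolutional and recurrent layers.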
 
Theano, Torch, Caffe, ConvNet, TensorFlow, MXNet, CNTK, Veles, CGT, Neon, Chainer, Blocks and Fuel, Keras, Lasagne, Mocha.jl, Deeplearning4j, DeepLearnToolbox, Currennt, Project Oxford, Autograd (for Torch), Warp-CTC are some of the many deep learning software libraries and frameworks introduced in the last 10 years.  convnet-benchmarks and deepframeworks compare the performance of many existing packages. I am working on an alternative, Knet.jl, written in Julia, which supports CNNs and RNNs on GPUs and makes it easy to develop original architectures.  More software can be found at deeplearning.net. A short sketch of what using one of these frameworks looks like follows below.
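As a flavor of what working with one of the listed frameworks looks like, here is a minimal sketch using Keras; the model shape and hyperparameters are illustrative assumptions, not taken from any of the packages or benchmarks above:

    from keras.models import Sequential
    from keras.layers import Dense

    # A small MLP classifier: 784 inputs -> 64 hidden units -> 10 classes.
    model = Sequential()
    model.add(Dense(64, activation='relu', input_dim=784))
    model.add(Dense(10, activation='softmax'))
    model.compile(optimizer='sgd',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    # model.fit(x_train, y_train, epochs=10, batch_size=32) would train it
    # on suitably shaped NumPy arrays.

Most of the frameworks listed offer a similarly compact way to declare layers and let the library handle gradients and GPU execution.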

Deeplearning.net and the homepages of Bengio and Schmidhuber have further information, background and links.
 
from: http://www.denizyuret.com/2014/11/some-starting-points-for-deep-learning.html