Machine Translation Learning Resources
Blog: http://blog.csdn.net/wangxinginnlp/article/details/52944649
Resource: https://arxiv.org/abs/1409.0473
Description: The attention-based NMT paper, the first to introduce an attention mechanism into neural machine translation (a brief sketch of the attention computation is given at the end of this list).
Resource: https://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-with-gpus/
http://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-gpus-part-2/
https://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-gpus-part-3/
Description: Kyunghyun Cho's three-part introduction to neural machine translation on the NVIDIA developer blog.
Resource: https://github.com/lmthang/thesis/blob/master/thesis.pdf
Description: Thang Luong's PhD thesis on neural machine translation (Stanford).
Resource: 基于深度学习的机器翻译研究进展 (Research Progress in Deep Learning Based Machine Translation)
Description: A survey by Yang Liu (Tsinghua University).
Resource: http://www.cipsc.org.cn/qngw/?p=953
Description: A survey by Min Zhang and Deyi Xiong (Soochow University).
Resource: https://sites.google.com/site/acl16nmt/
Description: The ACL 2016 tutorial by Thang Luong, Kyunghyun Cho, and Christopher Manning.
Resource: http://cwmt2016.xjipc.cas.cn/webpub/resource/10020/Image/document/CWMT2016-LiuYang.pdf
Description: The invited talk by Yang Liu (Tsinghua University) at CWMT 2016.
Resource: http://www.cips-cl.org/static/CCL2016/tutorialpdf/T1B_%E6%9C%BA%E5%99%A8%E7%BF%BB%E8%AF%91_part2.pdf
Description: The CCL 2016 tutorial by Yang Liu (Tsinghua University) and Jiajun Zhang (Institute of Automation, Chinese Academy of Sciences).
Resource: https://drive.google.com/file/d/0B16RwCMQqrtdRVotWlQ3T2ZXTmM/view
Description: Kyunghyun Cho's talk "New Territory of Machine Translation", mainly covering the NMT problems that Cho himself focuses on.
Resource: https://cn.aminer.org/archive/5832de5368ab39f745ee299d
Description: Jacob Devlin's NLPCC 2016 tutorial: Efficient Training and Deployment of Large Scale Deep Learning Systems for NLP.
Resource: http://www.cs.cmu.edu/~tbergkir/11711fa16/neubig16afnlp.pdf
Description: Slides by Graham Neubig.
Resource: http://statmt.org/mtma16/uploads/mtma16-neural.pdf
Description: Rico Sennrich's AMTA 2016 tutorial slides.
Resource: http://www.cnblogs.com/zhbzz2007/p/6276712.html
Description: A blog post collected from the web, suitable for beginners.
Resource: http://www.statmt.org/eacl2017/practical-nmt.pdf
Description: Rico Sennrich's EACL 2017 tutorial: Practical Neural Machine Translation.
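For readers starting from the attention paper listed first (https://arxiv.org/abs/1409.0473), here is a minimal sketch of its core computation, written in the paper's own notation (encoder annotations h_j, previous decoder state s_{i-1}, alignment model a); it is a summary, not a verbatim excerpt:

e_{ij} = a(s_{i-1}, h_j)                              % alignment score from a small feed-forward network
\alpha_{ij} = \exp(e_{ij}) / \sum_{k} \exp(e_{ik})    % attention weights: softmax over source positions
c_i = \sum_{j} \alpha_{ij} h_j                        % context vector used when predicting target word i

The per-step context vector c_i replaces the single fixed-length sentence vector of earlier encoder-decoder models, which is the key change the paper introduces.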