December 2018 Archive

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks-paper
Abstract: Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks. Authors: Kai Sheng Tai (Stanford University), Richard Socher (MetaMind), Ch…
posted @ 2018-12-27 17:38 rosyYY
Parsing with Compositional Vector Grammars-paper
Abstract: What changed relative to the 2012 paper: 1) in the max-margin training objective J, the RNN is replaced by a CVG; the 2012 model merges two word vectors and scores the pair, while this one merges two (word vector, POS) pairs and scores them. 2012: Socher et al. (2012) proposed to give every single…
posted @ 2018-12-27 17:15 rosyYY
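The abstract above contrasts the 2012 composition (one shared matrix applied to two word vectors) with the CVG's category-dependent composition. Below is a minimal numpy sketch of that syntactically untied step; the lookup tables `W`, `v`, and `log_pcfg`, keyed by the children's POS/phrase categories, are hypothetical stand-ins for the paper's learned parameters.

```python
import numpy as np

def cvg_compose(b, c, cat_b, cat_c, W, v, log_pcfg):
    """One syntactically untied composition step, in the spirit of the
    CVG paper (a minimal sketch, not the full model). W, v and log_pcfg
    are hypothetical dicts keyed by the children's category pair."""
    key = (cat_b, cat_c)
    child = np.concatenate([b, c])        # [b; c], shape (2n,)
    p = np.tanh(W[key] @ child)           # parent vector; W[key]: (n, 2n)
    # Score the merge: semantic score plus the log probability of the
    # PCFG rule, so the composition depends on the children's categories.
    score = v[key] @ p + log_pcfg[key]
    return p, score
```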
Reasoning With Neural Tensor Networks For Knowledge Base Completion-paper
Abstract: https://www.socher.org/index.php/Main/ReasoningWithNeuralTensorNetworksForKnowledgeBaseCompletion Year: 2013. https://www.cnblogs.com/wuseguang/p/4168963…
posted @ 2018-12-27 16:01 rosyYY
Semantic Compositionality through Recursive Matrix-Vector Spaces-paper
Abstract: Semantic Compositionality through Recursive Matrix-Vector Spaces. Authors: Richard Socher, Brody Huval, Christopher D. Manning, Andrew Y. Ng. richard@socher.org…
posted @ 2018-12-26 22:20 rosyYY
Parsing Natural Scenes and Natural Language with Recursive Neural Networks-paper
Abstract: Parsing Natural Scenes and Natural Language with Recursive Neural Networks. Authors: Richard Socher richard@socher.org, Cliff Chiung-Yu Lin chiungyu@stanford…
posted @ 2018-12-26 20:19 rosyYY
A first look at tree-lstm
Abstract: https://zhuanlan.zhihu.com/p/35252733 It helps to first look at the examples in the Zhihu article above. In 2012 and 2013, Socher et al. proposed two models that distinguish between word and phrase types: SU-RNN (Syntactically-Untied RNN) and MV-RNN (Matrix-Vector R…
posted @ 2018-12-24 11:37 rosyYY
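For reference, the Child-Sum Tree-LSTM from Tai et al. (the paper in the first entry above) generalizes the LSTM cell to trees: a node sums its children's hidden states and keeps one forget gate per child. A minimal numpy sketch follows; the parameter dicts `W`, `U`, `b` keyed by gate name are assumptions made for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def child_sum_tree_lstm(x, children, W, U, b):
    """One Child-Sum Tree-LSTM node update (after Tai et al. 2015).
    `children` is a list of (h_k, c_k) pairs from the node's children;
    W, U, b are hypothetical per-gate parameter dicts."""
    h_sum = sum(h for h, _ in children) if children else np.zeros_like(b['i'])
    i = sigmoid(W['i'] @ x + U['i'] @ h_sum + b['i'])   # input gate
    o = sigmoid(W['o'] @ x + U['o'] @ h_sum + b['o'])   # output gate
    u = np.tanh(W['u'] @ x + U['u'] @ h_sum + b['u'])   # candidate update
    # One forget gate per child, so the node can selectively keep each
    # child's memory; this is the key difference from a chain LSTM.
    c = i * u
    for h_k, c_k in children:
        f_k = sigmoid(W['f'] @ x + U['f'] @ h_k + b['f'])
        c += f_k * c_k
    h = o * np.tanh(c)
    return h, c
```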
multi-head attention
Abstract: ■ Paper | Attention Is All You Need ■ Link | https://www.paperweekly.site/papers/224 ■ Source | https://github.com/Kyubyong/transformer ■ Paper | Weighted Transfo…
posted @ 2018-12-13 17:45 rosyYY
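Multi-head attention, as defined in Attention Is All You Need, runs scaled dot-product attention in parallel over several subspaces of the model dimension and concatenates the results. A minimal numpy sketch; the learned projections W_Q, W_K, W_V, W_O and masking are omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(Q, K, V, num_heads):
    """Multi-head scaled dot-product attention (a minimal sketch).
    Q, K, V: arrays of shape (seq_len, d_model); d_model % num_heads == 0."""
    seq_len, d_model = Q.shape
    d_k = d_model // num_heads
    # Split the model dimension into heads: (num_heads, seq_len, d_k)
    split = lambda X: X.reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    q, k, v = split(Q), split(K), split(V)
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, per head
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    out = softmax(scores) @ v
    # Concatenate the heads back to (seq_len, d_model)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)
```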
Hierarchical RNN
Abstract: https://blog.csdn.net/liuchonge/article/details/73610734 https://blog.csdn.net/triplemeng/article/details/78269127 -- https://github.com/triplemeng/hi…
posted @ 2018-12-13 16:24 rosyYY
RNN-LSTM-GRU-BIRNN
Abstract: https://blog.csdn.net/wangyangzhizhou/article/details/76651116 (a three-part series). When an RNN is unrolled, the hidden layers at different time steps are connected to one another, and every recurrent network has a repeating module. The RNN's repeating module is very simple, as in the figure below: for example, a single tanh layer. The LSTM's repeating…
posted @ 2018-12-13 14:31 rosyYY
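The truncated sentence above contrasts the two repeating modules: one tanh layer for the vanilla RNN versus four interacting gate layers for the LSTM. A minimal numpy sketch of both steps; the stacked parameter shapes are assumptions made for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rnn_step(x, h, Wx, Wh, b):
    """Vanilla RNN repeating module: just a single tanh layer."""
    return np.tanh(Wx @ x + Wh @ h + b)

def lstm_step(x, h, c, Wx, Wh, b):
    """LSTM repeating module: four interacting layers. Wx, Wh, b are
    hypothetical stacked parameters for all four gates, so Wx @ x has
    shape (4 * hidden,)."""
    z = Wx @ x + Wh @ h + b
    i, f, o, g = np.split(z, 4)                   # one slice per gate
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input/forget/output gates
    c_new = f * c + i * np.tanh(g)                # update the cell state
    h_new = o * np.tanh(c_new)                    # expose gated cell state
    return h_new, c_new
```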