Blog Category: NLP

Abstract: ICLR 2020 Trends: Better & Faster Transformers for Natural Language Processing 2020-05-16 11:12:20 Source: http://gsarti.com/post/iclr2020-transformer Read more
posted @ 2020-05-16 11:14 AHU-WangXiao
Abstract: What’s new for Transformers at the ICLR 2020 Conference? 2020-05-07 10:51:22 Source: https://towardsdatascience.com/whats-new-for-transformers-at-the- Read more
posted @ 2020-05-07 10:53 AHU-WangXiao
Abstract: A Survey of Long-Term Context in Transformers 2020-03-17 10:08:32 Source: https://www.pragmatic.ml/a-survey-of-methods-for-incorporating-long-term-con Read more
posted @ 2020-03-17 10:10 AHU-WangXiao
Abstract: BERT-related Papers 2020-03-03 16:36:12 This is a list of BERT-related papers. Any feedback is welcome. Source: https://github.com/tomohideshibata/BER Read more
posted @ 2020-03-03 16:36 AHU-WangXiao
(This post is password-protected.)
posted @ 2020-03-03 15:31 AHU-WangXiao
Abstract: Illustrating the Reformer 2020-03-02 13:39:12 Source: https://towardsdatascience.com/illustrating-the-reformer-393575ac6ba0 See also: Translation in … Read more
posted @ 2020-03-02 13:41 AHU-WangXiao
Abstract: A Structured Self-Attentive Sentence Embedding, ICLR 2017 2018-08-19 14:07:29 Paper: https://arxiv.org/pdf/1703.03130.pdf Code (PyTorch): https://github. Read more
posted @ 2018-08-19 15:44 AHU-WangXiao
