July 2020 Archive

Summary: 1. Understand the several kinds of attention mechanism. Seq2Seq: from a sequence input to a sequence output. Align & Translate; a potential problem of the vanilla Seq2Seq archite… Read more
posted @ 2020-07-29 03:34 keeps_you_warm
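The "Align & Translate" idea the teaser names is usually additive (Bahdanau-style) attention, which addresses the vanilla Seq2Seq bottleneck of a single fixed-length context vector. Below is a minimal, illustrative sketch; the class name, layer sizes, and parameter names are assumptions, not the post's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sketch of additive (Bahdanau-style) attention.
# All names and dimensions here are assumptions for demonstration.
class AdditiveAttention(nn.Module):
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, dec_dim); enc_outputs: (batch, seq_len, enc_dim)
        scores = self.v(torch.tanh(
            self.W_enc(enc_outputs) + self.W_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                       # (batch, seq_len)
        weights = F.softmax(scores, dim=-1)  # alignment over source positions
        # Weighted sum of encoder states -> one context vector per example
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

attn = AdditiveAttention(enc_dim=16, dec_dim=8, attn_dim=12)
context, weights = attn(torch.randn(2, 8), torch.randn(2, 5, 16))
print(context.shape, weights.shape)  # torch.Size([2, 16]) torch.Size([2, 5])
```

Because the decoder recomputes the context at every step, it is no longer forced to compress the whole source sequence into one vector.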
Summary: 2. Think about the question: when writing the introduction, should you frame it around a domain problem or a general method? If the data is special, state the domain problem, but you need to point out what makes this kind of domain data distinctive and its scientific challenge, e.g., how anomaly detection for energy data differs from other settings and why traditional methods fall short. Innovating on a general method is harder. An introduction covers two aspects: mot… Read more
posted @ 2020-07-22 00:09 keeps_you_warm
Summary: References: https://www.zhihu.com/question/20525198 Read more
posted @ 2020-07-21 17:51 keeps_you_warm
Summary: torch.nn.LSTM returns output, (h_n, c_n). output (seq_len, batch, hidden_size * num_directions): tensor contain… Read more
posted @ 2020-07-16 22:41 keeps_you_warm
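The LSTM return values the teaser describes can be checked directly. A minimal sketch, assuming the default `batch_first=False` with a single-layer, unidirectional `torch.nn.LSTM` (the concrete sizes below are illustrative):

```python
import torch
import torch.nn as nn

# Illustrative sizes; any values work, these just make the shapes concrete.
seq_len, batch, input_size, hidden_size = 5, 3, 10, 20
lstm = nn.LSTM(input_size=input_size, hidden_size=hidden_size)

x = torch.randn(seq_len, batch, input_size)
output, (h_n, c_n) = lstm(x)

print(output.shape)  # (seq_len, batch, hidden_size * num_directions)
print(h_n.shape)     # (num_layers * num_directions, batch, hidden_size)
print(c_n.shape)     # same shape as h_n
```

Here `output` holds the hidden state at every time step, while `h_n` and `c_n` hold only the final hidden and cell states.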
Summary: Humans can recognize new object classes from very few instances. However, most machine learning techniques require thousands of examples to achieve si… Read more
posted @ 2020-07-08 22:46 keeps_you_warm
