July 2020 Archive
Abstract: 1. Understand the several kinds of attention mechanism. Seq2Seq: from a sequence input to a sequence output. Align & Translate: a potential problem of the vanilla Seq2Seq architecture…
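The idea behind the "Align & Translate" entry above can be sketched in a few lines. This is an assumed illustration (not code from the post): instead of compressing the whole source sentence into one fixed vector, the decoder re-weights all encoder states at each step via dot-product attention.

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes for illustration only.
seq_len, hidden = 6, 8
encoder_states = torch.randn(seq_len, hidden)  # one state per source token
decoder_state = torch.randn(hidden)            # current decoder hidden state

scores = encoder_states @ decoder_state        # alignment scores, shape (seq_len,)
weights = F.softmax(scores, dim=0)             # attention distribution, sums to 1
context = weights @ encoder_states             # weighted sum of encoder states, shape (hidden,)
```

The `context` vector is recomputed at every decoding step, so each target word can attend to different parts of the source sequence.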
Abstract: 2. A question to think about: when writing the introduction, should you frame it as a domain problem or as a general method? If the data are special, frame it as a domain problem, but you must point out what is distinctive about this kind of domain data and the scientific challenge — e.g., how anomaly detection in energy data differs from other settings and why traditional methods fail. Innovating on general methods is harder. An introduction covers two aspects: mot…
Abstract: References: https://www.zhihu.com/question/20525198
Abstract: The return value of PyTorch's nn.LSTM is: output, (h_n, c_n). Outputs: output, (h_n, c_n); output (seq_len, batch, hidden_size * num_directions): tensor containing…
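The shapes in the LSTM entry above can be verified directly. A minimal sketch (hypothetical sizes): `output` collects the top layer's hidden state at every time step, while `h_n` and `c_n` are the final hidden and cell states of every layer and direction.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions chosen for illustration.
seq_len, batch, input_size, hidden_size = 5, 3, 10, 20
num_layers, num_directions = 2, 2  # bidirectional=True gives 2 directions

lstm = nn.LSTM(input_size, hidden_size, num_layers=num_layers, bidirectional=True)

x = torch.randn(seq_len, batch, input_size)   # default layout: (seq_len, batch, input_size)
output, (h_n, c_n) = lstm(x)

print(output.shape)  # (seq_len, batch, num_directions * hidden_size)
print(h_n.shape)     # (num_layers * num_directions, batch, hidden_size)
print(c_n.shape)     # same shape as h_n
```

Note that `output` only exposes the last layer; to get final states of earlier layers you must read them from `h_n`/`c_n`.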
Abstract: Humans can recognize new object classes from very few instances. However, most machine learning techniques require thousands of examples to achieve si…