NLP Learning References

Matrix Derivatives for Neural Network Backpropagation

https://zhuanlan.zhihu.com/p/83859554?from_voters_page=true
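
A quick autograd check of the central identity such derivations produce (a sketch of my own, not code from the article): for y = Wx and a scalar loss L with upstream gradient g = dL/dy, the matrix gradient is dL/dW = g x^T.

    import torch

    x = torch.randn(4)                         # input vector
    W = torch.randn(3, 4, requires_grad=True)
    y = W @ x                                  # y = Wx
    g = torch.randn(3)                         # pretend upstream gradient dL/dy
    y.backward(g)                              # autograd fills in W.grad

    manual = torch.outer(g, x)                 # the derived formula: dL/dW = g x^T
    print(torch.allclose(W.grad, manual))      # True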

Word Embedding Vectors: Principles and Generation Methods

https://www.sohu.com/a/210757729_826434
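
As a reminder of the mechanics (my own minimal sketch, not code from the article): an embedding layer is just a trainable lookup table from token ids to dense vectors.

    import torch
    import torch.nn as nn

    vocab_size, embed_dim = 10000, 128
    embedding = nn.Embedding(vocab_size, embed_dim)   # trainable lookup table

    token_ids = torch.tensor([[3, 17, 42]])           # one 3-token sentence
    vectors = embedding(token_ids)
    print(vectors.shape)                              # torch.Size([1, 3, 128])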

LSTM Explained in Detail

https://blog.csdn.net/qian99/article/details/88628383
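
For reference alongside the article, the shapes involved when calling PyTorch's built-in LSTM (toy dimensions of my own):

    import torch
    import torch.nn as nn

    lstm = nn.LSTM(input_size=16, hidden_size=32, num_layers=1)

    x = torch.randn(5, 1, 16)                   # (seq_len, batch, input_size)
    output, (h_n, c_n) = lstm(x)                # LSTM carries both hidden and cell state
    print(output.shape, h_n.shape, c_n.shape)   # (5,1,32) (1,1,32) (1,1,32)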

[Neural Networks] Study Notes 16: The Attention Mechanism

https://blog.csdn.net/zhuge2017302307/article/details/120025027
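
The core computation in a few lines, scaled dot-product attention (my own from-scratch sketch, not code from the post):

    import math
    import torch

    def scaled_dot_product_attention(q, k, v):
        # similarity of each query to each key, scaled by sqrt(d)
        scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
        weights = torch.softmax(scores, dim=-1)   # each row sums to 1
        return weights @ v                        # weighted sum of values

    q = torch.randn(1, 4, 8)   # (batch, queries, dim)
    k = torch.randn(1, 6, 8)   # (batch, keys, dim)
    v = torch.randn(1, 6, 8)
    print(scaled_dot_product_attention(q, k, v).shape)   # torch.Size([1, 4, 8])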

Some Notes on Attention for NLP

https://zhuanlan.zhihu.com/p/35739040
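
The post surveys several scoring functions; as one concrete instance, here is a sketch of additive (Bahdanau-style) attention, with all dimensions being toy choices of my own:

    import torch
    import torch.nn as nn

    class AdditiveAttention(nn.Module):
        """score(s, h) = v^T tanh(W1 s + W2 h)"""
        def __init__(self, dim):
            super().__init__()
            self.W1 = nn.Linear(dim, dim, bias=False)
            self.W2 = nn.Linear(dim, dim, bias=False)
            self.v = nn.Linear(dim, 1, bias=False)

        def forward(self, s, h):
            # s: (batch, 1, dim) decoder state; h: (batch, seq, dim) encoder states
            scores = self.v(torch.tanh(self.W1(s) + self.W2(h)))   # (batch, seq, 1)
            weights = torch.softmax(scores, dim=1)
            return (weights * h).sum(dim=1)                        # context vector

    attn = AdditiveAttention(16)
    print(attn(torch.randn(2, 1, 16), torch.randn(2, 7, 16)).shape)   # (2, 16)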

GRU Networks in Deep Learning

https://www.cnblogs.com/jiangxinyang/p/9376021.html
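
For comparison with the LSTM entry above, a usage sketch with the same toy dimensions: the GRU merges gates and drops the separate cell state.

    import torch
    import torch.nn as nn

    gru = nn.GRU(input_size=16, hidden_size=32, num_layers=1)

    x = torch.randn(5, 1, 16)        # (seq_len, batch, input_size)
    output, h_n = gru(x)             # hidden state only, no cell state
    print(output.shape, h_n.shape)   # (5, 1, 32) (1, 1, 32)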

A tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the "echo state network" approach

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.378.4095&rep=rep1&type=pdf
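
A minimal echo state network in the spirit of the tutorial (a sketch of my own; reservoir size, spectral radius, and ridge coefficient are arbitrary toy values): the recurrent weights stay fixed and random, and only a linear readout is fit.

    import torch

    torch.manual_seed(0)
    n_in, n_res, n_steps = 1, 100, 500

    # Fixed random reservoir, rescaled so its spectral radius is below 1
    W_in = torch.randn(n_res, n_in) * 0.1
    W_res = torch.randn(n_res, n_res)
    W_res *= 0.9 / torch.linalg.eigvals(W_res).abs().max()

    u = torch.sin(torch.linspace(0, 20, n_steps)).unsqueeze(1)   # input signal
    target = u.roll(-1, dims=0)                                  # predict the next value

    # Run the reservoir and collect its states
    h = torch.zeros(n_res)
    states = []
    for t in range(n_steps):
        h = torch.tanh(W_in @ u[t] + W_res @ h)
        states.append(h)
    H = torch.stack(states)                                      # (n_steps, n_res)

    # Ridge-regression readout: (H^T H + lam*I) W_out = H^T target
    lam = 1e-4
    W_out = torch.linalg.solve(H.T @ H + lam * torch.eye(n_res), H.T @ target)
    print(((H @ W_out - target) ** 2).mean())                    # training MSE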

The Illustrated Transformer (Complete Version)

https://blog.csdn.net/longxinchen_ml/article/details/86533005
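
As a companion to the walkthrough, a usage sketch of PyTorch's built-in encoder stack (toy sizes of my own; embeddings and positional encodings omitted):

    import torch
    import torch.nn as nn

    layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, dim_feedforward=256)
    encoder = nn.TransformerEncoder(layer, num_layers=2)

    x = torch.randn(10, 2, 64)   # default layout: (seq_len, batch, d_model)
    print(encoder(x).shape)      # (10, 2, 64): self-attention + feed-forward per layer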

Understanding RNNs (Recurrent Neural Networks) in One Article: The Basics

https://zhuanlan.zhihu.com/p/30844905
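
The basics fit in a few lines; a from-scratch sketch (toy sizes of my own) of the recurrence h_t = tanh(W_xh x_t + W_hh h_{t-1} + b):

    import torch

    input_size, hidden_size, seq_len = 8, 16, 5
    W_xh = torch.randn(hidden_size, input_size) * 0.1
    W_hh = torch.randn(hidden_size, hidden_size) * 0.1
    b = torch.zeros(hidden_size)

    h = torch.zeros(hidden_size)                   # initial hidden state
    for x_t in torch.randn(seq_len, input_size):   # one step per token
        h = torch.tanh(W_xh @ x_t + W_hh @ h + b)  # same weights reused at every step
    print(h.shape)                                 # torch.Size([16])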

Understand the Softmax Function in One Minute (Super Simple)

https://blog.csdn.net/lz_peter/article/details/84574716
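
The one-minute version in code (my own sketch), including the max-subtraction trick that keeps exp from overflowing:

    import torch

    def softmax(x, dim=-1):
        x = x - x.max(dim=dim, keepdim=True).values   # shift for numerical stability
        e = torch.exp(x)
        return e / e.sum(dim=dim, keepdim=True)

    logits = torch.tensor([2.0, 1.0, 0.1])
    print(softmax(logits))                   # sums to 1
    print(torch.softmax(logits, dim=-1))     # matches the built-in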

PyTorch's NLLLoss and CrossEntropyLoss Explained in Detail

https://blog.csdn.net/qq_22210253/article/details/85229988
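
The article's key point in runnable form (toy values of my own): CrossEntropyLoss is exactly log_softmax followed by NLLLoss.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    logits = torch.randn(3, 5)                # (batch, classes)
    target = torch.tensor([1, 0, 4])

    ce = nn.CrossEntropyLoss()(logits, target)
    nll = nn.NLLLoss()(F.log_softmax(logits, dim=1), target)
    print(torch.allclose(ce, nll))            # True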

The Difference Between log_softmax and softmax

https://www.cnblogs.com/kanka/p/14690414.html
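
The practical difference, demonstrated with an extreme logit of my own choosing: log composed with softmax underflows, while log_softmax stays finite.

    import torch
    import torch.nn.functional as F

    x = torch.tensor([1000.0, 0.0])
    print(torch.log(F.softmax(x, dim=0)))   # tensor([0., -inf]): exp(-1000) underflows to 0
    print(F.log_softmax(x, dim=0))          # tensor([0., -1000.]): uses x - logsumexp(x)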

How to Draw Beautiful Neural Network Diagrams

https://zhuanlan.zhihu.com/p/148896017
https://github.com/HarisIqbal88/PlotNeuralNet
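
PlotNeuralNet generates TikZ from Python. A minimal sketch adapted from the demos in the repo's pyexamples/ directory (run it from inside a clone, as those demos do, and check the repo for the current API before relying on the exact signatures):

    import sys
    sys.path.append('../')            # assumes the script sits in pyexamples/
    from pycore.tikzeng import *

    arch = [
        to_head('..'),
        to_cor(),
        to_begin(),
        to_Conv("conv1", 64, 32, offset="(0,0,0)", to="(0,0,0)",
                height=32, depth=32, width=2),
        to_Pool("pool1", offset="(0,0,0)", to="(conv1-east)"),
        to_end(),
    ]

    to_generate(arch, 'my_net.tex')   # then compile the .tex with pdflatex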

RNN Text Generation: Writing Poems for My Girlfriend (Part 1)

https://blog.csdn.net/MrHanTalk/article/details/119896610
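
The generation loop itself is short; a sketch of temperature-controlled sampling, where model, char2id, and id2char are hypothetical stand-ins for whatever the article trains (the (logits, hidden) interface is also an assumption):

    import torch

    def generate(model, char2id, id2char, prefix, length=20, temperature=0.8):
        model.eval()
        ids = [char2id[c] for c in prefix]
        h = None                                    # hidden state carried across steps
        with torch.no_grad():
            for _ in range(length):
                x = torch.tensor([[ids[-1]]])       # feed the last character back in
                logits, h = model(x, h)             # assumed (logits, hidden) interface
                probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
                ids.append(torch.multinomial(probs, 1).item())
        return ''.join(id2char[i] for i in ids)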

PyTorch's Commonly Used Cross-Entropy Loss CrossEntropyLoss() Explained in Detail

https://zhuanlan.zhihu.com/p/98785902
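
And the formula this article unpacks, checked numerically (toy values of my own): CrossEntropyLoss(x, t) is the batch mean of logsumexp(x) - x[t].

    import torch
    import torch.nn as nn

    logits = torch.randn(3, 5)
    target = torch.tensor([2, 0, 3])

    ce = nn.CrossEntropyLoss()(logits, target)
    manual = (logits.logsumexp(dim=1) - logits[torch.arange(3), target]).mean()
    print(torch.allclose(ce, manual))   # True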

posted @ 2022-03-13 19:48 by 裏表異体