Attention Mechanism / Attention Model / attention

Video tutorials:

https://www.bilibili.com/video/BV1L4411q785?p=3&spm_id_from=pageDriver

https://www.bilibili.com/video/BV1C7411k7Wg?from=search&seid=17393812710874939428

https://www.bilibili.com/video/BV1Nt411N7HN?from=search&seid=17393812710874939428

https://www.bilibili.com/video/BV1C54y147bY/?spm_id_from=333.788.recommend_more_video.6

https://www.bilibili.com/video/BV1X64y1M74z/?spm_id_from=333.788.recommend_more_video.1


Blog posts:

https://blog.csdn.net/weixin_44791964/article/details/104000722?spm=1001.2014.3001.5501


Example code:

https://github.com/bubbliiiing/Keras-Attention/blob/master/Attention_in_LSTM.py
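The linked repository applies an attention layer on top of LSTM outputs in Keras. As a rough illustration of the underlying idea (not the repo's actual code), the sketch below shows dot-product attention pooling over a sequence of hidden states in plain NumPy: each timestep is scored against a query vector, the scores are normalized with softmax, and the hidden states are combined into a single context vector. The function name `attention_pool` and the choice of the last hidden state as the query are assumptions for this example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(hidden_states, query):
    """Weight each timestep's hidden state by its similarity to a query.

    hidden_states: (timesteps, units), e.g. LSTM outputs with
                   return_sequences=True
    query:         (units,), e.g. the final hidden state
    Returns the attention-weighted context vector and the weights.
    """
    # Scaled dot-product scores, one per timestep.
    scores = hidden_states @ query / np.sqrt(query.shape[0])
    # Normalize scores into attention weights that sum to 1.
    weights = softmax(scores)
    # Context vector: weighted sum of the hidden states.
    context = weights @ hidden_states
    return context, weights

# Toy usage: 5 timesteps, 8 hidden units, last state as the query.
rng = np.random.default_rng(0)
h = rng.standard_normal((5, 8))
context, w = attention_pool(h, h[-1])
print(w.shape, context.shape)  # (5,) (8,)
```

A Keras attention layer learns the scoring function (e.g. with a small Dense network) instead of using a fixed dot product, but the normalize-and-weight pattern is the same.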


posted @ 2021-02-19 11:27 by emanlee