BERT introductory materials
Transformer: explanations of Attention Is All You Need (a minimal attention sketch follows the references below)
Reference slides: http://www.isclab.org.cn/wp-content/uploads/2018/12/Transformer%E4%B8%AD%E7%9A%84Multi-Head-Attention-%E7%8E%8B%E7%9D%BF%E6%80%A1-2018.12.9-19_00_00.pdf
Reference (Zhihu): https://zhuanlan.zhihu.com/p/46990010
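The slides above focus on multi-head attention. As a quick refresher, here is a minimal sketch of the scaled dot-product attention at its core, written with NumPy; the function name and toy shapes are illustrative assumptions, not code from the references.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) matrices of queries, keys, values.
    d_k = Q.shape[-1]
    # Similarity scores, scaled by sqrt(d_k) to keep the softmax well-behaved.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted sum of the value vectors.
    return weights @ V

# Toy example: 3 tokens with 4-dimensional query/key/value vectors.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)
```

Multi-head attention simply runs several such attentions in parallel on learned projections of Q, K, V and concatenates the results.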
Getting started with BERT
References: https://zhuanlan.zhihu.com/p/49271699
https://zhuanlan.zhihu.com/p/48612853
Using BERT for extractive summarization (see the sketch below)
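The notes do not spell out the approach. One simple unsupervised baseline is to embed each sentence with a pretrained BERT and keep the sentences whose embeddings are closest to the document-level embedding. The sketch below assumes the Hugging Face transformers library and the bert-base-uncased checkpoint (both are assumptions; swap in bert-base-chinese for Chinese text).

```python
import torch
from transformers import BertModel, BertTokenizer

def extractive_summary(sentences, top_k=3, model_name="bert-base-uncased"):
    # Hypothetical helper: rank sentences by similarity of their BERT
    # embedding to the mean document embedding and keep the top_k.
    tokenizer = BertTokenizer.from_pretrained(model_name)
    model = BertModel.from_pretrained(model_name)
    model.eval()

    # Encode all sentences as one padded batch.
    inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (num_sent, seq_len, hidden_dim)

    # Mean-pool token vectors, ignoring padding positions.
    mask = inputs["attention_mask"].unsqueeze(-1).float()
    sent_emb = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

    # Score each sentence against the average of all sentence embeddings.
    doc_emb = sent_emb.mean(dim=0, keepdim=True)
    scores = torch.nn.functional.cosine_similarity(sent_emb, doc_emb)

    # Return the top_k sentences in their original order.
    top = scores.topk(min(top_k, len(sentences))).indices.sort().values.tolist()
    return [sentences[i] for i in top]

print(extractive_summary([
    "BERT is a pretrained language model.",
    "It can be fine-tuned for many tasks.",
    "The weather was nice today.",
], top_k=2))
```

Supervised extractive systems such as BERTSUM instead fine-tune a sentence-level classifier on labeled summaries, so the similarity-based scoring above is only a stand-in.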