A collection of BERT resources

1. Seq2seq resources

  • Teacher forcing strategy (a minimal sketch follows the links below)

GitHub - tensorflow/nmt: TensorFlow Neural Machine Translation Tutorial

A brief explanation of Seq2Seq principles and implementation - Zhihu
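The two links above cover the teacher-forcing strategy used when training seq2seq decoders. As a reminder of the core idea, here is a minimal framework-free Python sketch, not code from the linked tutorials; `decoder_step` and the token handling are hypothetical placeholders. With teacher forcing the decoder is fed the ground-truth previous token at each step; otherwise it is fed its own previous prediction.

```python
import random

def decode_sequence(decoder_step, state, target_tokens, teacher_forcing_ratio=0.5):
    """Run a decoder over a target sequence, mixing teacher forcing with
    free-running decoding.

    decoder_step(prev_token, state) -> (predicted_token, new_state) is a
    hypothetical single-step decoder; target_tokens are ground-truth IDs
    starting with a <sos> token.
    """
    predictions = []
    prev_token = target_tokens[0]
    for t in range(1, len(target_tokens)):
        predicted, state = decoder_step(prev_token, state)
        predictions.append(predicted)
        if random.random() < teacher_forcing_ratio:
            prev_token = target_tokens[t]   # teacher forcing: feed the ground truth
        else:
            prev_token = predicted          # free running: feed the model's own output
    return predictions
```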

  • Seq2seq applications (a generic beam-search sketch follows the links below)

LaTeX recognition: Seq2Seq with Attention and Beam Search

Letter sorting: implementing a Seq2Seq model from Encoder to Decoder - Zhihu
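The LaTeX-recognition article above pairs attention with beam search at decoding time. Below is a generic beam-search sketch, not the article's code; `step_log_probs`, `start_id`, and `end_id` are hypothetical stand-ins for one decoder step and the special token IDs.

```python
def beam_search(step_log_probs, start_id, end_id, beam_size=3, max_len=20):
    """Keep the beam_size best partial hypotheses at every step and return
    the highest-scoring completed sequence.

    step_log_probs(prefix) -> {token_id: log_prob} is a hypothetical scoring
    function, e.g. one decoder step conditioned on the prefix.
    """
    beams = [([start_id], 0.0)]            # (token sequence, cumulative log-prob)
    completed = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, lp in step_log_probs(seq).items():
                candidates.append((seq + [tok], score + lp))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_size]:
            if seq[-1] == end_id:
                completed.append((seq, score))   # hypothesis ended at </s>
            else:
                beams.append((seq, score))
        if not beams:
            break
    pool = completed or beams
    return max(pool, key=lambda c: c[1])[0]
```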

  • API usage (a minimal Keras-style sketch follows the link below)

The complete Seq2Seq toolkit in TensorFlow - Zhihu
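The Zhihu article above walks through TensorFlow's dedicated seq2seq components. As a rough reminder of the overall encoder-decoder structure rather than of that specific API, here is a minimal training-time model built only from standard tf.keras layers; the vocabulary and layer sizes are made-up placeholders.

```python
import tensorflow as tf

# Hypothetical sizes; replace with values that match your data.
vocab_size, embed_dim, hidden_units = 8000, 128, 256

# Encoder: embed the source tokens and keep only the final LSTM states.
enc_inputs = tf.keras.layers.Input(shape=(None,), name="encoder_tokens")
enc_emb = tf.keras.layers.Embedding(vocab_size, embed_dim)(enc_inputs)
_, state_h, state_c = tf.keras.layers.LSTM(hidden_units, return_state=True)(enc_emb)

# Decoder: consume the (teacher-forced) target tokens, start from the encoder
# states, and predict the next token at every position.
dec_inputs = tf.keras.layers.Input(shape=(None,), name="decoder_tokens")
dec_emb = tf.keras.layers.Embedding(vocab_size, embed_dim)(dec_inputs)
dec_out, _, _ = tf.keras.layers.LSTM(
    hidden_units, return_sequences=True, return_state=True
)(dec_emb, initial_state=[state_h, state_c])
logits = tf.keras.layers.Dense(vocab_size)(dec_out)

model = tf.keras.Model([enc_inputs, dec_inputs], logits)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.summary()
```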

2. Understanding the BERT model

  • Understanding the FFN (a NumPy sketch of the formula follows the link below)

https://blog.csdn.net/u013166817/article/details/85837124
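The CSDN post above is about the position-wise feed-forward network inside each Transformer layer, FFN(x) = max(0, xW1 + b1)W2 + b2. A NumPy sketch with toy dimensions (the original paper uses d_model = 512 and d_ff = 2048):

```python
import numpy as np

def position_wise_ffn(x, w1, b1, w2, b2):
    """FFN(x) = max(0, x W1 + b1) W2 + b2, applied independently at each position."""
    return np.maximum(0.0, x @ w1 + b1) @ w2 + b2

d_model, d_ff, seq_len = 8, 32, 5          # toy sizes for illustration only
rng = np.random.default_rng(0)
x = rng.normal(size=(seq_len, d_model))
w1, b1 = rng.normal(size=(d_model, d_ff)), np.zeros(d_ff)
w2, b2 = rng.normal(size=(d_ff, d_model)), np.zeros(d_model)
print(position_wise_ffn(x, w1, b1, w2, b2).shape)   # (5, 8)
```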

  • The Illustrated Transformer (a scaled dot-product attention sketch follows the link below)

The Illustrated Transformer (Chinese translation) - CSDN blog by 于建民
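The Illustrated Transformer builds up to scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A short NumPy sketch of that formula, with toy shapes only:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """softmax(Q K^T / sqrt(d_k)) V, the core operation behind self-attention."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (len_q, len_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 16))   # 4 query positions, d_k = 16
k = rng.normal(size=(6, 16))   # 6 key positions
v = rng.normal(size=(6, 16))
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)   # (4, 16) (4, 6)
```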

  • Google BERT explained in detail (a masked-LM masking sketch follows the link below)

[NLP] The Google BERT model explained in detail - Zhihu
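The Zhihu article explains BERT's masked-language-model pre-training: 15% of input tokens are selected, and of those 80% are replaced with [MASK], 10% with a random token, and 10% are kept unchanged. A rough sketch of that masking scheme; the toy vocabulary and whitespace tokenisation are placeholders, not BERT's WordPiece pipeline.

```python
import random

MASK = "[MASK]"
TOY_VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]

def mlm_mask(tokens, select_prob=0.15, seed=None):
    """Return (corrupted tokens, labels); labels are None where no prediction is needed."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < select_prob:
            labels.append(tok)                           # the model must recover this token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)                   # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(TOY_VOCAB))  # 10%: replace with a random token
            else:
                corrupted.append(tok)                    # 10%: keep the original token
        else:
            corrupted.append(tok)
            labels.append(None)
    return corrupted, labels

print(mlm_mask("the cat sat on the mat".split(), seed=42))
```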

  • Text summarization

text-summarization-tensorflow

TensorFlow automatic summarization: a Textsum model based on Seq2Seq + Attention - CSDN blog by rockingdingo

3. Natural language processing visualization

  • Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention); an attention-heatmap sketch follows the link below

https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
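Jay Alammar's post animates the attention weights between source and target words ("je suis étudiant" → "i am a student"). A quick static equivalent for your own model is to plot the attention matrix as a heatmap; the weights below are random placeholders standing in for real decoder attention.

```python
import numpy as np
import matplotlib.pyplot as plt

src = ["je", "suis", "étudiant", "</s>"]
tgt = ["i", "am", "a", "student", "</s>"]

# Placeholder attention weights (rows: target steps, columns: source tokens).
rng = np.random.default_rng(0)
attn = rng.random((len(tgt), len(src)))
attn /= attn.sum(axis=1, keepdims=True)   # normalise each row like a softmax

fig, ax = plt.subplots()
im = ax.imshow(attn, cmap="viridis")
ax.set_xticks(range(len(src)))
ax.set_xticklabels(src)
ax.set_yticks(range(len(tgt)))
ax.set_yticklabels(tgt)
ax.set_xlabel("source tokens")
ax.set_ylabel("target tokens")
fig.colorbar(im, ax=ax)
plt.show()
```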

  • The Illustrated Transformer

https://jalammar.github.io/illustrated-transformer/

  • The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)

The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning) – Jay Alammar
