T5 and Large Language Models

T5

Text-to-Text Transfer Transformer.

Treats every NLP problem as a text-to-text task.

Unified format: task description (prefix) + input sentence -> answer text.
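A minimal sketch of this unified interface using the Hugging Face transformers library and the public t5-small checkpoint (the library and checkpoint are illustrative choices, not from these notes): the same model handles translation, summarization, and GLUE-style classification purely through the task prefix.

```python
# Sketch of T5's text-to-text interface: every task is
# "task prefix + input text -> output text".
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

examples = [
    "translate English to German: The house is wonderful.",      # translation
    "summarize: state authorities dispatched emergency crews on "
    "tuesday to survey the damage after the storm.",             # summarization
    "cola sentence: The course is jumping well.",                 # GLUE CoLA (acceptability)
]

for text in examples:
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```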

Details

Pretraining: an encoder-decoder Transformer (encoder and decoder each roughly BERT-Base sized), trained with a span-corruption denoising objective on the C4 dataset.
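A toy sketch of the span-corruption denoising objective (simplified: real T5 samples span lengths from a distribution and corrupts about 15% of tokens; the <extra_id_*> sentinel names follow T5's vocabulary, everything else here is an illustrative assumption):

```python
import random

def span_corrupt(tokens, corruption_rate=0.15, span_len=3, seed=0):
    """Toy span corruption: replace ~corruption_rate of the tokens, in
    contiguous spans, with sentinel tokens in the input; the target spells
    out what each sentinel hid, followed by a final closing sentinel."""
    rng = random.Random(seed)
    n_spans = max(1, round(len(tokens) * corruption_rate / span_len))
    starts = sorted(rng.sample(range(0, len(tokens) - span_len), n_spans))

    inputs, targets, i, sid = [], [], 0, 0
    while i < len(tokens):
        if sid < n_spans and i >= starts[sid]:
            span = tokens[i:i + span_len]
            inputs.append(f"<extra_id_{sid}>")
            targets += [f"<extra_id_{sid}>"] + span
            i += span_len
            sid += 1
        else:
            inputs.append(tokens[i])
            i += 1
    targets.append(f"<extra_id_{sid}>")   # closing sentinel ends the target
    return " ".join(inputs), " ".join(targets)

source = "Thank you for inviting me to your party last week .".split()
src, tgt = span_corrupt(source)
print(src)   # e.g. "Thank you <extra_id_0> to your party last week ."
print(tgt)   # e.g. "<extra_id_0> for inviting me <extra_id_1>"
```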

Fine-tuning: downstream tasks such as GLUE, CNN/Daily Mail abstractive summarization, and SQuAD, all cast into the same text-to-text format.
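Because every task is text-to-text, fine-tuning reuses the same teacher-forced cross-entropy loss as pretraining; only the (input, target) text pair changes. A hedged single-step sketch (t5-small, the toy summarization pair, and the AdamW settings are illustrative assumptions, not the paper's exact recipe):

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Toy summarization example in text-to-text form.
source = ("summarize: The storm knocked out power across the county "
          "overnight, and crews restored most lines by morning.")
target = "Overnight storm outages were mostly fixed by morning."

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss   # cross-entropy over target tokens
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(float(loss))
```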

mT5

Multilingual T5, pretrained on mC4, a multilingual version of the C4 corpus.

Closed-Book QA

T5.1.1 is pretrained only on unsupervised data, so the knowledge needed to answer questions must be stored in its parameters.

Uses salient span masking (SSM): named entities and dates are masked during continued pretraining, pushing more factual knowledge into the parameters.
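A toy sketch of salient span masking, assuming the entity/date character spans are already given (in practice they come from a named-entity and date tagger; the <extra_id_*> sentinels follow T5's convention, the example sentence and spans are made up for illustration):

```python
def salient_span_mask(text, entity_spans):
    """entity_spans: list of (start_char, end_char) for salient entities/dates.
    Each span is replaced by a sentinel in the input; the target lists the
    hidden spans after their sentinels, plus a final closing sentinel."""
    inputs, targets, last = [], [], 0
    for sid, (start, end) in enumerate(sorted(entity_spans)):
        inputs.append(text[last:start] + f"<extra_id_{sid}>")
        targets.append(f"<extra_id_{sid}> " + text[start:end])
        last = end
    inputs.append(text[last:])
    targets.append(f"<extra_id_{len(entity_spans)}>")
    return "".join(inputs), " ".join(targets)

text = "Franklin D. Roosevelt was born in January 1882."
spans = [(0, 21), (34, 46)]   # "Franklin D. Roosevelt", "January 1882"
src, tgt = salient_span_mask(text, spans)
print(src)   # "<extra_id_0> was born in <extra_id_1>."
print(tgt)   # "<extra_id_0> Franklin D. Roosevelt <extra_id_1> January 1882 <extra_id_2>"
```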
