Blog Category: AI Paper Close Readings
Abstract: [Paper Close Reading] LoRA: Low-Rank Adaptation of Large Language Models. Paper: LoRA: Low-Rank Adaptation of Large Language Models. Year: 2021. Citations: 8000+. Keywords: parameter-efficient fine-tuning of LLMs. Contents: [Paper Close Reading…
Abstract: [Paper Close Reading] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Authors: Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanov…

Abstract: [Paper Close Reading] On the Relationship Between Self-Attention and Convolutional Layers. Authors: Jean-Baptiste Cordonnier, Andreas Loukas, Martin Jaggi. Venue: ICLR 2020…