September 2023 Archive
Summary: From Microsoft (*^____^*). Paper: [2107.00641] Focal Self-attention for Local-Global Interactions in Vision Transformers (arxiv.org). Code: microsoft/Focal-Transforme…
Read more
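The core idea of focal self-attention is that each query attends to nearby tokens at fine granularity and to summarized (pooled) tokens farther away. Below is a minimal single-head PyTorch sketch of that idea, not the paper's windowed, multi-level implementation; the class name `ToyFocalAttention` and the single pooled "focal level" are my simplifications.

```python
import torch
import torch.nn as nn

class ToyFocalAttention(nn.Module):
    """Single-head sketch: each query attends to all fine-grained tokens
    plus a coarse, pooled copy of the feature map (one 'focal' level)."""
    def __init__(self, dim, pool_size=4):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.kv = nn.Linear(dim, 2 * dim)
        self.pool = nn.AvgPool2d(pool_size)   # coarse global level
        self.scale = dim ** -0.5

    def forward(self, x):                     # x: (B, C, H, W), H and W divisible by pool_size
        B, C, H, W = x.shape
        fine = x.flatten(2).transpose(1, 2)                # (B, H*W, C)
        coarse = self.pool(x).flatten(2).transpose(1, 2)   # (B, h*w, C)
        q = self.q(fine)                                   # queries from fine tokens
        k, v = self.kv(torch.cat([fine, coarse], dim=1)).chunk(2, dim=-1)
        attn = ((q @ k.transpose(-2, -1)) * self.scale).softmax(dim=-1)
        out = attn @ v                                     # (B, H*W, C)
        return out.transpose(1, 2).reshape(B, C, H, W)
```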
Summary: From CVPR 2022, shunted self-attention via multi-scale token aggregation. Paper: [2111.15193] Shunted Self-Attention via Multi-Scale Token Aggregation (arxiv.org). Project: https://github.com/OliverRensu…
Read more
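Shunted self-attention lets different heads aggregate keys and values at different downsampling rates, so one head sees fine-grained tokens while another sees coarse context. A hedged PyTorch sketch of that per-head multi-scale idea, assuming average pooling for the token aggregation (the paper uses learned downsampling) and the toy name `ToyShuntedAttention`:

```python
import torch
import torch.nn as nn

class ToyShuntedAttention(nn.Module):
    """Multi-head sketch: each head computes keys/values at its own
    downsampling rate, mixing fine detail and coarse context."""
    def __init__(self, dim, rates=(1, 2)):
        super().__init__()
        assert dim % len(rates) == 0
        self.head_dim = dim // len(rates)
        self.scale = self.head_dim ** -0.5
        self.q = nn.Linear(dim, dim)
        # one K/V projection (with its own pooling rate) per head
        self.kv = nn.ModuleList([nn.Linear(dim, 2 * self.head_dim) for _ in rates])
        self.pools = nn.ModuleList(
            [nn.AvgPool2d(r) if r > 1 else nn.Identity() for r in rates])

    def forward(self, x):                      # x: (B, C, H, W)
        B, C, H, W = x.shape
        tokens = x.flatten(2).transpose(1, 2)  # (B, H*W, C)
        q = self.q(tokens).view(B, -1, len(self.kv), self.head_dim)
        outs = []
        for i, (pool, kv) in enumerate(zip(self.pools, self.kv)):
            t = pool(x).flatten(2).transpose(1, 2)   # tokens at this head's rate
            k, v = kv(t).chunk(2, dim=-1)            # (B, N_i, head_dim)
            attn = ((q[:, :, i] @ k.transpose(-2, -1)) * self.scale).softmax(dim=-1)
            outs.append(attn @ v)                    # (B, H*W, head_dim)
        out = torch.cat(outs, dim=-1)                # concatenate heads
        return out.transpose(1, 2).reshape(B, C, H, W)
```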
Summary: From Kuaishou (ฅ′ω`ฅ). Paper: [2106.05786] CAT: Cross Attention in Vision Transformer (arxiv.org). Project: https://github.com/linhezheng19/CAT 1. Abstract: Since the Transformer…
Read more
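CAT alternates attention computed inside local patches with attention computed across patches. The sketch below captures that alternation loosely: self-attention within each non-overlapping patch, then attention over per-patch summary tokens. `ToyCrossAttentionBlock`, the patch-mean summarization, and the residual broadcast are illustrative assumptions of mine, not the paper's actual inner-patch/cross-patch blocks.

```python
import torch
import torch.nn as nn

def attend(q, k, v):
    """Plain scaled dot-product attention."""
    attn = ((q @ k.transpose(-2, -1)) * q.shape[-1] ** -0.5).softmax(dim=-1)
    return attn @ v

class ToyCrossAttentionBlock(nn.Module):
    """Sketch: attention inside each non-overlapping patch (local),
    then attention across per-patch mean tokens (global exchange)."""
    def __init__(self, dim, patch=4):
        super().__init__()
        self.patch = patch
        self.local_qkv = nn.Linear(dim, 3 * dim)
        self.cross_qkv = nn.Linear(dim, 3 * dim)

    def forward(self, x):                      # x: (B, C, H, W), H and W divisible by patch
        B, C, H, W = x.shape
        p = self.patch
        # split into (H/p * W/p) patches of p*p tokens each
        t = x.view(B, C, H // p, p, W // p, p).permute(0, 2, 4, 3, 5, 1)
        t = t.reshape(B * (H // p) * (W // p), p * p, C)
        q, k, v = self.local_qkv(t).chunk(3, dim=-1)
        t = attend(q, k, v)                    # inner-patch attention
        # cross-patch attention over patch-mean summary tokens
        summ = t.mean(dim=1).view(B, -1, C)    # (B, n_patches, C)
        q, k, v = self.cross_qkv(summ).chunk(3, dim=-1)
        summ = attend(q, k, v)
        t = t + summ.reshape(-1, 1, C)         # broadcast summaries back to patches
        t = t.view(B, H // p, W // p, p, p, C).permute(0, 5, 1, 3, 2, 4)
        return t.reshape(B, C, H, W)
```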
Summary: From NUS & NVIDIA. Paper: [2204.12451] Understanding The Robustness in Vision Transformers (arxiv.org). Project: https://github.com/NVlabs/FAN 1. Motivation: CNNs use sliding…
Read more
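FAN's fully attentional design adds channel-wise attention alongside the usual token attention. Below is only a generic sketch of channel self-attention, where the attention map is formed between channels (a C×C map) rather than between spatial tokens; `ToyChannelAttention` is my loose illustration of that pattern, not the FAN block from the NVlabs repo.

```python
import torch
import torch.nn as nn

class ToyChannelAttention(nn.Module):
    """Sketch of channel-wise self-attention: attention weights are
    computed between channels instead of between spatial tokens."""
    def __init__(self, dim):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):                      # x: (B, N, C) token sequence
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # transpose so attention runs over the channel axis
        q, k, v = (t.transpose(-2, -1) for t in (q, k, v))      # (B, C, N)
        attn = (q @ k.transpose(-2, -1)) * q.shape[-1] ** -0.5  # (B, C, C)
        out = attn.softmax(dim=-1) @ v                          # (B, C, N)
        return self.proj(out.transpose(-2, -1))                 # back to (B, N, C)
```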