Abstract: 1. Visualize the attention weights of multiple heads in this experiment. from matplotlib import pyplot as plt; out = attention.attention.attention_weights.d…
posted @ 2021-05-27 17:37 by 哈哈哈喽喽喽 · Views(80) Comments(0) Recommended(0)
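The post's code fragment above is cut off at `attention_weights.d…`, so the sketch below uses randomly generated, softmax-normalized weights as a stand-in for the real multi-head attention weights (the head count and grid size are assumptions, not the experiment's actual values). It only illustrates the plotting side of the exercise: one heatmap per head.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
from matplotlib import pyplot as plt

# Hypothetical stand-in for real multi-head attention weights:
# 4 heads, each a 6-query x 6-key grid, rows softmax-normalized
# the way genuine attention weights would be.
rng = np.random.default_rng(0)
scores = rng.normal(size=(4, 6, 6))
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)

# One heatmap per head, sharing a common color scale for comparison.
fig, axes = plt.subplots(1, 4, figsize=(12, 3))
for head, ax in enumerate(axes):
    ax.imshow(weights[head], cmap="Reds", vmin=0, vmax=1)
    ax.set_title(f"Head {head}")
    ax.set_xlabel("Keys")
    ax.set_ylabel("Queries")
fig.tight_layout()
fig.savefig("multihead_attention.png")
```

In the actual experiment one would slice the model's stored `attention_weights` tensor per head instead of generating random scores; the plotting loop stays the same.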
Abstract: 1. Modify keys in the toy example and visualize attention weights. Do additive attention and scaled dot-product attention still output the same attention weights? Why? …
posted @ 2021-05-27 17:32 by 哈哈哈喽喽喽 · Views(44) Comments(0) Recommended(0)
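The exercise above compares the two scoring functions on a toy example. A minimal NumPy sketch of both, with assumed shapes and randomly initialized (untrained) additive-attention parameters, shows why their weights generally differ: scaled dot-product scoring is fixed, while additive scoring depends on learned parameters.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

# Toy setup (shapes are assumptions, not the book's exact code):
# one query and 10 keys of dimension d = 8.
rng = np.random.default_rng(0)
q = rng.normal(size=(1, 8))
K = rng.normal(size=(10, 8))

# Scaled dot-product attention: softmax(q K^T / sqrt(d)).
dot_weights = softmax(q @ K.T / np.sqrt(K.shape[1]))

# Additive attention: w_v^T tanh(W_q q + W_k k), with random
# (untrained) parameters, so its weights need not match the above.
W_q = rng.normal(size=(8, 8))
W_k = rng.normal(size=(8, 8))
w_v = rng.normal(size=(8,))
scores = np.tanh(q @ W_q + K @ W_k) @ w_v      # shape (10,)
add_weights = softmax(scores[None, :])
```

Both outputs are valid distributions over the 10 keys, but only by coincidence would they agree: the additive scorer passes the query and keys through learned projections and a tanh, while the dot-product scorer uses their raw inner product.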
Abstract: 2. What is the value of our learned w in the parametric attention pooling experiment? Why does it make the weighted region sharper when visualizing the attention weights? …
posted @ 2021-05-27 17:30 by 哈哈哈喽喽喽 · Views(84) Comments(0) Recommended(0)
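The sharpening effect asked about above can be seen directly from the parametric (Nadaraya-Watson style) pooling formula, where each weight is proportional to exp(-((x - x_i) * w)^2 / 2): a larger |w| narrows the Gaussian kernel, so weight concentrates on the keys nearest the query. The sketch below uses illustrative values of w, not the experiment's learned one.

```python
import numpy as np

def attention_weights(query, keys, w):
    # weight_i ∝ exp(-((query - key_i) * w)^2 / 2), normalized to sum to 1.
    scores = -(((query - keys) * w) ** 2) / 2
    scores -= scores.max()          # numerical stability
    e = np.exp(scores)
    return e / e.sum()

# 50 evenly spaced keys and one query; w values are illustrative.
keys = np.linspace(0, 5, 50)
query = 2.5
w_small = attention_weights(query, keys, w=1.0)
w_large = attention_weights(query, keys, w=10.0)
```

With the larger w the distribution is visibly more peaked around the query, which is exactly why the heatmap's weighted region looks sharper when w has been learned to be large.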
Abstract: 1. What can be the volitional cue when decoding a sequence token by token in machine translation? What are the nonvolitional cues and the sensory inputs? …
posted @ 2021-05-27 17:26 by 哈哈哈喽喽喽 · Views(71) Comments(0) Recommended(0)