Abstract:
1. Visualize attention weights of multiple heads in this experiment. `from matplotlib import pyplot as plt; out = attention.attention.attention_weights.d…` Read more
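The snippet above is cut off, but the idea can be sketched end to end. The following is a minimal sketch, not the post's actual code: it uses `torch.nn.MultiheadAttention` as a stand-in for the book's attention module, with random inputs, and keeps one weight matrix per head via `average_attn_weights=False` so each head can be plotted separately.

```python
import torch
from torch import nn

# Toy setup (all sizes are illustrative assumptions, not from the post).
torch.manual_seed(0)
embed_dim, num_heads, seq_len, batch = 8, 2, 5, 1
mha = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
x = torch.randn(batch, seq_len, embed_dim)

# average_attn_weights=False returns per-head weights with shape
# (batch, num_heads, query_len, key_len) instead of the head average.
_, weights = mha(x, x, x, average_attn_weights=False)
print(weights.shape)  # torch.Size([1, 2, 5, 5])

# Each head's rows are softmax distributions over the keys.
row_sums = weights.sum(dim=-1)
assert torch.allclose(row_sums, torch.ones(batch, num_heads, seq_len), atol=1e-5)

# One heatmap per head, as in the truncated matplotlib snippet.
from matplotlib import pyplot as plt
fig, axes = plt.subplots(1, num_heads)
for h, ax in enumerate(axes):
    ax.imshow(weights[0, h].detach(), cmap='Reds')
    ax.set_title(f'head {h}')
    ax.set_xlabel('key position')
    ax.set_ylabel('query position')
```

With a trained model you would pass the real queries/keys/values instead of `x` three times; the plotting loop stays the same.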
Abstract:
1. Modify keys in the toy example and visualize the attention weights. Do additive attention and scaled dot-product attention still output the same attenti… Read more
Abstract:
2. What is the value of our learned w in the parametric attention pooling experiment? Why does it make the weighted region sharper when visualizing th… Read more
Abstract:
1. What can be the volitional cue when decoding a sequence token by token in machine translation? What are the nonvolitional cues and the sensory inpu… Read more