Why Is Self Attention Called 'Self'?
Table Of Contents:
- Why Is Self Attention Called 'Self'?
- Example Of Self Attention
- Key Difference from Other Attention Types
(1) Why Is Self Attention Called 'Self'?
- We have already learned the attention concept from Luong attention.
- In the Luong attention mechanism, we calculate which word of the encoder input is most important for predicting the current time-step output of the decoder.
- To do this, we assign an attention score to each encoder word and pass the scores as input to the decoder.
- We apply a SoftMax layer to normalize the attention scores so that they sum to 1.
- Mathematically, the same operation is performed in the Self Attention technique.
- Hence self attention is also an attention technique.
- It is called 'self' because we apply the attention technique to the same input sentence: every word attends to the other words of the same sentence.
- In Luong attention, by contrast, the attention technique is performed across two different sequences (the decoder attends to the encoder), as shown in the sketch below.
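
To make the above steps concrete, here is a minimal NumPy sketch of the dot-score variant of Luong attention. All names, dimensions, and values are made up for illustration; this is not the exact formulation from any particular library.

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Illustrative (made-up) values: 4 encoder hidden states and one
# decoder hidden state, each of dimension 3.
rng = np.random.default_rng(0)
encoder_states = rng.random((4, 3))   # one row per encoder word
decoder_state = rng.random(3)         # hidden state at the current decoder step

# Luong "dot" score: relevance of each encoder word to this decoder step.
scores = encoder_states @ decoder_state      # shape: (4,)

# SoftMax normalizes the scores into attention weights that sum to 1.
weights = softmax(scores)                    # shape: (4,)

# The weighted sum of encoder states is the context vector the decoder uses.
context = weights @ encoder_states           # shape: (3,)
print("attention weights:", weights)
print("context vector:", context)
```

Note that the scores are computed between two different sequences here: the decoder's current hidden state and the encoder's hidden states.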
(2) Example Of Self Attention:
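- A minimal sketch of self attention, assuming random embeddings for a 4-word sentence (illustrative values only): the queries, keys, and values all come from the same sentence, so every word attends to every word of that same sentence.

```python
import numpy as np

def softmax(x, axis=-1):
    # Row-wise SoftMax with max-subtraction for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative (made-up) embeddings for a 4-word sentence, dimension 3.
rng = np.random.default_rng(0)
sentence = rng.random((4, 3))   # one row per word

# Queries, keys, and values are all the SAME sentence.
# (Real transformers learn separate Q/K/V projections; they are
# omitted here to keep the sketch minimal.)
scores = sentence @ sentence.T / np.sqrt(sentence.shape[1])  # (4, 4)

# Row i holds how strongly word i attends to every word, itself included.
weights = softmax(scores, axis=-1)

# Each output row is a context-aware representation of the matching word.
output = weights @ sentence     # (4, 3)
print("attention matrix:\n", weights)
```

- Compare this with the Luong sketch in section (1): the math (score, SoftMax, weighted sum) is the same, only the source of the queries and keys changes.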
(3) Key Difference from Other Attention Types:
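- In self attention, one sequence is involved: the sentence attends to itself, giving a context-aware representation of each word.
- In Luong (cross) attention, two different sequences are involved: the decoder's hidden state attends to the encoder's hidden states to align the output with the input.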

