Self-attention example in Transformers for NLP
Self-attention example in Transformers for CV
Vision transformers’ complexity
The first step of the Swin Transformer architecture, image tokenization
Self-attention applied on windows
Convolution process vs self-attention
The shifted-window approach to the long-range relation problem
Last modification: December 29th, 2021 at 02:19 pm