Self-attention

The attention mechanism applied within a single sequence, so that each position attends to every other position: queries, keys, and values are all derived from the same input. Self-attention is the core mechanism that enables transformers to model context and relationships in language.
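
As an illustration only (not part of the original entry), the following is a minimal NumPy sketch of single-head scaled dot-product self-attention; the projection matrices w_q, w_k, and w_v and the toy dimensions are assumptions for the example:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a single sequence.

    x:             (seq_len, d_model) input embeddings
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    """
    # Queries, keys, and values all come from the same sequence (hence "self").
    q = x @ w_q
    k = x @ w_k
    v = x @ w_v

    # Pairwise position-to-position compatibility scores, scaled by sqrt(d_k).
    scores = q @ k.T / np.sqrt(k.shape[-1])

    # Softmax over positions turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    # Each output position is a weighted mix of every position's value vector.
    return weights @ v

# Toy usage: 4 positions, 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```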

See: Attention mechanism; Transformer