
Transformer

A neural network architecture based on self-attention mechanisms that powers modern language models.

Introduced in the 2017 paper "Attention Is All You Need," transformers process input sequences in parallel rather than sequentially. The self-attention mechanism allows every token to attend to every other token, capturing long-range dependencies in text.
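The attention computation above can be sketched in a few lines of NumPy. This is a minimal illustration of scaled dot-product self-attention, not a production implementation; the function name and toy shapes are chosen here for clarity.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V over a sequence of token vectors."""
    d_k = Q.shape[-1]
    # Similarity of every token to every other token: (seq_len, seq_len)
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax (numerically stabilized) so each row sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors
    return weights @ V

# Toy example: 3 tokens, embedding dimension 4.
# Self-attention uses the same sequence for queries, keys, and values.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Because the score matrix relates every token to every other in one matrix product, the whole sequence is processed in parallel, which is the property that distinguishes transformers from sequential recurrent models.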

Transformers are the foundation of virtually every major LLM. Their parallelizable architecture enables training on massive datasets and scaling to billions of parameters, which is what gives modern AI systems their strong language understanding capabilities.
