Linear Attention
techniques
An attention mechanism whose time and memory cost scale linearly with sequence length, rather than quadratically as in standard softmax attention. This is typically achieved by replacing the softmax with a kernel feature map φ, so that attention can be computed as φ(Q)(φ(K)ᵀV) instead of softmax(QKᵀ)V: associativity lets the key-value product be formed first, avoiding the n×n attention matrix.
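A minimal NumPy sketch of the idea, using the ELU(x)+1 feature map popularized by kernelized linear attention (the specific map and shapes here are illustrative assumptions, not a particular library's API):

```python
import numpy as np

def feature_map(x):
    # ELU(x) + 1: a simple positive feature map standing in for softmax.
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """Compute phi(Q) @ (phi(K)^T @ V), normalized per query.

    Cost is O(n * d * d_v) in sequence length n, because the
    (d, d_v) key-value summary is formed before touching Q.
    """
    Qf, Kf = feature_map(Q), feature_map(K)
    kv = Kf.T @ V                   # (d, d_v) summary, no n x n matrix
    z = Qf @ Kf.sum(axis=0)         # per-query normalizer, shape (n,)
    return (Qf @ kv) / z[:, None]

rng = np.random.default_rng(0)
n, d = 512, 64
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = linear_attention(Q, K, V)
print(out.shape)  # (512, 64)
```

By associativity, this matches the quadratic form `(φ(Q)φ(K)ᵀ)V` with row-wise normalization, but the n×n matrix is never materialized.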