Masked Self Attention
Category: techniques

A form of self-attention in which each token may attend only to itself and earlier positions in the sequence. A causal mask sets the attention scores for all future positions to negative infinity before the softmax, so their attention weights become zero and no information leaks from future tokens. This is the attention variant used in decoder-only language models, where each position must predict the next token without seeing it.
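The mechanism above can be sketched in a few lines of NumPy. This is a minimal single-head example, not a production implementation: the projection matrices `w_q`, `w_k`, `w_v` and the function name are illustrative, and batching, multiple heads, and dropout are omitted.

```python
import numpy as np

def masked_self_attention(x, w_q, w_k, w_v):
    """Single-head masked (causal) self-attention.

    x: (seq_len, d_model) token embeddings
    w_q, w_k, w_v: (d_model, d_k) projection matrices
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (seq_len, seq_len)
    # Causal mask: position i may attend only to positions j <= i,
    # so every entry strictly above the diagonal is set to -inf.
    future = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(future, -np.inf, scores)
    # Softmax over each row; exp(-inf) = 0, so future positions get zero weight.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

After the softmax, the weight matrix is lower-triangular: row `i` distributes probability mass only over positions `0..i`, which is exactly the "no future information" property the definition describes.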
Masked Self Attention — Glossary — ThinkLLM