What Matters in Transformers? Not All Attention is Needed • arXiv:2406.15786 • Published Jun 22, 2024
Breaking the Memory Barrier: Near Infinite Batch Size Scaling for Contrastive Loss • arXiv:2410.17243 • Published Oct 22, 2024
Forgetting Transformer: Softmax Attention with a Forget Gate • arXiv:2503.02130 • Published Mar 2025