Efficient Long-Context Transformer: Breaking the 1M Token Barrier
Dec 24, 2025
Minghua Li's team presents ELCT, which extends the effective context window to 1 million tokens via sparse attention and hierarchical memory, and reports strong performance on long-document understanding benchmarks.
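The announcement names sparse attention as one ingredient but does not detail ELCT's exact pattern. As a generic illustration only, a minimal causal sliding-window attention (a common sparse-attention scheme, not necessarily ELCT's) can be sketched in NumPy; the function name and `window` parameter are hypothetical:

```python
import numpy as np

def sliding_window_attention(q, k, v, window=4):
    """Toy causal sliding-window attention: each query attends only to the
    `window` most recent positions (itself included), giving O(n * window * d)
    cost instead of the O(n^2 * d) of dense attention.
    Illustrative sketch; not ELCT's actual mechanism."""
    n, d = q.shape
    out = np.zeros_like(v)
    for i in range(n):
        lo = max(0, i - window + 1)          # start of the local window
        scores = q[i] @ k[lo:i + 1].T / np.sqrt(d)
        w = np.exp(scores - scores.max())    # numerically stable softmax
        w /= w.sum()
        out[i] = w @ v[lo:i + 1]
    return out
```

With `window=1` each token attends only to itself, so the output equals `v`; widening the window trades compute for more context per token.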