Scaling Transformer to 1M tokens and beyond with RMT - Summary
The paper presents a method to extend the context length of BERT, a Transformer-based model widely used in natural language processing, by combining token-based memory storage with segment-level recurrence in a Recurrent Memory Transformer (RMT). This approach enables the model to store and retrieve task-specific information across input sequences of up to one million tokens and beyond.
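As a rough illustration of the segment-level recurrence idea, the sketch below wraps a BERT-like encoder so that a fixed set of learnable memory tokens is prepended to each segment, and the hidden states at those positions are carried forward as the memory for the next segment. The wrapper class and names (`RMTWrapper`, `num_mem_tokens`, `segment_len`) and the Hugging Face-style `inputs_embeds` interface are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of segment-level recurrence with memory tokens (assumed
# interface: a BERT-like encoder accepting `inputs_embeds` and returning
# `last_hidden_state`, as in Hugging Face Transformers).
import torch
import torch.nn as nn


class RMTWrapper(nn.Module):
    def __init__(self, encoder, hidden_size, num_mem_tokens=10):
        super().__init__()
        self.encoder = encoder  # BERT-like encoder (hypothetical wrapper target)
        self.num_mem_tokens = num_mem_tokens
        # Learnable initial memory tokens, shared across sequences.
        self.memory = nn.Parameter(torch.randn(num_mem_tokens, hidden_size) * 0.02)

    def forward(self, input_embeds, segment_len=512):
        batch_size = input_embeds.size(0)
        memory = self.memory.unsqueeze(0).expand(batch_size, -1, -1)
        outputs = []
        # Process the long input segment by segment, carrying memory forward.
        for start in range(0, input_embeds.size(1), segment_len):
            segment = input_embeds[:, start:start + segment_len]
            # Prepend the current memory tokens to the segment.
            seg_in = torch.cat([memory, segment], dim=1)
            hidden = self.encoder(inputs_embeds=seg_in).last_hidden_state
            # The updated memory is the hidden states at the memory positions.
            memory = hidden[:, :self.num_mem_tokens]
            outputs.append(hidden[:, self.num_mem_tokens:])
        return torch.cat(outputs, dim=1), memory
```

Because only the memory tokens flow between segments, each forward pass attends over a single segment plus the memory, so compute grows roughly linearly with input length while global information is propagated through the recurrent memory.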