Extending Context Window in Large Language Models with Segmented Base Adjustment for Rotary Position Embeddings
In the realm of large language models (LLMs), extending the context window for long-text processing is crucial for enhancing performance. This paper introduces SBA-RoPE (Segmented Base Adjustment for Rotary Position Embeddings), a novel approach designed to efficiently extend the context window by segmentally adjusting the base of rotary position embeddings (RoPE).
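To make the idea concrete, below is a minimal sketch of standard RoPE with a per-segment base schedule. The abstract does not specify the paper's segmentation scheme, so `segmented_inv_freq`, its `bases` and `boundaries` parameters, and the choice to segment over frequency indices are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def segmented_inv_freq(dim, bases, boundaries):
    """Hypothetical segmented base schedule: the dim/2 frequency indices
    are split into segments, each using its own RoPE base (the paper's
    exact segmentation is not given in this excerpt)."""
    idx = np.arange(0, dim, 2, dtype=np.float64)
    inv_freq = np.empty_like(idx)
    start = 0
    for base, end in zip(bases, list(boundaries) + [len(idx)]):
        # Standard RoPE frequency base**(-2i/dim), with a segment-local base.
        inv_freq[start:end] = base ** (-idx[start:end] / dim)
        start = end
    return inv_freq

def apply_rope(x, inv_freq):
    """Rotate adjacent dimension pairs of x (seq_len, dim) by
    position-dependent angles, as in standard RoPE."""
    seq_len, dim = x.shape
    ang = np.outer(np.arange(seq_len), inv_freq)  # (seq_len, dim/2)
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# Illustrative usage: low-frequency indices get a larger base, a common
# lever (cf. NTK-aware scaling) for extending the context window.
inv_freq = segmented_inv_freq(dim=8, bases=[10000.0, 500000.0], boundaries=[2])
rotated = apply_rope(np.ones((4, 8)), inv_freq)
```

With a single segment this reduces to vanilla RoPE; the rotation is norm-preserving and leaves position 0 unchanged, which makes the sketch easy to sanity-check.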