Positional Encoding Part 2 — RoPE, ALiBi, and the Quest for Length Generalization
The shift from absolute to relative positional encoding was one of the most important architectural changes in modern LLMs. This post covers T5's learned relative bias, ALiBi's linear-penalty simplicity, and RoPE's rotation-based approach: the method that won out and now powers nearly every major LLM.
