Radial Attention: O(n log n) Attention for Long Video Generation with 2–4× Speedup
lmxyy · 7/7/2025, 8:31:10 PM · hanlab.mit.edu
Comments (1)
lmxyy · 8m ago
Introducing Radial Attention, a static sparse attention mechanism with O(n log n) complexity for long video generation! Here are some key features (a rough sketch of the masking idea follows the list):
* Plug-and-play: works with pretrained models like Wan, HunyuanVideo, and Mochi
* Speeds up both training and inference by 2–4×, with no quality loss
* Compatible with pre-trained LoRAs. When applied to the 8-step FusionX LoRA, Radial Attention delivers an additional 1.6× speedup
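
Since the post itself has no code, here is a minimal, hypothetical PyTorch sketch of the general idea behind a *static* sparse attention mask whose density decays as tokens get farther apart, so the number of attended pairs grows roughly as O(n log n). The function name `radial_style_mask`, the `base_window` parameter, and the exact decay rule are illustrative assumptions, not the official Radial Attention implementation (see the linked project page for that).

```python
# Illustrative sketch only: a static sparse mask where nearby tokens attend
# densely and far-apart tokens attend with an increasingly coarse stride,
# giving roughly O(n log n) attended pairs per sequence. Window size and
# decay rule are assumptions, not the paper's exact construction.
import torch
import torch.nn.functional as F

def radial_style_mask(num_tokens: int, base_window: int = 16) -> torch.Tensor:
    """Boolean mask [n, n]: the attention stride between tokens i and j
    doubles each time their distance |i - j| doubles (illustrative)."""
    idx = torch.arange(num_tokens)
    dist = (idx[:, None] - idx[None, :]).abs()
    # "Ring" index: how many doublings of base_window the distance spans.
    ring = torch.clamp(
        torch.log2(torch.clamp(dist.float() / base_window, min=1.0)).floor(), min=0
    )
    # Stride grows with the ring index, so distant pairs are kept sparsely.
    stride = (2 ** ring).long()
    keep = (dist % stride == 0)
    keep |= dist < base_window  # always keep a dense local window
    return keep

def sparse_attention(q, k, v, mask):
    """Masked scaled-dot-product attention; q, k, v: [batch, heads, n, d]."""
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

if __name__ == "__main__":
    n, d = 256, 64
    q, k, v = (torch.randn(1, 1, n, d) for _ in range(3))
    mask = radial_style_mask(n)
    out = sparse_attention(q, k, v, mask)
    print(out.shape, f"mask density: {mask.float().mean():.3f}")
```

Because a mask like this is static (it depends only on token positions, not content), it can be precomputed once and reused across layers and diffusion steps, which is consistent with the post's claim that the speedup applies to training as well as inference.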