Blazeio.SharpEvent: A Python Async Primitive That Scales to 1M Waiters with O(1)

anonyxbiz · 5/9/2025, 5:58:02 AM
I’ve been working on a Python async library ([Blazeio](https://github.com/anonyxbiz/Blazeio)) and stumbled into a shockingly simple optimization that makes `asyncio.Event` look like a relic.

### *The Problem*

`asyncio.Event` (and similar constructs in other languages) has two nasty scaling flaws:

1. *Memory*: It allocates one future per waiter → 1M waiters ≈ 48MB wasted.
2. *Latency*: It wakes waiters one by one, O(N) wakeup operations under the GIL.
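To make the two flaws concrete, here is a minimal event class that mirrors the per-waiter-future strategy the stdlib uses (a paraphrase for illustration, not CPython's actual source):

```python
import asyncio

class NaiveEvent:
    """Illustrative paraphrase of the one-future-per-waiter strategy."""
    def __init__(self):
        self._waiters = []   # grows with every concurrent waiter -> O(N) memory
        self._value = False

    async def wait(self):
        if self._value:
            return True
        fut = asyncio.get_running_loop().create_future()  # fresh allocation per waiter
        self._waiters.append(fut)
        try:
            await fut
            return True
        finally:
            self._waiters.remove(fut)

    def set(self):
        self._value = True
        for fut in self._waiters:        # O(N) loop to wake everyone
            if not fut.done():
                fut.set_result(True)
```

Both costs fall out of the same design decision: every `wait()` call allocates its own future, so `set()` has no choice but to walk the whole list.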

### *The Fix: `SharpEvent`*

A drop-in replacement that:

- *Uses one shared future* for all waiters: *O(1) memory*.
- *Wakes every waiter in a single operation*: *O(1) latency*.
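The shared-future idea can be sketched in a few lines. This is my own minimal reconstruction of the technique, not Blazeio's actual implementation; the `SharedFutureEvent` name and `clear()` method are assumptions:

```python
import asyncio

class SharedFutureEvent:
    """Sketch of the shared-future idea: all waiters await ONE future."""
    def __init__(self):
        self._future = None   # lazily created, shared by every waiter
        self._is_set = False

    async def wait(self):
        if self._is_set:
            return True
        if self._future is None or self._future.done():
            self._future = asyncio.get_running_loop().create_future()
        # shield() so cancelling one waiter doesn't cancel the shared future
        await asyncio.shield(self._future)
        return True

    def set(self):
        # one set_result() resumes every task awaiting the shared future
        self._is_set = True
        if self._future is not None and not self._future.done():
            self._future.set_result(True)

    def clear(self):
        self._is_set = False
```

The `shield()` call matters: without it, cancelling any single waiter would cancel the shared future and break every other waiter.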

### *Benchmarks*

| Metric       | `asyncio.Event` | `SharpEvent`  |
|--------------|-----------------|---------------|
| 1K waiters   | ~1ms wakeup     | *~1µs*        |
| 1M waiters   | *Crashes*       | *Still ~1µs*  |
| Memory (1M)  | 48MB            | *48 bytes*    |
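For anyone who wants to sanity-check the `asyncio.Event` column, a rough timing harness looks like this (absolute numbers will vary by machine and Python version):

```python
import asyncio
import time

async def measure(n):
    """Time how long asyncio.Event takes to wake n concurrent waiters."""
    event = asyncio.Event()

    async def waiter():
        await event.wait()

    tasks = [asyncio.ensure_future(waiter()) for _ in range(n)]
    await asyncio.sleep(0)            # let every waiter reach event.wait()
    start = time.perf_counter()
    event.set()                       # schedules n wakeups
    await asyncio.gather(*tasks)      # wait until all waiters have resumed
    return time.perf_counter() - start

elapsed = asyncio.run(measure(1000))
print(f"1K waiters woke in {elapsed * 1e3:.2f} ms")
```

Measuring set-to-fully-resumed time (rather than just `set()` itself) captures the O(N) callback scheduling the event loop has to do.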

### *Why This Matters*

- *Real-time apps* (WebSockets, games) gain *predictable latency*.
- *High concurrency* (IoT, trading) becomes trivial.
- It's *pure Python* but beats the standard library's implementation.

### *No Downsides?*

Almost none. If you need per-waiter timeouts/cancellation, you'd need a wrapper, but 99% of uses just need bulk wakeups.
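Such a wrapper is a few lines. This is a hypothetical helper (the `wait_with_timeout` name is mine), demonstrated with `asyncio.Event` as a stand-in since it works with anything exposing an awaitable `.wait()`:

```python
import asyncio

# Hypothetical helper: layer a per-waiter timeout on top of any event
# object exposing an awaitable .wait() (shown here with asyncio.Event).
async def wait_with_timeout(event, timeout):
    try:
        await asyncio.wait_for(event.wait(), timeout)
        return True           # event fired within the deadline
    except asyncio.TimeoutError:
        return False          # this waiter gave up; other waiters are unaffected
```

`asyncio.wait_for` cancels only this waiter's `wait()` call on timeout, so a bulk-wakeup event underneath keeps serving everyone else.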

### *Try It*

```python
from Blazeio import SharpEvent

event = SharpEvent()
event.set()  # Wakes all waiters instantly
```

[GitHub](https://github.com/anonyxbiz/Blazeio)

Would love feedback, am I missing a critical use case?
