Show HN: A real-time browser game built for 100k people
RTT is under 100ms from the US west coast and around 150-200ms from the east coast. The world sim runs on a single authoritative server with basic 2D physics. Clients talk to a fleet of horizontally scalable replicas. Replicas are dumb aggregators and never simulate ahead of the primary. Each replica holds the full world sim state, and clients connect to a replica based on real-world location rather than location in the sim world, so there are no handoffs when someone moves through the world. Each replica can handle around 1k clients before the experience starts to degrade.
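Roughly, a replica looks like this (a minimal TypeScript sketch, not the actual code; the primary endpoint, the ports, and the snapshot format are all assumptions):

    // Sketch of a replica: subscribe to the primary's snapshot
    // stream, hold the latest world state, fan it out to locally
    // connected clients, and never simulate anything itself.
    import WebSocket, { WebSocketServer, RawData } from "ws";

    const PRIMARY_URL = "ws://primary.internal:9000"; // assumed endpoint
    const clients = new Set<WebSocket>();
    let latestSnapshot: RawData | null = null; // full world state from primary

    // Players land here via geo-based routing, not sim-world position.
    const server = new WebSocketServer({ port: 8080 });
    server.on("connection", (ws) => {
      clients.add(ws);
      if (latestSnapshot) ws.send(latestSnapshot); // catch new clients up
      ws.on("close", () => clients.delete(ws));
    });

    // One upstream connection to the authoritative primary.
    const upstream = new WebSocket(PRIMARY_URL);
    upstream.on("message", (data) => {
      latestSnapshot = data; // aggregate only, no simulation
      for (const ws of clients) {
        if (ws.readyState === WebSocket.OPEN) ws.send(data);
      }
    });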
The client simulates ahead of the primary to minimize perceived latency. I found that running two simulations and blending them together created the best experience. Desync still happens, but the game's design softens its impact: the dueling mechanic, for example, works on accumulated collision time, so desync rarely produces a noticeable change in outcome.
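The blend is conceptually something like this (hypothetical sketch; the real blending is more involved, and all names here are mine):

    // Two client sims: one predicted ahead of the primary, one
    // pinned to the last authoritative snapshot. Rendering
    // interpolates between them so corrections ease in instead
    // of snapping.
    interface Vec2 { x: number; y: number; }

    function lerp(a: Vec2, b: Vec2, t: number): Vec2 {
      return { x: a.x + (b.x - a.x) * t, y: a.y + (b.y - a.y) * t };
    }

    // blend in [0, 1]: 0 = trust the prediction, 1 = trust authority.
    function renderPosition(predicted: Vec2, authoritative: Vec2, blend: number): Vec2 {
      return lerp(predicted, authoritative, blend);
    }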
Duels (also called minigames) are hosted on a separate fleet of machines. The minigame UI is rendered over the world view, so you never feel like you've left, even though each minigame runs as an isolated instance.
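Conceptually, entering a duel keeps the world connection alive and just layers a second one on top (illustrative only; the host name and overlay element id are made up):

    // Layer a connection to an isolated minigame instance over the
    // still-open world socket.
    function enterDuel(minigameHost: string): WebSocket {
      const duel = new WebSocket(`wss://${minigameHost}/duel`);
      const overlay = document.getElementById("minigame-overlay")!;
      overlay.hidden = false; // world keeps rendering underneath
      duel.addEventListener("close", () => { overlay.hidden = true; });
      return duel;
    }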
While I haven't run a full-scale test with 100k real users, each individual system has been stressed beyond its required load.
The core simulation can handle 100k bots at a fairly smooth 50 Hz. Timings below are in ms, measured on an EC2 c7a.4xlarge box:

    Phase  Mean    SD     Max
    Wall   19.999  0.575  22.919
    Sim     3.810  0.387   7.428
    Emit    1.028  0.368   4.174
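For context on those three numbers, here are the measurement points as a simplified sketch (assuming a fixed 20ms tick where Wall includes the sleep to the next boundary, which is why Wall's mean sits at ~20ms; not the actual server loop):

    import { performance } from "node:perf_hooks";

    const TICK_MS = 20; // 50 Hz

    async function tickLoop(step: (dtMs: number) => void, emit: () => void) {
      for (;;) {
        const start = performance.now();
        step(TICK_MS);                       // Sim
        const afterSim = performance.now();
        emit();                              // Emit
        const afterEmit = performance.now();
        const busy = performance.now() - start;
        if (busy < TICK_MS) {
          // Sleep out the remainder so Wall lands near 20 ms.
          await new Promise((r) => setTimeout(r, TICK_MS - busy));
        }
        const wall = performance.now() - start; // Wall
        console.log(`sim=${(afterSim - start).toFixed(3)} ` +
                    `emit=${(afterEmit - afterSim).toFixed(3)} ` +
                    `wall=${wall.toFixed(3)}`);
      }
    }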
These timings do not change much as I add more replicas, because replica connections are handled uniformly and off-thread.
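"Off-thread" here means something like the following (simplified sketch; the worker file and message shape are assumptions):

    // The sim thread hands each serialized snapshot to a worker that
    // owns the replica sockets, so socket I/O never eats into the
    // 20 ms tick budget.
    import { Worker } from "node:worker_threads";

    const replicaIo = new Worker("./replica-io.js"); // assumed file

    function emit(snapshot: Uint8Array) {
      // Transfer the buffer instead of copying it on the hot path.
      replicaIo.postMessage(snapshot, [snapshot.buffer]);
    }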