Show HN: Debunking Election Fraud Claims – Interactive Data Viz and Simulations

Hi HN!

I built this after seeing several references to the Election Truth Alliance on social media. After reading their analysis, I just couldn't get the problems I saw in it out of my head.

So I downloaded the data and rebuilt their full analysis from scratch.

Their critical error is a simple misunderstanding of the Law of Large Numbers: the average of values collected in a large sample converges to the true underlying probability, which means larger samples naturally show less spread.

(Not to be confused with the Law of Very Large Numbers, which states that unlikely things happen given enough time. That confused me too.)
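
To make the Law of Large Numbers point concrete, here's a minimal coin-flip sketch in Python, standard library only. The vote share and precinct sizes below are made up for illustration, not taken from the real data:

    # Minimal sketch (not the site's code): the Law of Large Numbers with
    # coin flips. The "true" share and precinct sizes are hypothetical.
    import random
    from statistics import pstdev

    random.seed(42)
    TRUE_SHARE = 0.55  # hypothetical true support for a candidate

    def observed_share(n_ballots: int) -> float:
        """Vote share observed in one simulated precinct of n_ballots ballots."""
        votes = sum(1 for _ in range(n_ballots) if random.random() < TRUE_SHARE)
        return votes / n_ballots

    for size in (100, 1_000, 10_000):
        shares = [observed_share(size) for _ in range(200)]
        print(f"precinct size {size:>6}: spread of observed shares = {pstdev(shares):.4f}")

    # The spread shrinks as precinct size grows: big samples cluster tightly
    # around the true share. That clustering is expected, not suspicious.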

Technical Details:

- No build system; this is entirely handmade HTML, CSS, and plain JavaScript.

- Initial analysis done in Python using only the standard library.

- Visualizations created in Observable Plot and D3.js.

- Simulations run entirely client-side.

- Web page built with Scrollama for animations and behavior controls.

- Vote history visualizations process ~600k individual ballot records in real time, with a little bit of caching to keep your browser from chugging.
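
The caching in the live page is plain JavaScript running in the browser, but the idea is just memoization of expensive aggregations over the ballot records. Here's a rough Python sketch of that idea, with hypothetical record fields and function names:

    # Rough sketch of the caching idea (the real site does this in browser
    # JavaScript; the fields and names here are hypothetical). Aggregations
    # over the ballot records are computed once per key and then reused by
    # any visualization that asks for the same key.
    from functools import lru_cache

    BALLOTS = [
        {"precinct": "P-001", "method": "early", "choice": "A"},
        {"precinct": "P-001", "method": "election_day", "choice": "B"},
        {"precinct": "P-002", "method": "early", "choice": "A"},
    ]

    @lru_cache(maxsize=None)
    def share_for(precinct: str, choice: str) -> float:
        """Expensive pass over the ballots, computed once per key and cached."""
        rows = [b for b in BALLOTS if b["precinct"] == precinct]
        if not rows:
            return 0.0
        return sum(1 for b in rows if b["choice"] == choice) / len(rows)

    print(share_for("P-001", "A"))  # first call scans the records
    print(share_for("P-001", "A"))  # second call is served from the cache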

Interesting Challenges:

- Making the visualizations performant without a backend. This is accomplished with a bit of preloading as you scroll and some reusable caching, so the visualizations can share resources whenever possible.

- Windsurf does run wild sometimes. During the initial preprocessing stage, it at one point dumped an absolutely massive JSON blob to disk; it was so large it actually crashed my whole computer during the write. Then, since the blob obviously couldn't just be read back in whole, my Opus 4-powered coding agent decided to build a streaming JSON parser from scratch rather than store the data in a saner format. It worked, and I got the data I needed, so I didn't go back and make it more sensible, but man, that was dumb.
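
For the curious, the saner format I had in mind was something like JSON Lines: one record per line, which can be written and streamed back with only the Python standard library, no custom parser required. A rough sketch (not the actual preprocessing code, field names hypothetical):

    # Rough sketch: write records as JSON Lines (one JSON object per line) so
    # they can be read back incrementally instead of loading one giant blob
    # or hand-rolling a streaming parser. Field names are hypothetical.
    import json

    def write_records(path, records):
        with open(path, "w", encoding="utf-8") as f:
            for record in records:
                f.write(json.dumps(record) + "\n")

    def read_records(path):
        with open(path, "r", encoding="utf-8") as f:
            for line in f:
                yield json.loads(line)

    write_records("ballots.jsonl", [
        {"precinct": "P-001", "method": "early", "choice": "A"},
        {"precinct": "P-002", "method": "election_day", "choice": "B"},
    ])

    # Streams one record at a time; memory stays flat regardless of file size.
    for rec in read_records("ballots.jsonl"):
        print(rec["precinct"], rec["choice"])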

This actually started with the simulation, which took only about a day of work, and later grew to include the re-analysis and visualizations. The visualizations were all done within 2-3 days after I got the data.

If I did it over again, I'd probably look for a build system or static site generator to compose the final result. Once the page got very long, it became quite unwieldy even for Windsurf: very short conversations could hit Sonnet 4's rate limit because there was just so much stuff in a single file.
