I’ve seen many benchmarks on OLAP performance, but I wanted to better understand the practical impact for myself, especially for LLM applications. This is my first attempt at building a benchmarking tool to explore that.
It runs some simple analytical queries against ClickHouse, Postgres, and Postgres with indexes. To make the results more tangible than just a chart of timings, I added a "latency simulator" that visualizes how the query delay would actually feel in a chat UI.
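To give a sense of the shape of the comparison, here is a rough sketch of the kind of timing loop involved. This is not the actual code from the project; the table and column names, connection strings, and the psycopg2 / clickhouse_connect clients are just assumptions for illustration:

```python
# Rough sketch only -- the schema (events, user_id, amount) and DSNs are made up.
import time
import psycopg2
import clickhouse_connect

QUERY = """
    SELECT user_id, count(*) AS n, sum(amount) AS total
    FROM events
    GROUP BY user_id
    ORDER BY total DESC
    LIMIT 10
"""

def time_postgres(dsn: str) -> float:
    # Time a single analytical query round-trip against Postgres.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        start = time.perf_counter()
        cur.execute(QUERY)
        cur.fetchall()
        return time.perf_counter() - start

def time_clickhouse(host: str) -> float:
    # Same query, same timing, against ClickHouse.
    client = clickhouse_connect.get_client(host=host)
    start = time.perf_counter()
    client.query(QUERY)
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"Postgres:   {time_postgres('dbname=bench user=postgres'):.3f}s")
    print(f"ClickHouse: {time_clickhouse('localhost'):.3f}s")
```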
With a 10M-row dataset, ClickHouse answers these queries in under a second, while Postgres takes multiple seconds.
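And here is a toy version of the latency simulator idea, just to show what "feeling" the delay means. The timings and chat text are placeholders loosely based on the numbers above, not output from the tool:

```python
# Toy illustration: wait out a measured query latency, then stream a canned
# answer, so the delay is something you experience rather than a number.
import sys
import time

def simulate_chat_reply(query_latency_s: float, answer: str) -> None:
    print("user> top customers this month?")
    time.sleep(query_latency_s)      # the user watches a spinner for this long
    for token in answer.split():     # then the reply streams in word by word
        sys.stdout.write(token + " ")
        sys.stdout.flush()
        time.sleep(0.03)
    print()

simulate_chat_reply(0.4, "Here are the top 10 customers by total spend ...")  # ClickHouse-ish
simulate_chat_reply(4.0, "Here are the top 10 customers by total spend ...")  # Postgres-ish
```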
This is definitely a learning project for me, not a comprehensive benchmark. The data is synthetic and the setup is simple. The main goal was to create a visual demonstration of how backend latency translates to user-perceived latency. Feedback and suggestions are very welcome.