Toys/Lag: Jerk Monitor

40 points · ptramo · 7/6/2025, 2:33:34 PM · nothing.pcarrier.com

Comments (35)

sillysaurusx · 5h ago
I can buy that humans can see at least 120Hz. 60Hz is the generally accepted threshold, but I’ve long suspected that 120Hz has effects that are mostly imperceptible but still occasionally noticeable.

I can’t buy this:

> I've also learnt I do benefit from the 8 kHz setting of my mouse, as even at 3200 DPI with fast & smooth motion, some frames still miss a pointer update

It may be true that pointer updates were being missed. But does that really affect anything?

It turns out that there’s a way to test this experimentally. Do a double blind experiment, just like in science. If you can tell which monitor is 240hz more than randomly, then it matters. Ditto for the pointer updates.

The corollary is that if you can’t tell with better than random chance, then none of this matters, no matter how much you think it does.
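Scoring such a blind test is straightforward: count correct calls over randomized trials and compute a one-sided binomial tail. A minimal sketch (the trial count and example score are arbitrary choices, not from any real experiment):

```python
from math import comb

def blind_test_p_value(correct: int, trials: int) -> float:
    """Chance of getting at least `correct` right out of `trials`
    by pure luck (one-sided binomial tail, p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# e.g. 15 correct calls out of 20 randomized trials
print(f"p = {blind_test_p_value(15, 20):.4f}")  # ~0.0207, unlikely to be guessing
```

Anything comfortably below 0.05 over enough trials suggests the subject really can tell the displays apart.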

Experiments like this have decisively settled the “Does higher sampling rate matter when listening to music?” debate, among other questions. People still swear that they can tell that there’s a difference, but it’s expectation bias. They’re mistaken.

(10ms drops every few seconds would definitely be noticeable though; that wasn’t the point.)

haiku2077 · 5h ago
> I can buy that humans can see at least 120Hz. 60Hz is the generally accepted threshold, but I’ve long suspected that 120Hz has effects that are mostly imperceptible but still occasionally noticeable.

There are videos on YouTube showing people perceiving differences at much higher framerates, e.g. https://www.youtube.com/watch?v=OX31kZbAXsA (long video, so you can skip to the end: they found that even casual players performed measurably more consistently at 240Hz than at 144Hz).

Anecdotally, I recently switched to playing racing games at 165FPS and the difference is massive!

ptramo · 5h ago
As per the post, I wrote this tool to confirm I was getting jerks of ~10ms every few seconds on one USB port and not the other. This would _suggest_ I can catch differences around the ballpark of 100 Hz.

I'm game for a randomized blinded test on 120 Hz refresh rate vs 240 Hz refresh rate. I would indeed be very curious to confirm I can tell the difference with a proper protocol.

Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.

Turns out he only needed to watch the pointer through one push of the mouse to tell right away, succeeding 100% of the time in a blinded experiment.

amluto · 3h ago
> Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.

That’s a silly experiment. I could look at a CRT with a completely static image and tell almost immediately whether it was at 60Hz, 90Hz or 120Hz. Flicker at 60Hz was awful, at 90Hz clearly perceptible, and even at 120Hz often somewhat noticeable. And most CRT/graphics card combos would become perceptibly blurry in the horizontal direction at 120Hz at any reasonable desktop resolution, so you could never truly win. Interlaced modes made the flicker much less visible, but the crawling effect was easy to see and distracting.

andai · 2h ago
The effects you describe are specific to CRTs only, right? Caused by the electron beam effectively illuminating one pixel at a time?
ptramo · 5h ago
As to how you can perceive the difference between 120 events per second and 240, I have what I hope is a fairly simple explanation.

It's like lightning strokes of tens of microseconds making a lasting impression on your perception of the scene. You don't "count" strokes over time, but in space.

When you make circles fast and large enough on screen, you can evaluate the number of cursors that appear before your eyes. At 4 circles per second, is each circle made of ~60 pointers or ~30? Belief not fact: it's not hard to guess.
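The arithmetic behind this: with one cursor image drawn per refresh, the count per circle is just the refresh rate divided by the circling rate. A quick check of the numbers above:

```python
def cursors_per_circle(refresh_hz: float, circles_per_sec: float) -> float:
    """Cursor images laid out along one loop, one image per refresh."""
    return refresh_hz / circles_per_sec

for hz in (120, 240):
    n = cursors_per_circle(hz, 4)
    print(f"{hz} Hz at 4 circles/s -> {n:.0f} images, {360 / n:.1f} deg apart")
```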

sillysaurusx · 1h ago
I was going to reply with this:

“If anyone wants to implement this, I think the way to do it is to put the mouse cursor randomly on the edge of a circle whose radius is a few hundred pixels. The randomness is important, though I’m not sure it would be possible to count how many cursors there are.”

And then I realized that doesn’t work, for a few reasons.

One is that you won’t be able to count how many cursors appear during one second. It’ll all look like a jumble.

That leads to the argument that you should place the cursors at a consistent spacing, and the spacing needs to make it so that the cursors stay at the same spot on the screen each loop around the circle.

Unfortunately that doesn’t work either, because you’ll end up seeing a trail of cursors going around a circle once per second, and counting the cursors is hopeless.

So I think you’d need to make a list of the spots on the circle where the cursors should go, then randomly select from them as quickly as possible. That will let each cursor be perceptible because they’ll be spread out over time; the next cursor won’t be just one pixel apart, so this eliminates the “trail of cursors” problem.
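A sketch of that placement scheme, assuming we only need the list of positions to visit (the function name and parameters are illustrative, not from any real implementation):

```python
import math
import random

def cursor_positions(n_spots: int, radius: float, cx: float, cy: float):
    """Evenly spaced spots on a circle, visited in random order so
    consecutive cursor positions land far apart instead of forming a trail."""
    spots = [(cx + radius * math.cos(2 * math.pi * k / n_spots),
              cy + radius * math.sin(2 * math.pi * k / n_spots))
             for k in range(n_spots)]
    random.shuffle(spots)
    return spots

# 60 fixed spots on a 300 px circle, in randomized visiting order
spots = cursor_positions(60, 300.0, 400.0, 400.0)
```

Because the spots themselves are fixed, the cursor images land on the same screen locations every pass; only the visiting order is randomized.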

I’m still a bit skeptical this could work, but I admit I can’t think of a reason it wouldn’t. You’ll need to be careful, because it’s really easy to fool yourself that you’ve done it correctly when you haven’t.

It would be interesting to make a WebGL canvas and try this out for real. Or maybe just reposition the mouse cursor with Python instead of doing anything graphical.

It seems important to reposition the mouse cursor rather than use WebGL to draw frames, but I think both could work. Actually, the WebGL route would be more faithful to the question of whether gamers specifically can notice 240Hz; there are all kinds of reasons why repositioning the mouse cursor wouldn’t really tell you that. Vice-versa too, because it might be possible to notice when repositioning cursors but not when using WebGL, though I can’t think of why that would be the case.

Neat idea. Thanks.

modeless · 27m ago
How much latency you can perceive greatly depends on the context. But in the right context humans can perceive display latency down close to 1 ms as demonstrated by Microsoft Research many years ago. There is no excuse to be skeptical about that. https://www.youtube.com/watch?v=vOvQCPLkPt4
amluto · 2h ago
The author didn’t say that they have a use for those 3200 updates per second other than as a workaround for some other issue. With a competently composited desktop and applications that pace input processing and frame generation well, and ignoring pointer acceleration, one correctly timed update per frame is enough. (As far as I know this does not exist from any vendor on a modern system other than for games, although really old Apple II-era software often got it right.) For acceleration, some pointer history is needed. And no mouse exposes an API that allows the host to pace the updates.

Presumably the 3200 Hz is needed for a combination of reasons:

- Under ideal conditions, if you want less than 10% variation in the number of samples per frame at 240Hz, you may need ~2400Hz. This effect is visible even to human eyeballs: you can see multiple cursor images across your field of view, and uneven spacing is noticeable.

- The mouse itself may work less well at a lower sampling rate.

- The OS and input stack may be poorly designed and work better at higher rates.

In any case, the application and cursor implementation are unlikely to ask for a mouse location more than once per frame, so the user is not really using 3200 updates per second, but that’s irrelevant.
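The first point's arithmetic, as a sketch: with polling at P Hz and frames at F Hz, each frame receives P/F samples give or take one, so the relative jitter is roughly F/P, and keeping it under 10% at 240 Hz needs roughly 2400 Hz:

```python
def min_poll_rate_hz(frame_hz: float, max_rel_variation: float) -> float:
    """Polling rate keeping the +/- 1 sample-per-frame jitter below
    `max_rel_variation` of the average samples-per-frame count."""
    return frame_hz / max_rel_variation

print(min_poll_rate_hz(240, 0.10))  # 2400.0 Hz
```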

ptramo · 1h ago
First, these are settings, not measured numbers. I'm not claiming that's how the mice actually perform, only how I tell them to perform.

Second, 3200 was DPI, not Hz. I can trivially tell how much I have to move at 3200 DPI (my sweet spot with two 4K monitors) versus 4800 or 6400 DPI.

For Hz, that was the polling rate. Even with a configured 8000 Hz polling rate, which is a peak figure rather than a guarantee, I still see stalls in the 4ms range with my hardware.

As to acceleration, I disable it. To truly eliminate it at high DPIs I've had to install RawAccel on Microsoft Windows.
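To make the DPI-versus-Hz distinction concrete, a rough sketch of how the two interact (the 10 in/s flick speed is an assumed figure, not from the post):

```python
def counts_per_report(dpi: float, speed_in_per_s: float, poll_hz: float) -> float:
    """Sensor counts accumulated between successive USB reports:
    DPI sets motion resolution, polling rate sets report granularity."""
    return dpi * speed_in_per_s / poll_hz

# a fast 10 in/s flick at 3200 DPI
print(counts_per_report(3200, 10, 1000))  # 32.0 counts per report at 1 kHz
print(counts_per_report(3200, 10, 8000))  # 4.0 counts per report at 8 kHz
```

Higher DPI changes how far the pointer travels per count; a higher polling rate only changes how finely that travel is chopped up in time.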

tverbeure · 5h ago
Higher refresh rates don't have to be perceptible to be useful: they can shift the balance in head-to-head gaming.

Imagine 2 identical gaming setups with 2 players that have the same skill set. In an FPS game, you'd expect each of those players to win 50% of the games.

Now switch one monitor from 120Hz to 240Hz. The player on the 240Hz monitor will see their adversary up to ~4ms earlier (about 2ms earlier on average) than the player on the 120Hz monitor and thus be able to push the mouse button earlier too.
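A back-of-envelope on that figure, assuming the adversary appears at a uniformly random moment within a refresh interval: the mean wait before an event reaches the screen is half the frame period, so the average head start works out to about 2 ms, with roughly 4 ms as the worst case:

```python
def mean_display_delay_ms(refresh_hz: float) -> float:
    """Average wait before an event becomes visible, assuming it lands
    uniformly at random within one refresh interval."""
    return 1000.0 / refresh_hz / 2.0

advantage = mean_display_delay_ms(120) - mean_display_delay_ms(240)
print(f"{advantage:.2f} ms average head start")  # ~2.08 ms
```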

brookst · 2h ago
Do any competitive FPS games actually render 240 different frames in a second? Because if both players’ hardware is doing 60FPS, that monitor difference changes nothing.
haiku2077 · 1h ago
Yes, most will go up past 500FPS, even old ones like Quake and classic Counter Strike.

https://youtu.be/nqa7QVwfu7s

timewizard · 1h ago
More directly: if the game engine only updates player state 60 times per second (the tick rate), then is this 4ms advantage actually present in the 240Hz case?

Further if your network has more than 4ms of jitter then I don't think you can make any concrete claim in either direction.

dubbie99 · 49s ago
My theory about why a higher frame rate helps gamers: for something like a whip turn at a low frame rate, your brain has to take a brief moment to work out where it ended up looking after the pan. But if the frame rate is high enough, your brain can keep updating its state during the pan, because the updates are continuous enough not to lose “state” along the way. This means when you finish the fast move, there is no delay while you reorient yourself for a few milliseconds.
leni536 · 43m ago
There is still, theoretically, an edge: the same frames can be presented statistically earlier to the player with the higher refresh rate display.

nkrisc · 4h ago
I think this sort of effect is what makes people think they can tell the difference: they notice the indirect side-effects that correlate with it.

A pro FPS player might notice that they lose contests peeking around corners more often. Obviously network latency in online games will be a factor as well, but since it likely averages out for both players over time, I would guess you can mostly discount it, along with alternating who’s doing the peeking.

I don’t think anyone could look at a scene on a 120hz vs 240hz display and tell the difference, there needs to be some indirect clue.

sjoedev · 3h ago
I play video games at a decently high level (like top ~10% in a few competitive games). To support what you’re saying, I can tell the difference between 144hz and 240hz if I’m in control. For example, if I can shake the screen around.

If I’m just watching, I’m not sure I could even tell the difference between 60hz and 144hz.

tofof · 3h ago
So tired of defending against this same old, completely wrong intuition, especially from people who say "do the science" to justify their ignorance instead of looking for themselves. The science has already been done, and it's coming up on a full century old.

From this one paper alone, humans can perceive information from a single frame at 2000 Hz.

https://doi.org/10.1080/00223980.1945.9917254

Humans can read and immediately reproduce a 5-digit number displayed for a single frame at 400 fps. That is a single exposure; it is not a looping presentation relying on persistence of vision or anything like that. 7-digit numbers required the frame rate to drop to 333 fps. Another student reproduced a 9-digit number from a single frame at 300 fps. Those were the average results. The record was a correct reproduction of a 7-digit number from a single viewing of a single frame at 2000 Hz, which was the limit, within 2% accuracy, of the tachistoscopic equipment in question. From the progression of the students chasing records, no plateau was ever in sight. The author's later papers involve considerable engineering difficulty in constructing an even faster tachistoscope, and are limited by 1930s-1940s technology.

This research led the US Navy in WW2 to adopt tachistoscopic training methods for aircraft recognition, replacing the WEFT paradigm (which had approximately a 0% success rate) with a one-frame-at-75-fps paradigm. After just 50 sessions, 95% of cadets reached 80% accuracy on recognition, and 100% of cadets reached 62.5% accuracy.
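For a sense of scale, the exposure durations implied by those frame rates:

```python
def exposure_ms(fps: float) -> float:
    """Duration of a single frame at the given frame rate."""
    return 1000.0 / fps

for fps in (75, 300, 400, 2000):
    print(f"1 frame at {fps} fps = {exposure_ms(fps):.2f} ms")
```

A single frame at 2000 fps is a half-millisecond flash, yet per the paper it carried a reproducible 7-digit number.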

Yes, humans can see 2000 fps. Yes, humans can see well beyond 2000 fps in later work from this researcher.

https://doi.org/10.1080/00223980.1945.9917254

Yes, humans can detect flicker well above 1000 fps in daily life at the periphery of vision with cone cells, as cone cells can fire from a single photon of light and our edge detection circuits operate at a far higher frequency than our luminance and flicker-fusion circuits. Flicker has been discriminated from steady light at an average of 2 kHz for 40-degree saccades, with an upper limit above 5 kHz during 20-degree saccades, which are much more typical for eyes on a computer monitor.

There is no known upper limit to the frequency of human vision that is detectable. As far as I know, all studies (such as this one I link) have always been able to measure up to the reliable detection limit of their equipment, never up to a human limit.

chrisweekly · 2h ago
Best comment in the threads, IMHO. Thanks for the details.
gblargg · 4h ago
There could be side-effects of these higher rates that are noticeable, or bugs in handling different rates.
tuatoru · 4h ago
> Experiments like this have decisively settled the “Does higher sampling rate matter when listening to music?” debate...

Not really relevant. Music is experienced after a Fourier transform, in frequency space.

The more telling example is that experienced drummers get frustrated by lag of 2 ms from computer-generated effects. That's 500 Hz.
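The equivalence used here, made explicit: a 2 ms lag is the period of a 500 Hz update rate.

```python
def lag_to_rate_hz(lag_ms: float) -> float:
    """Update rate whose period equals the given lag."""
    return 1000.0 / lag_ms

print(lag_to_rate_hz(2))  # 500.0 Hz
```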

jerlam · 2h ago
I bought a Logitech wireless mouse called the Marathon which boasted an amazing three-year battery life on two AAs. I initially thought it was broken; it had a maddening delay where the sensors turned off after a short idle time, so when I wanted to use the mouse, it didn't register the first movements since it had to "wake up".

This delay wasn't present on the Logitech gaming mouse I previously used, probably a combination of a high polling rate (500Hz) and a much longer idle delay. The battery life was also much shorter, only 250 hours on high-performance mode, but I just recharged a set of AA batteries every week so it was never an issue.

I ended up returning the Marathon mouse.

jonathanlydall · 2h ago
I’m happy with my $5 wired Logitech mouse: it has essentially zero lag, never runs out of batteries, and unlike the high-end Logitech mice has no “rubber” coating, which invariably goes icky over time.

At one point I had a Razer wireless mouse (Mamba I think?) which had no discernible latency and a nice dock for recharging the mouse, I was very happy with it until one evening it just stopped working. While alone in my flat, I stepped away from using my computer for about an hour, didn’t even put it to sleep, came back, and it would no longer move but would still register mouse clicks. I tried contacting customer support asking if there was a way to reset it or reflash the firmware or something and they’re just like “nope”. Last piece of Razer hardware I ever bought.

ptramo · 7h ago
With 240 Hz displays you probably want your mouse polling setting at 4000 Hz or, better, 8000 Hz. This tool lets anyone confirm that on their hardware.
BearOso · 7h ago
That's a recent invention. Only the latest gaming mice can poll at that rate, and not particularly well over USB 2. They're usually limited to 1000 Hz.

Neat tool, though. I'm also very sensitive towards latency.

naoru · 6h ago
Seems like it doesn't properly handle mouse events on Safari in macOS and only shows "frames with no pointer events". I assume it's because "pointerrawupdate" event is not supported there.

Also it's interesting that with ProMotion enabled it reports 16.67ms per frame (indicating 60Hz redraw rate) in Safari, but in Chrome it's 8.33.

ptramo · 6h ago
Yes, I rely on pointerrawupdate. Thanks for letting me know! Unfortunately pointermove is typically synced with graphics in my limited experience, and I think I'd rather not show anything than provide wildly inaccurate numbers.
naoru · 6h ago
Oh and you also might be interested in this one too: https://github.com/cakama3a/Polling

Although it's for gamepads, it's pretty much indispensable in debugging gamepad-related latency issues. For example, I found that my presumably 1000Hz controller can do only 500Hz in ideal conditions and it starts to drop at a much lower distance from the computer than advertised. Neat stuff.

daft_pink · 6h ago
I found that plugging my keyboards directly into my Mac’s limited USB ports is noticeably faster.

I’m curious whether there is a higher-quality USB hub I could buy, as my Mac doesn’t have much I/O.

lostlogin · 3h ago
‘USB hub’ and ‘quality’ never go together.

I’d love to be wrong on this but haven’t been so far.

daft_pink · 1h ago
Buying a $160 thunderbolt 5 hub for my keyboard? Lol
xnx · 5h ago
Does this tool measure something different than this one? https://joltfly.com/mouse-latency-test/
ptramo · 5h ago
Yes, pointerrawupdate events for this one, mousemove for the one you linked. The latter tends to sync to the display in my very limited experience.

There are other differences in the tools, mine was designed for what I wanted to understand so I'm biased toward it.

rufus_foreman · 2h ago
Sometimes I think "Uncontrollable Urge" is the greatest Devo song ever. Other times, I think maybe "Jerk Monitor".