The sad state of font rendering on Linux

31 points by harporoeder | 7/25/2025, 5:14:51 PM | pandasauce.org

Comments (29)

scblock · 28m ago
Windows has the worst font rendering of all modern operating systems. Wanting anything like Windows font rendering is insane. Windows 10 makes it near impossible to properly turn off subpixel hinting without also turning off all anti-aliasing, which on a QD-OLED screen makes for horrific color fringing. Windows 11 is better, but still pretty weak. Linux is roughly as good as Mac OS, both of which are miles better than Windows.

Mac OS dropped the subpixel garbage (it really is garbage if you're at all sensitive to fringing or use anything other than a standard LCD) in favor of high pixel density screens. Sharp, readable text and zero color fringing. This is the way.

not_a_bot_4sho · 1m ago
I suppose this is a subjective area. I would rank Windows on top, Mac as a close second, and Linux ... well, I love Linux for reasons other than UI.
forrestthewoods · 1m ago
> in favor of high pixel density screens

I wish I could download an OS update that gave me a high pixel density screen! But, uhhh, that’s not how it works.

Night_Thastus · 27m ago
With OLEDs with funky pixel layouts starting to become more popular, I hope Windows starts making their system less crap...
dartharva · 7m ago
Heavily disagree as a longtime Linux user. I don't know about MacOS but Windows has always had better font rendering than Linux in my experience.
jchw · 35m ago
To me, one of the most influential pieces of writing about subpixel rendering, and in particular an exploration of the ways Microsoft got it wrong, is by the late developer of Anti-Grain Geometry, Maxim Shemanarev (R.I.P.):

https://agg.sourceforge.net/antigrain.com/research/font_rast...

Though to be fair to this article, Microsoft did improve things with DirectWrite, and yes the situation on Linux is quite bad unfortunately.

As a bonus, here's a pretty great article on gamma correctness in font rendering, an issue that is often glossed over even when it is acknowledged:

https://hikogui.org/2022/10/24/the-trouble-with-anti-aliasin...

Just some additional reading materials if you're interested in this sort of thing.

bradfitz · 5m ago
I haven't used displays under ~215ppi in over 10 years. I find it quaint that these subpixel rendering debates are still ongoing. :)
RGBCube · 42s ago
@dang, post is from 2018, so adding (2018) to the title may help as the current state of font rendering on Linux is pretty fine.
ranger207 · 6m ago
I think Linux font rendering looks fine (although it has noticeably gotten better since this post was last updated in 2019), but I absolutely agree that MacOS has the worst-looking font rendering. And I was using it on a genuine MacBook Pro! Discussions otherwise have convinced me that apparently font rendering just isn't objective but is a matter of opinion.
neoden · 33m ago
The article is from 2018 and that should be mentioned in the title
mushufasa · 25m ago
I would love to see an update on what has improved and what is the same
jeffbee · 7m ago
Yeah, even the flag they are talking about doesn't exist in Chrome anymore. Skia is the only text rendering I ever suffer from under Linux, so whether or not Skia works properly is the only thing that makes a difference to me.
kccqzy · 27m ago
Yes definitely. I stopped reading after the OS X section because it was clearly talking about a different era.
zekica · 36m ago
Subpixel rendering works completely fine on Linux. I'm using it right now, using "full" hinting and "RGB" subpixel rendering. It even works completely fine with "non-integer" scaling in KDE, even in firefox when "widget.wayland.fractional-scale.enabled" is enabled.
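[Ed.: for readers who want to try the settings this commenter describes, a fontconfig fragment like the following is one way to request them system-wide. This is a sketch, not the commenter's actual config; the file path `~/.config/fontconfig/fonts.conf` and the specific constants are standard fontconfig, but check your distro's defaults before overriding them.]

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Apply to all fonts: antialiasing on, full hinting, RGB subpixel order -->
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
    <edit name="rgba" mode="assign"><const>rgb</const></edit>
  </match>
</fontconfig>
```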
RGBCube · 5m ago
Almost MacOS-tier font rendering, for free:

    FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"
Probably only good on high-DPI monitors, though.
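[Ed.: to make this tweak persist across sessions, export it from a login-shell profile. A minimal sketch, assuming your display manager sources `~/.profile`; some distros use `/etc/environment` or a systemd user environment instead.]

```shell
# Disable FreeType's stem-darkening suppression (i.e. re-enable
# stem darkening) for both the CFF driver and the autofitter:
export FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"

# Sanity check that the variable reached the environment:
echo "$FREETYPE_PROPERTIES"
```

Applications pick the variable up at startup, so log out and back in (or restart the app) after setting it.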
initramfs · 4m ago
This is excellent. I know what they mean about Windows, because not all Linux distros support ClearType-like font rendering out of the box.
bee_rider · 11m ago
What’s wrong with the v35 freetype picture? He writes like it is immediately obvious, but it seems fine.
jchw · 5m ago
See the jump from 17pt to 18pt? That's wrong. (Also, the small sizes are just completely obliterated IMO.) Font outlines are scalable; they should have the same relative weight no matter what pt/px size you render them at, and they should have the same proportions. Non-scalable rendering is incorrect (although techniques like hinting and gridfitting do intentionally sacrifice scalability for better legibility, but I argue you can do better in most cases.)
bee_rider · 2m ago
Who cares? That only matters if you have a bizarre document that is incrementing through all the font sizes.
necovek · 42m ago
One thing that used to be possible with Freetype was configuring how "heavy" hinting was: I remember the time when autohinted fonts looked the best with "light" hinting. They were smooth, non-bold and I couldn't see colour fringing either.

You could also set the RGBA vs whatever pixel layout in the same Gnome settings dialog. Easy-peasy adjustment for a VA panel.

Later, it was available only in gconf/dconf or a tool like gnome-tweaks.
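[Ed.: on a current GNOME, the knobs this commenter mentions live in dconf and can be set from the command line. A sketch, assuming the `org.gnome.desktop.interface` schema on a recent GNOME release; older versions kept these keys elsewhere.]

```shell
# "Light" hinting, as the commenter preferred:
gsettings set org.gnome.desktop.interface font-hinting 'slight'

# Subpixel antialiasing with an explicit RGB pixel order:
gsettings set org.gnome.desktop.interface font-antialiasing 'rgba'
gsettings set org.gnome.desktop.interface font-rgba-order 'rgb'

# Read one back to confirm the change took effect:
gsettings get org.gnome.desktop.interface font-hinting
```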

MacOS is definitely terrible today, but I prefer Linux over Windows still.

webdevver · 40m ago
The art of drawing pixels generally appears to elude free software; it's always kind of sucked. If you're talking about compute shaders, it's OK, but the moment it hits the screen, ouch!
gmueckl · 25m ago
Rendering can become extremely nuanced and finicky. Sometimes it can be solved with better, harder to implement algorithms and sometimes it requires tradeoffs that lead to good design. All of this needs time and time is a resource that is scarce in open source.
whalesalad · 24m ago
I can't really agree with this at all. I am a very design-heavy person who has been using a Mac professionally since the x86 transition, and have very strong opinions about font rendering, color accuracy, etc. About 2 years ago, I built a beast of a Linux workstation and use it with a 5K Apple Studio Display. Everything looks flawless and pixel perfect.
zuhsetaqi · 14s ago
Could be that a lot has changed in Linux in the last few years. The article is from 2018.
zerocrates · 10m ago
Yeah, I'm happy with my Linux font rendering for sure, and I would say generally it's more Apple-esque, so some reasonably large degree of this is just opinion and preference.

There was definitely a time when, to get good results, you had to do a lot more tweaking: setting things in fontconfig, using a patched Freetype. But I haven't needed that for quite some time now. I do still bring a fonts.conf around to my machines, basically out of habit... the only relevant thing it does now is probably disabling embedded bitmaps.

It's always a little bit of whiplash seeing how different this very site looks when I occasionally am on a Windows machine, with Verdana rendering quite differently there, and to my mind, worse.

ivape · 17m ago
I still don't know why MacOS can't render sharp text on a 1440p external monitor.
ranger207 · 5m ago
Well, see, Apple doesn't sell 1440p external monitors, so the answer is you should stop using that trash and move to one that brute-forces the sharpness problem and costs way too much because it has an Apple logo on it.
DiabloD3 · 43s ago
Because they want to sell you a Retina(tm) monitor. I wish I was making a joke.

Roll back to a version of OSX that predates Retina, and _all_ of your monitors get the expected Mac-like font rendering, Retina or not. Go to 10.7 or newer, and all monitors are run using the Retina tuning for font rendering, which makes it very smeary and blurry on normal monitors, but looks great on anything that triggers Retina rendering.

So, what I've been advising to the fewer and fewer Mac owners I know that want multi-monitor: only buy 4k monitors, OSX thinks they're HiDPI and won't fuck over your font rendering. At least, they won't today.

zuhsetaqi · 10m ago
Because 1440p is not a sharp external monitor.