Instead of using a new PNG standard, I'd still rather use JPEG XL, just because it has progressive decoding. And, you know, while looking like PNG, being as small as WebP, supporting HDR and animations, and decoding even faster.
https://dennisforbes.ca/articles/jpegxl_just_won_the_image_w...
https://caniuse.com/jpegxl
Nothing really supports it. The latest Safari at least supports it without a feature flag or anything, but it doesn't support JPEG XL animations.
To be fair, nothing supports a theoretical PNG with Zstandard compression either. While that would be an obstacle to using it for a while, I suspect the wait wouldn't be that long: many things that support PNG today also support Zstandard anyway, so it's not a huge leap for them to add Zstandard support to their PNG codecs. Adding JPEG XL support is a relatively bigger ticket that has struggled to cross the finish line.
The thing I'm really surprised about is that you still can't use arithmetic coding with JPEG. I think the original reason was patents, but I don't think there have been active patents around it for years now.
bawolff · 20m ago
> The thing I'm really surprised about is that you still can't use arithmetic coding with JPEG.
I was under the impression libjpeg added support in 2009 (in v7). I'd assume most things support it by now.
jchw · 10m ago
Believe it or not, last I checked, many browsers and some other software (file managers, etc.) still couldn't do anything with JPEG files that have arithmetic coding. Apparently, although I haven't tried this myself, Adobe Photoshop also specifically doesn't support it.
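For anyone who wants to check their own files: the coding mode is declared by a JPEG's SOF marker (0xC0–0xC7 are the Huffman variants, 0xC9–0xCB the arithmetic ones). A rough Python sketch, not a robust parser, with a made-up function name:

    import struct
    import sys

    HUFFMAN_SOFS = {0xC0, 0xC1, 0xC2, 0xC3, 0xC5, 0xC6, 0xC7}
    ARITHMETIC_SOFS = {0xC9, 0xCA, 0xCB}

    def jpeg_coding(path):
        """Return 'huffman', 'arithmetic', or 'unknown' for a JPEG file."""
        data = open(path, "rb").read()
        i = 2  # skip the SOI marker (FF D8)
        while i + 4 <= len(data):
            if data[i] != 0xFF or data[i + 1] == 0xFF:
                i += 1  # skip fill bytes until a real marker starts
                continue
            marker = data[i + 1]
            if marker in ARITHMETIC_SOFS:
                return "arithmetic"
            if marker in HUFFMAN_SOFS:
                return "huffman"
            if marker == 0xDA:  # SOS: entropy-coded data follows, give up
                break
            # every other marker before SOS carries a 2-byte big-endian length
            (seglen,) = struct.unpack(">H", data[i + 2:i + 4])
            i += 2 + seglen
        return "unknown"

    if __name__ == "__main__":
        print(jpeg_coding(sys.argv[1]))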
kps · 11m ago
> Nothing really supports it.
Everything supports it, except web browsers.
bawolff · 24m ago
Doesn't PNG have progressive decoding? I.e., the Adam7 algorithm.
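For reference, interlaced PNGs do support this: Adam7 assigns each pixel of an 8x8 tile to one of seven passes, so a decoder can show a coarse preview early. A small Python sketch of the pattern from the PNG spec (the helper name is made up):

    # Adam7 interlacing: each pixel of an 8x8 tile belongs to one of
    # 7 passes; early passes give a coarse full-size preview that the
    # later passes refine.
    ADAM7 = [
        [1, 6, 4, 6, 2, 6, 4, 6],
        [7, 7, 7, 7, 7, 7, 7, 7],
        [5, 6, 5, 6, 5, 6, 5, 6],
        [7, 7, 7, 7, 7, 7, 7, 7],
        [3, 6, 4, 6, 3, 6, 4, 6],
        [7, 7, 7, 7, 7, 7, 7, 7],
        [5, 6, 5, 6, 5, 6, 5, 6],
        [7, 7, 7, 7, 7, 7, 7, 7],
    ]

    def adam7_pass(x: int, y: int) -> int:
        """Which of the 7 passes delivers pixel (x, y)."""
        return ADAM7[y % 8][x % 8]

    # Pass 1 alone covers 1/64 of the pixels, evenly spread, which is
    # already enough for a blurry full-size preview.
    print(adam7_pass(0, 0), adam7_pass(4, 0))  # -> 1 2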
The recently released PNG 3 also supports HDR and animations: https://www.w3.org/TR/png-3/
APNG isn't recent so much as the specs were merged together. APNG will be 21 years old in a few weeks.
layer8 · 8m ago
True, but https://news.ycombinator.com/item?id=44802079 presumably holds the opinion that APNG != PNG, so I mentioned PNG 3 to counteract that. Animated PNGs being officially PNG is recent.
Doesn't really seem worth it? It doesn't compress better, and it's only slightly faster in decompression time.
stephencanon · 45m ago
"Only slightly faster in decompression time."
m5 vs -19 is nearly 2.5x faster to decompress; given that most image data is decompressed many many more times (often thousands or millions of times more, often by devices running on small batteries) than it is compressed, that's an enormous win, not "only slightly faster".
The way in which it might not be worth it is the larger size, which is a real drawback.
fmbb · 13m ago
Win how?
More efficiency will inevitably only lead to increased usage of the CPU and, in turn, batteries draining faster.
https://en.wikipedia.org/wiki/Jevons_paradox
Might as well just shoot yourself if that's how you look at improvements. The only way to do something good is to stop existing. (This is a general statement, not aimed at you or anyone in particular.)
hcs · 7m ago
So someone is going to load 2.5x as many images because they can be decoded 2.5x faster? The paradox isn't a law of physics, it's an interesting observation about markets. (If this was a joke, it was too subtle for me.)
arp242 · 38m ago
The difference is barely noticeable in real-world cases, in terms of performance or battery. Decoding images is a small part of loading an entire webpage from the internet. And transferring data isn't free either, so any benefits need to be offset against the larger file size and increased network usage.
bobmcnamara · 1h ago
Am I reading those numbers right? That's like 25x faster compression than WEBP-M1; there's probably a use case for that.
arp242 · 43m ago
The numbers seem small enough that it will rarely matter, but I suppose there might be a use case somewhere?
But let's be real here: this is basically just a new image format, with more code to maintain, fresh new exciting zero-days, and all of that. You need a strong use case to justify that, and "already fast encode is now faster" is probably not it.
zX41ZdbW · 1h ago
Very reasonable.
I've recently experimented with methods of serving bitmaps out of the database in my project [1]. One option was to generate PNG on the fly, but simply outputting an array of pixel color values over HTTP with Content-Encoding: zstd won out over PNG.
Combined with 2D delta encoding as in PNG, it will be even better.
[1] https://adsb.exposed/
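A minimal sketch of that combination, assuming the numpy and zstandard packages (encode_tile is a made-up name, not something from the project above): apply PNG's Up filter so each row stores its byte-wise delta from the row above, then hand the result to zstd.

    import numpy as np
    import zstandard  # pip install zstandard

    def encode_tile(pixels: np.ndarray) -> bytes:
        """pixels: (H, W, 4) uint8 RGBA array."""
        rows = pixels.reshape(pixels.shape[0], -1)  # one flat byte row per scanline
        filtered = rows.copy()
        filtered[1:] -= rows[:-1]                   # PNG 'Up' filter; uint8 wraps mod 256
        return zstandard.ZstdCompressor(level=3).compress(filtered.tobytes())

    # The client reverses the filter with a running sum, row[i] += row[i-1]
    # (mod 256), and reads the buffer as raw RGBA. Serving the payload with
    # Content-Encoding: zstd lets the client handle the zstd step itself.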
I think there is a benefit to knowing that if you have a PNG file, it works everywhere that supports PNG.
Better to make backwards-compatibility breaks entirely new formats.
privatelypublic · 1h ago
Does deflate lead the pack in any metric at all anymore? The only one I can think of is extremely low-spec compression (microcontrollers).
JoshTriplett · 42m ago
The only metric deflate leads on is widespread support. By any other metric, it has been superseded.
atiedebee · 31m ago
I'd assume memory usage as well, because it has a tiny context window compared to zstd.
adgjlsfhk1 · 1h ago
Even there, LZ4 is probably better.
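To see where each codec lands on your own data, a rough benchmarking sketch; it assumes the zstandard and lz4 pip packages, and the sample file path is arbitrary:

    import time
    import zlib        # deflate, in the standard library
    import zstandard   # pip install zstandard
    import lz4.frame   # pip install lz4

    def bench(name, compress, decompress, data):
        blob = compress(data)
        t0 = time.perf_counter()
        for _ in range(100):
            decompress(blob)
        elapsed = time.perf_counter() - t0
        print(f"{name}: ratio {len(data) / len(blob):.2f}, "
              f"100 decompressions in {elapsed:.3f}s")

    data = open("/usr/share/dict/words", "rb").read()  # any sample file
    bench("deflate", lambda d: zlib.compress(d, 9), zlib.decompress, data)
    bench("zstd-19", zstandard.ZstdCompressor(level=19).compress,
          zstandard.ZstdDecompressor().decompress, data)
    bench("lz4", lz4.frame.compress, lz4.frame.decompress, data)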
encom · 14m ago
(2021)
In my opinion PNG doesn't need fixing. Being ancient is a feature. Everything supports it. As much as I appreciate the nerdy exercise, PNG is fine as it is. My only gripe is that some software writes needlessly bloated files (like adding an alpha channel when it's not needed). I wish we didn't need tools like OptiPNG etc.
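For that particular kind of bloat a full optimizer isn't even needed; a small Pillow sketch (the file names are placeholders) that drops an alpha channel which turns out to be entirely opaque:

    from PIL import Image  # pip install Pillow

    img = Image.open("bloated.png")
    if img.mode == "RGBA":
        alpha_min, _ = img.getchannel("A").getextrema()
        if alpha_min == 255:  # every pixel fully opaque: alpha carries no information
            img = img.convert("RGB")
    img.save("slim.png", optimize=True)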