Leonardo Chiariglione: “I closed MPEG on 2 June 2020”

170 points by eggspurt | 138 comments | 8/7/2025, 10:09:41 AM | leonardo.chiariglione.org

Comments (138)

ZeroGravitas · 3h ago
There's nothing obscure about them.

His comment immediately after describes exactly what happened:

> Even before it has ceased to exists, the MPEG engine had run out of steam – technology- and business wise. The same obscure forces that have hijacked MPEG had kept it hostage to their interests impeding its technical development and keeping it locked to outmoded Intellectual Property licensing models delaying market adoption of MPEG standards. Industry has been strangled and consumers have been deprived of the benefits of new technologies. From facilitators of new opportunities and experiences, MPEG standards have morphed from into roadblocks.

Big companies abused the setup that he was responsible for. Gentlemen's agreements to work together for the benefit of all got gamed into patent landmines and it happened under his watch.

Even many of the big corps involved called out the bullshit, notably Steve Jobs refusing to release a new QuickTime until they fixed some of the most egregious parts of MPEG-4 licensing way back in 2002.

https://www.zdnet.com/article/apple-shuns-mpeg-4-licensing-t...

sanjit · 20m ago
From the Ziff Davis article:

> QuickTime 6 media player and QuickTime Broadcaster, a free application that aims to simplify using MPEG-4 in live video feeds over the Net.

It was sweet to see “over the Net”…

wheybags · 4h ago
As someone who hasn't had any exposure to the human stories behind mpeg before, it feels to me like it's been a force for evil since long before 2020. Patents on h264, h265, and even mp3 have been holding the industry back for decades. Imagine what we might have if their iron grip on codecs was broken.
mike_hearn · 4h ago
Possibly, nothing. Codec development is slow and expensive. Free codecs only came along at all because Google decided to subsidize development but that became possible only 15 years or so after MPEG was born, and it's hardly a robust strategy. Plus free codecs were often built by acquiring companies that had previously been using IP licensing as a business model rather than from-scratch development.
Taek · 3h ago
I avoided a career in codecs after spending about a year in college learning about them. The patent minefield meant I couldn't meaningfully build incremental improvements on what existed, and the idea of diligently dancing around existing patents and then releasing something which intentionally lacked state-of-the-art ideas wasn't compelling.

Codec development is slow and expensive because you can't just release a new codec; you have to dance around patents.

mike_hearn · 3h ago
Well, a career in codec development means you'd have done it as a job, and so you'd have been angling for a job at the kind of places that enter into the patent pools and contribute to the standards.
astrange · 2h ago
Software patents aren't an issue in much of the world; the reason I thought there wasn't much of a career in codec development was that it was obvious that it needed to move down into custom ASICs to be power-efficient, at which point you can no longer develop new ones until people replace all their hardware.
dylan604 · 49m ago
By the time software is robust enough to make it worthwhile to be placed into hardware, it's pretty damn efficient. For something like ASICs, you could at least upgrade the firmware with new code, but what about Apple's chips that do the decoding? Can they be upgraded, or does that mean needing to wait for the M++ chip?
rowanG077 · 2h ago
Software patents aren't an issue in most of the world. Codecs however are used all over the world. No one is going to use a codec that is illegal to use in the US and EU.
astrange · 27m ago
EU would be one of the places that doesn't have software patents, which is why VLC is based there.
rowanG077 · 19m ago
It's not that simple. Software patents exist in the EU; the requirements are much stricter, though. For example, Netflix was ordered to cease their use of H.265 in Germany: https://www.nexttv.com/news/achtung-baby-netflix-loses-paten...
derf_ · 1h ago
> ...it's hardly a robust strategy.

I disagree. Video is such a large percentage of internet traffic and licensing fees are so high that it becomes possible for any number of companies to subsidize the development cost of a new codec on their own and still net a profit. Google certainly spends the most money, but they were hardly the only ones involved in AV1. At Mozilla we developed Daala from scratch and had reached performance competitive with H.265 when we stopped to contribute the technology to the AV1 process, and our team's entire budget was a fraction of what the annual licensing fees for H.264 would have been. Cisco developed Thor on their own with just a handful of people and contributed that, as well. Many other companies contributed technology on a royalty-free basis. Outside of AV1, you regularly see things like Samsung's EVC (or LC-EVC, or APV, or...), or the AVS series from the Chinese.... If the patent situation were more tenable, you would see a lot more of these.

The cost of developing the technology is not the limitation. I would argue the cost to get all parties to agree on a common standard and the cost to deploy it widely enough for people to rely on it is much higher, but people manage that on a royalty-free basis for many other standards.

mike_hearn · 7m ago
Mozilla is just Google from a financial perspective; it's not an independent org, so the financing point stands.

H.264 was something like >90% of all video a few years ago and wasn't it free for streaming if the end user wasn't paying? IIRC someone also paid the fees for an open source version. There were pretty good licensing terms available and all the big players have used it extensively.

Anyway, my point was only that expecting Google to develop every piece of tech in the world and give it all away for free isn't a general model for tech development, whereas IP rights and patent pools are. The free ride ends the moment Google decide they need more profit, feel threatened in some way or get broken up by the government.

pornel · 3h ago
IP law, especially defence against submarine patents, makes codec development expensive.

In the early days of MPEG, codec development was difficult, because most computers weren't capable of encoding video, and the field was in its infancy.

However, by the end of the '00s computers were fast enough for anybody to do video encoding R&D, and there was a ton of research to build upon. At that point MPEG's role changed from being a pioneer in the field to being an incumbent with a patent minefield, stopping others from moving the field forward.

cornholio · 3h ago
That's unnecessarily harsh. Patent pools exist to promote collaboration in a world with aggressive IP legislation, they are an answer to a specific environment and they incentivize participants to share their IP at a reasonable price to third parties. The incentive is that otherwise you will be left out of the pool: the other members will work around your patents while not licensing their own patents to you, so your own IP is now worthless since you can't work around theirs.

As long as IP law continues in the same form, the alternative to that is completely closed agreements among major companies that will push their own proprietary formats and aggressively enforce their patents.

The fair world where everyone is free to create a new thing, improve upon the frontier codecs, and get a fair reward for their efforts, is simply a fantasy without patent law reform. In the current geopolitical climate, it's very very unlikely for nations where these developments traditionally happened, such as US and western Europe, to weaken their IP laws.

phkahler · 1h ago
>> That's unnecessarily harsh. Patent pools exist to promote collaboration in a world with aggressive IP legislation, they are an answer to a specific environment and they incentivize participants to share their IP at a reasonable price to third parties.

You can say that, but this discussion is in response to the guy who started MPEG and later shut it down. I don't think he'd say it's harsh.

ZeroGravitas · 2h ago
They actually messed up the basic concept of a patent pool, and that is the key to their failure.

They didn't get people to agree on terms up front. They made the final codec with interlocking patents from hundreds of parties embedded, made no attempt to avoid random outsiders' patents, and then, once it was done, tried to come to a licence agreement when every minor patent holder had an effective veto on the resulting pool. That's how you end up with multiple pools plus people who own patents and aren't members of any of the pools. It's ridiculous.

My minor conspiracy theory is that if you did it right, then you'd basically end up with something close to open source codecs as that's the best overall outcome.

Everyone benefits from only putting in freely available ideas. So if you want to gouge people with your patents you need to mess this up and "accidentally" create a patent mess.

scotty79 · 2h ago
Patent pools exist to make an infeasible system look not so infeasible, so people won't recognize how it's stifling innovation and abolish it.
mike_hearn · 3h ago
IP law and the need for extremely smart people with a rare set of narrow skills. It's not like codec development magically happens for free if you ignore patents.

The point is, if there had been no incentives to develop codecs, there would have been no MPEG. Other people would have stepped into the void and sometimes did, e.g. RealVideo, but without legal IP protection the codecs would just have been entirely undocumented and heavily obfuscated, moving to tamper-proofed ASICs much faster.

sitkack · 1h ago
You continue to make the same unsubstantiated claims about codecs being hard and expensive. These same tropes were said about every other field, and even if true, we have tens of thousands of folks that would like to participate, but are locked out due to broken IP law.

The firewall of patents exists precisely because digital video is a way to shake down the route media has to travel to get to the end user.

Codecs are not "harder than" compilers, yet the field of compilers was blown completely open by GCC. Capital didn't see the market opportunity because there wasn't the same possibility of being a gatekeeper for so much attention and money.

The patents aren't because it is difficult, the patents are there because they can extract money from the revenue streams.

mike_hearn · 58m ago
Codecs not harder than compilers? Sounds like an unsubstantiated claim!

Modern video codecs are harder than compilers. You have to have good ASIC development expertise to do them right, for example, which you don't need for compilers. It's totally feasible for a single company to develop a leading edge compiler whereas you don't see that in video codecs, historically they've been collaborations.

badsectoracula · 3h ago
That sounds like the 90s argument against FLOSS: without the incentive for people to sell software, nobody would write it.
tsimionescu · 2h ago
This is still the argument for software copyright. And I think it's still a pretty persuasive argument, despite the success of FLOSS. To this day, there is very little successful consumer FLOSS. Outside of browsers, Ubuntu, LibreOffice, and GIMP are more or less it, at least outside certain niches. And even they are pretty tiny compared to Windows/MacOS/iOS/Android, Office/Google Docs, or Photoshop.

The browsers are an interesting case. Neither Chrome nor Edge are really open source, despite Chromium being so, and they are both funded by advertising and marketing money from huge corporations. Safari is of course closed source. And Firefox is an increasingly tiny runner-up. So I don't know if I'd really count Chromium as a FLOSS success story.

Overall, I don't think FLOSS has had the kind of effect that many activists were going for. What has generally happened is that companies building software have realized that there is a lot of value to be found in treating FLOSS software as a kind of barter agreement between companies, where maybe Microsoft helps improve Linux for the benefit of all, but in turn it gets to use, say, Google's efforts on Chromium, and so on. The fact that other companies then get to mooch off of these big collaborations doesn't really matter compared to getting rid of the hassle of actually setting up explicit agreements with so many others.

_alternator_ · 1h ago
The value of OSS is estimated at about $9 trillion. That's more than any company on earth is worth.

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4693148

tsimionescu · 18m ago
Sure. Almost all of it supported by companies who sell software, hardware, or ads.
sitkack · 1h ago
> don't think FLOSS has had the kind of effect that many activists were going for

The entire internet, end to end, runs on FLOSS.

tsimionescu · 11m ago
That's great, but it's not what FLOSS activists hoped and fought for.

It's still almost impossible to have a digital life that doesn't involve significant use of proprietary software, and the vast majority of users do their computing almost exclusively through proprietary software. The fact that this proprietary software is a bit of glue on top of a bunch of FLOSS libraries possibly running on a FLOSS kernel that uses FLOSS libraries to talk to a FLOSS router doesn't really buy much actual freedom for the end users. They're still locked in to the proprietary software vendors just as much as they were in the 90s (perhaps paying with their private data instead of actual money).

mike_hearn · 54m ago
If you ignore the proprietary routers, the proprietary search engines, the proprietary browsers that people use out-of-the-box (Edge, Safari and even Chrome), and the fact that Linux is a clone of a proprietary OS.
immibis · 11m ago
On my new phone I made sure to install F-Droid first thing, and it's surprising how many basic functions are covered by free software if you just bother to look.
zozbot234 · 2h ago
Software wasn't always covered by copyright, and people wrote it all the same. In fact they even sold it, just built-to-order as opposed to any kind of retail mass market. (Technically, there was no mass market for computers back then so that goes without saying.)
strogonoff · 2h ago
Without IP protections that allow copyleft to exist arguably there would be no FOSS. When anything you publish can be leveraged and expropriated by Microsoft et al. without them being obligated to contribute back or even credit you, you are just an unpaid ghost engineer for big tech.
immibis · 11m ago
I thought your argument was that Microsoft wouldn't be able to exist in that world. Which is it?
bsindicatr3 · 3h ago
> Free codecs only came along … and it's hardly a robust strategy

Maybe you don't remember the way the GIF format (there was no JPEG, PNG, or WebP initially) had problems with licensing, and then, years later, the scares about it potentially becoming illegal to use GIFs. Here's a mention of some of the problems with Unisys, though I didn't find info about these scares on Wikipedia's GIF or CompuServe pages:

https://www.quora.com/Is-it-true-that-in-1994-the-company-wh...

Similarly, the awful history of digital content restriction technology in general (DRM, etc.). I'm not against companies trying to protect assets, but data assets have always been inherently prone to "use", whether that use is intentional or unintentional on the part of the one that provided the data. The problem has always been the means of dissemination, not that the data itself needed to be wrapped in a lock that anyone with the key (or the means to get or make one) could open, or that it needed to call home, which basically prevents the user from legitimately being able to use the data.

adzm · 2h ago
> I didn’t find info about these scares on Wikipedia’s GIF or Compuserve pages

The GIF page on Wikipedia has an entire section on the patent troubles: https://en.wikipedia.org/wiki/GIF#Unisys_and_LZW_patent_enfo...

wheybags · 4h ago
It's not just about new codecs. There are also people making products that would use codecs just deciding not to because of the patent hassle.
weinzierl · 27m ago
"Free codecs only came along at all because Google decided to subsidize development but that became possible only 15 years or so after MPEG was born, and it's hardly a robust strategy"

I don't know about video codecs, but MP3 (also part of MPEG) came out of Fraunhofer and was paid for with German taxpayer money. It should not have been patented in the first place (and wasn't in Germany).

cxr · 1h ago
> Free codecs only came along at all because Google decided to subsidize development but that became possible only 15 years or so after MPEG was born

The release of VP3 as open source predates Google's later acquisition of On2 (2010) by nearly a decade.

tomrod · 2h ago
Free codecs have been available a long time, surely, as we could install them in Linux distributions in 2005 or earlier?

(I know nothing about the legal side of all this, just remembering the time period of Ubuntu circa 2005-2008).

zappb · 2h ago
Free codecs without patent issues were limited to things like Vorbis which never got wide support. There were FOSS codecs for patented algorithms, but those had legal issues in places that enforce software patents.
breve · 1h ago
AV1, VP9, and Opus are used on YouTube and Netflix right now.

It's hard to get more mainstream than YouTube and Netflix.

notpushkin · 2h ago
> which never got wide support

Source? I’ve seen Vorbis used in a whole bunch of places.

Notably, Spotify used only Vorbis for a while (it still does, but also includes AAC now, for Apple platforms I think).

scott_w · 1h ago
Pre-Spotify, MP3 players would usually only ship with MP3 support (thus the name), so people would only rip to MP3. Ask any millennial and most of them will never have heard of Ogg.
darkwater · 15m ago
Pre-Spotify (and pre-iPod) there were plenty of cheap MP3 players that also supported Ogg Vorbis. I owned one, for example. Obviously MP3 was THE standard, but Vorbis reached good adoption HW-wise (basically because it was free as in beer to implement).
notpushkin · 50m ago
Of course, but this is not what I’d call “never got wide support”.
scott_w · 31m ago
So you’d say that a format that most consumers couldn’t use (because only a few devices could play it) is “widely supported?”
ghm2199 · 3h ago
For the uninitiated, could you describe why codec development is slow and expensive?
mike_hearn · 16m ago
They're hardware accelerated, so it's not worth making a new codec until you have a big improvement over the prior baseline, because it takes a long time to manufacture and roll out devices that are better. Verifying that an optimization is worth it requires testing against a big library of videos using standardized perception metrics, and it requires ensuring there's an efficient way to decode it in both hardware and software, along with efficient encoding. It's easy to improve one kind of input but regress another. Most of the low-hanging fruit is taken already. Just the usual stuff that makes advancing the frontier hard.
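
To make "testing against a big library of videos using standardized perception metrics" concrete, here's a toy sketch of the bookkeeping involved (my own illustration, nothing MPEG/JVET-specific; the encode/decode callables stand in for whatever candidate codec is under test, and real evaluations use curated test sets plus metrics like SSIM/VMAF and BD-rate curves rather than plain PSNR):

    import numpy as np

    def psnr(reference: np.ndarray, decoded: np.ndarray, peak: float = 255.0) -> float:
        """Peak signal-to-noise ratio between two frames (higher is better)."""
        mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

    def evaluate(frames, encode, decode):
        """Average quality vs. total bits spent, for one candidate encoder."""
        scores, bits = [], 0
        for frame in frames:
            payload = encode(frame)                      # hypothetical codec under test
            bits += 8 * len(payload)
            scores.append(psnr(frame, decode(payload)))
        return sum(scores) / len(scores), bits

A real effort repeats this over many content classes and rate points, and checks that hardware and software decoders can keep up, which is where the "big library" and the compute bill come from.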
thinkingQueen · 2h ago
It’s a bit like developing an F1 car. Or a cutting edge airplane. Lots of small optimizations that have to work together. Sometimes big new ideas emerge but those are rare.

Until the new codec comes together, all those small optimizations aren't really worth much, so it's a long-term research project with potentially zero return on investment.

And yes, most of the small optimizations are patented, something that I've come to understand isn't viewed very favorably by most.

phkahler · 1h ago
>> And yes, most of the small optimizations are patented, something that I’ve come to understand isn’t viewed very favorably by most.

Codecs are like infrastructure not products. From cameras to servers to iPhones, they all have to use the same codecs to interoperate. If someone comes along with a small optimization it's hard enough to deploy that across the industry. If it's patented you've got another obstacle: nobody wants to pay the incremental cost for a small improvement (it's not even incremental cost once you've got free codecs, it's a complete hassle).

zoeysmithe · 1h ago
This is impossible to know. Not that long ago something like Linux would have sounded like a madman's dream to someone with your perspective. It turns out great innovations happen outside the capitalist for-profit context, and denying that is very questionable. If anything, those kinds of setups often hinder innovation. How much better would Linux be if it was mired in endless licensing agreements, per-month rates, had a board full of Fortune 500 types, and billed each user a patent fee? Or any form of profit-incentive 'business logic'?

If that stuff worked better, Linux would have failed entirely; instead, nearly everyone interfaces with a Linux machine probably hundreds if not thousands of times a day in some form. Maybe millions, if we consider how complex just accessing internet services is and the many servers, routers, mirrors, proxies, etc. one encounters in just a trivial app refresh. If not Linux, then the open Mach/BSD derivatives iOS uses.

Then looking even previous to the ascent of linux, we had all manner of free/open stuff informally in the 70s and 80s. Shareware, open culture, etc that led to today where this entire medium only exists because of open standards and open source and volunteering.

Software patents are a net loss for society. For-profit systems are less efficient than open non-profit systems. No 'middle-man' system is better than a system that goes out of its way to eliminate the middle-man rent-seeker.

newsclues · 4h ago
This is the sort of project that should be developed and released via open source from academia.

Audio and video codecs and document formats like PDF are all foundational to computing and modern life, from government to business, so there is a great incentive to make it all open and free.

mike_hearn · 3h ago
Universities love patent licensing. I don't think academia is the solution you're looking for.
yxhuvud · 3h ago
The solution to that is to remove the ability to patent codecs.
master-lincoln · 1h ago
I think we should go a step further and remove the ability to patent algorithms (software)
immibis · 9m ago
Some people even think we should remove intellectual property.
newsclues · 1h ago
So do companies.

But education receives a lot of funding from the government.

I think academia should build open source technology (that people can commercialize on their own with the expertise).

Higher education doesn’t need to have massive endowments of real estate and patent portfolio to further educ… administration salaries and vanity building projects.

Academia can serve the world with technology and educated minds.

oblio · 3h ago
You're also describing technologies with universal use and potential for long term rent seeking.

Basically MBA drool material.

newsclues · 1h ago
Yeah, and if MBAs want to reap that reward, they need to fund the development exclusively without government funding.
lightedman · 2h ago
"Free codecs only came along at all because Google decided to subsidize development"

No, just no. We've had free community codec packs for years before Google even existed. Anyone remember CCCP?

leguminous · 1h ago
CCCP was just a collection of existing codecs; they didn't develop their own. Most of the codecs in CCCP were patented. Using it without licenses was technically patent infringement in most places. It's just that nobody ever cared to enforce it on individual end users.
notpushkin · 1h ago
Yes. Those won't help you if you use them commercially and the patent holders find out about it.
deadbabe · 3h ago
Why not just use AI?
thinkingQueen · 4h ago
Not sure why you are downvoted as you seem to be one of the few who knows even a little about codec development.

And regarding ”royalty-free” codecs please read this https://ipeurope.org/blog/royalty-free-standards-are-not-fre...

chrismorgan · 3h ago
That article is a scare piece designed to spread fear, uncertainty and doubt, to prop up an industry that has already collapsed because everyone else hated them, and make out that they’re the good guys and you should go back to how things were.
bjoli · 4h ago
At least two of the members of ipeurope are companies you could use as an argument for why we shouldn't have patents at all.
cnst · 1h ago
> The catch is that while the AV1 developers offer their patents (assuming they have any) on a royalty-free basis, in return they require users of AV1 to agree to license their own patents royalty-free back to them.

Such a huge catch that the companies that offer you a royalty-free license, only do so on the condition that you're not gonna turn around and abuse your own patents against them!

How exactly is that a bad thing?

How is it different from the (unwritten) social contracts of all humans and even of animals? How is it different from the primal instincts?

blendergeek · 3h ago
> And regarding ”royalty-free” codecs please read this https://ipeurope.org/blog/royalty-free-standards-are-not-fre...

Unsurprisingly companies that are losing money because their rent-seeking on media codecs is now over will spread FUD [0] about royalty free codecs.

[0] https://en.wikipedia.org/wiki/Fear%2C_uncertainty_and_doubt

thinkingQueen · 4h ago
Who would develop those codecs? A good video coding engineer costs about 100-300k USD a year. The really good ones even more. You need a lot of them. JVET has an attendance of about 350 such engineers each meeting (four times a year).

Not to mention the computer clusters to run all the coding sims: thousands and thousands of CPUs are needed per research team.

People who are outside the video coding industry do not understand that it is an industry. It’s run by big companies with large R&D budgets. It’s like saying ”where would we be with AI if Google, OpenAI and Nvidia didn’t have an iron grip”.

MPEG and especially JVET are doing just fine. The same companies and engineers who worked on AVC, HEVC and VVC are still there with many new ones especially from Asia.

MPEG was reorganized because this Leonardo guy became an obstacle, and he's been angry about it ever since. Other than that I'd say business as usual in the video coding realm.

rwmj · 4h ago
Who would write a web server? Who would write Curl? Who would write a whole operating system to compete with Microsoft when that would take thousands of engineers being paid $100,000s per year? People don't understand that these companies have huge R&D budgets!

(The answer is that most of the work would be done by companies who have an interest in video distribution - eg. Google - but don't profit directly by selling codecs. And universities for the more research side of things. Plus volunteers gluing it all together into the final system.)

chubot · 48m ago
> Who would write a whole operating system to compete with Microsoft when that would take thousands of engineers being paid $100,000s per year?

You might be misunderstanding how Linux is developed: almost all of it is funded by the same kind of companies that fund MPEG development.

It's not "engineers in their basement", and never was

https://www.linuxfoundation.org/about/members

e.g. Red Hat, Intel, Oracle, Google, and now MICROSOFT itself (the competitive landscape changed)

This has LONG been the case, e.g. an article from 2008:

https://www.informationweek.com/it-sectors/linux-contributor...

2017 Linux Foundation Report: https://www.linuxfoundation.org/press/press-release/linux-fo...

> Roughly 15,600 developers from more than 1,400 companies have contributed to the Linux kernel since the adoption of Git made detailed tracking possible

> The Top 10 organizations sponsoring Linux kernel development since the last report include Intel, Red Hat, Linaro, IBM, Samsung, SUSE, Google, AMD, Renesas and Mellanox

---

curl does seem to be an outlier, but you still need to answer the question: "Who would develop video codecs?" You can't just say "Linux appeared out of thin air", because that's not what happened.

Linux has funding because it serves the interests of a large group of companies that themselves have a source of revenue.

(And to be clear, I do not think that is a bad thing! I prefer it when companies write open source software. But it does skew the design of what open source software is available.)

rwmj · 43m ago
I've used and developed for Linux since 1994 (long before major commercial interests), and I work for Red Hat so it's unlikely I misunderstand how Linux was and is developed.
mike_hearn · 3h ago
Google funding free stuff is not a real social mechanism. It's not something you can point to and say that's how society should work in general.

Our industry has come to take Google's enormous corporate generosity for granted, but there was zero need for it to be as helpful to open computing as it has been. It would have been just as successful with YouTube if Chrome was entirely closed source and they paid for video codec licensing, or if they developed entirely closed codecs just for their own use. In fact nearly all Google's codebase is closed source and it hasn't held them back at all.

Google did give a lot away though, and for that we should be very grateful. They not only released a ton of useful code and algorithms for free, they also inspired a culture where other companies also do that sometimes (e.g. Llama). But we should also recognize that relying on the benevolence of 2-3 idealistic billionaires with a browser fetish is a very time and place specific one-off, it's not a thing that can be demanded or generalized.

In general, R&D is costly and requires incentives. Patent pools aren't perfect, but they do work well enough to always be defining the state-of-the-art and establish global standards too (digital TV, DVDs, streaming.... all patent pool based mechanisms).

breve · 1h ago
> Google funding free stuff is not a real social mechanism.

It's not a social mechanism. And it's not generosity.

Google pushes huge amounts of video and audio through YouTube. It's in Google's direct financial interest to have better video and audio codecs implemented and deployed in as many browsers and devices as possible. It reduces Google's costs.

Royalty-free video and audio codecs make that implementation and deployment more likely in more places.

> Patent pools aren't perfect

They are a long way from perfect. Patent pools will contact you and say, "That's a nice codec you've got there. It'd be a shame if something happened to it."

Three different patent pools are trying to collect licencing fees for AV1:

https://www.sisvel.com/licensing-programmes/audio-and-video-...

https://accessadvance.com/licensing-programs/vdp-pool/

https://www.avanci.com/video/

raverbashing · 3h ago
These are bad comparisons

The question is more, "who would write the HTTP spec?" except instead of sending text back and forth you need experts in compression, visual perception, video formats, etc

rwmj · 2h ago
Did TBL need to patent the HTTP spec?
thinkingQueen · 3h ago
Are you really saying that patents are preventing people from writing the next great video codec? If it were that simple, it would’ve already happened. We’re not talking about a software project that you can just hack together, compile, and see if it works. We’re talking about rigorous performance and complexity evaluations, subjective testing, and massive coordination with hardware manufacturers—from chips to displays.

People don’t develop video codecs for fun like they do with software. And the reason is that it’s almost impossible to do without support from the industry.

unlord · 3h ago
> People don’t develop video codecs for fun like they do with software. And the reason is that it’s almost impossible to do without support from the industry.

As someone who led an open source team (of majority volunteers) for nearly a decade at Mozilla, I can tell you that people do work on video codecs for fun; see https://github.com/xiph/daala

Working with fine people from Xiph.Org and the IETF (and later AOM) on royalty-free formats Theora, Opus, Daala and AV1 was by far the most fun, interesting and fulfilling work I've had as a professional engineer.

tux3 · 3h ago
Daala had some really good ideas. I only understand the coding tools at the level of a curious codec enthusiast, far from an expert, but it was really fascinating to follow its progress.

Actually, are Xiph people still involved in AVM? It seems like it's being developed a little bit differently than AV1. I might have lost track a bit.

Taek · 3h ago
People don't develop video codecs for fun because there are patent minefields.

You don't *have* to add all the rigour. If you develop a new technique for video compression, a new container for holding data, etc, you can just try it out and share it with the technical community.

Well, you could, if you weren't afraid of getting sued for infringing on patents.

eqvinox · 3h ago
> Are you really saying that patents are preventing people from writing the next great video codec? If it were that simple, it would’ve already happened.

You wouldn't know if it had already happened, since such a codec would have little chance of success, possibly not even publication. Your proposition is really unprovable in either direction due to the circular feedback on itself.

scott_w · 3h ago
> Are you really saying that patents are preventing people from writing the next great video codec?

Yes, that’s exactly what people are saying.

People are also saying that companies aren’t writing video codecs.

In both cases, they can be sued for patent infringement if they do.

fires10 · 3h ago
I don't do video because I don't work with it, but I do image compression for fun and no profit. I do use some video techniques due to the type of images I am compressing. I don't release because of the minefield. I do it because it's fun. The simulation runs and other tasks I often kick to the cloud for the larger compute needs.
Spooky23 · 32m ago
Patents, by design, give inventors claims to ideas, which gives them the money to drive progress at a pace that meets their business needs.

Look at data compression. Sperry/Univac controlled key patents and slowed down invention in the space for years. Was it in the interest of these companies or Unisys (their successor) to invest in compression development? Nope.

That’s by design. That moat of exclusivity makes it difficult to compensate people to come up with novel inventions in-scope or even adjacent to the patent. With codecs, the patents are very granular and make it difficult for anyone but the largest players with key financial interests to do much of anything.

bayindirh · 3h ago
> People don’t develop video codecs for fun like they do with software. And the reason is that it’s almost impossible to do without support from the industry.

Hmm, let me check my notes:

    - Quite OK Image format: https://qoiformat.org/
    - Quite OK Audio format: https://qoaformat.org/
    - LAME (ain't a MP3 Encoder): https://lame.sourceforge.io/
    - Xiph family of codecs: https://xiph.org/
Some of these guys have standards bodies as supporters, but in all cases bigger groups formed behind them after they made considerable effort. QOI and QOA were written by a single guy just because he was bored.

For example, FLAC is a worst of all worlds codec for industry to back. A streamable, seekable, hardware-implementable, error-resistant, lossless codec with 8 channels, 32 bit samples, and up to 640KHz sample rate, with no DRM support. Yet we have it, and it rules consumer lossless audio while giggling and waving at everyone.

On the other hand, we have LAME: an encoder which also uses psycho-acoustic techniques to improve the resulting sound quality, and almost everyone is using it, because the closed source encoders generally sound lamer than LAME at the same bit rates. Remember, the MP3 format doesn't have a reference encoder. If the decoder can read the file and it sounds the way you expect, then you have a valid encoder. There's no spec for that.

> Are you really saying that patents are preventing people from writing the next great video codec?

Yes, yes, and, yes. MPEG and similar groups openly threatened free and open codecs by opening "patent portfolio forming calls" to create portfolios to fight with these codecs, because they are terrified of being deprived of their monies.

If patents and license fees are not a problem for these guys, can you tell me why all professional camera gear which can take videos only comes with "personal, non-profit and non-professional" licenses on board, and you have to pay blanket extort ^H^H^H^H^H licensing fees to these bodies to take a video you can monetize?

For the license disclaimers in camera manuals, see [0].

[0]: https://news.ycombinator.com/item?id=42736254

roenxi · 4h ago
> It’s like saying ”where would we be with AI if Google, OpenAI and Nvidia didn’t have an iron grip”.

We'd be where we are. All the codec-equivalent aspects of their work are unencumbered by patents and there are very high quality free models available in the market that are just given away. If the multimedia world had followed the Google example it'd be quite hard to complain about the codecs.

thinkingQueen · 4h ago
That’s hardly true. Nvidia’s tech is covered by patents and licenses. Why else would it be worth 4.5 trillion dollars?

The top AI companies use very restrictive licenses.

I think it's actually the other way around, and the AI industry will end up following the video coding industry when it comes to patents, royalties, licenses etc.

roenxi · 3h ago
Because they make and sell a lot of hardware. I'm sure they do have a lot of patents and licences, but if all that disappeared today it'd be years to decades before anyone could compete with them. Even just getting a foot in the door in TSMC's queue of customers would be hard. Their valuation can likely be justified based on their manufacturing position alone. There is literally no-one else who can do what they do, law or otherwise.

If it is a matter of laws, China would just declare the law doesn't count to dodge around the US chip sanctions. Which, admittedly, might happen - but I don't see how that could result in much more freedom than we already have now. Having more Chinese people involved is generally good for prices, but that doesn't have much to do with market structure as much as they work hard and do things at scale.

> The top AI companies use very restrictive licenses.

These models are supported by the Apache 2.0 license ~ https://openai.com/open-models/

Are they lying to me? It is hard to get much more permissive than Apache 2.

mike_hearn · 3h ago
The top AI companies don't release their best models under any license. They're not even distributed at all. If you did steal the weights out from underneath Anthropic they would take you to court and probably win. Putting software you develop exclusively behind a network interface is a form of ultra-restrictive DRM. Yes, some places are currently trying to buy mindshare by releasing free models and that's fantastic, thank you, but they can only do that because investors believe the ROI from proprietary firewalled models will more than fund it.

NVIDIA's advantage over AMD is largely in the drivers and CUDA i.e. their software. If it weren't for IP law or if NVIDIA had foolishly made their software fully open source, AMD could have just forked their PTX compiler and NVIDIAs advantage would never have been established. In turn that'd have meant they wouldn't have any special privileges at TSMC.

oblio · 3h ago
I imagine a chunk of it is also covered by trade secrets and NDAs.
somethingsome · 2h ago
Hey, I attend MPEG regularly (mostly lvc lately), there's a chance we’ve crossed paths!
mschuster91 · 3h ago
> Who would develop those codecs? A good video coding engineer costs about 100-300k USD a year. The really good ones even more. You need a lot of them.

How about governments? Radar, Laser, Microwaves - all offshoots of US military R&D.

There's nothing stopping either the US or European governments from stepping up and funding academic progress again.

rs186 · 2h ago
Yeah, counting on governments to develop codecs optimized for fast evolving applications for web and live streaming is a great idea.

If we did that we would probably be stuck with low-bitrate 720p videos on YouTube.

Reason077 · 1h ago
> "Patents on h264, h265, and even mp3 have been holding the industry back for decades. Imagine what we might have if their iron grip on codecs was broken."

Has AV1 solved this, to some extent? Although there are patent claims against it (patents for technologies that are fundamental to all the modern video codecs), it still seems better than the patent & licensing situation for h264 / h265.

philistine · 58m ago
At least for MP3, our collective nightmare is over. MP3 is completely patent-unencumbered and can be used freely.
jbverschoor · 4h ago
Enough codecs out there. Just no adoption.
rs186 · 2h ago
Not all codecs are equal, and to be honest, most are probably not optimized/suitable for today's applications, otherwise Google wouldn't have invented their own codec (which then got adopted widely, fortunately).
egeozcan · 4h ago
This might be an oversimplification, but as a consumer, I think I see a catch-22 for new codecs. Companies need a big incentive to invest in them, which means the codec has to be technically superior and safe from hidden patent claims. But the only way to know if it's safe is for it to be widely used for a long time. Of course, it can't get widely used without company support in the first place. So, while everyone waits, the technology is no longer superior, and the whole thing fizzles out.
jbverschoor · 3h ago
JXL has been around for years.

AV1 for 7.

The problem is every platform wants to force their own codec and earn royalties from the rest of the world.

They're literally sabotaging it. JXL support even got removed from Chrome.

Investment in adopting it in software is next to zero.

In hardware it's a different story, and I'm not sure to what extent which codec can be properly accelerated.

Taek · 3h ago
Companies only need a big incentive to invest in new codecs because creating a codec that has a simple incremental improvement would violate existing patents.
wheybags · 4h ago
Yes, because mpeg got there first, and now their dominance is baked into silicon with hardware acceleration. It's starting to change at last but we have a long way to go. That way would be a lot easier if their patent portfolio just died.
TiredOfLife · 2h ago
Because every codec has 3+ different patent pools wanting rent. Each with different terms.
fidotron · 2h ago
The fact h264 and h265 are known by those terms is key to the other part of the equation: the ITU Video Coding Experts Group has become the dominant forum for setting standards going back to at least 2005.
marcodiego · 2h ago
> My Christian Catholic education made and still makes me think that everybody should have a mission that extends beyond their personal interests.

I remember this same guy complaining investments in the MPEG extortionist group would disappear because they couldn't fight against AV1.

He was part of a patent mafia and is only lamenting that he lost power.

Hypocrisy in its finest form.

maxloh · 2h ago
Any link to his comment?
marcodiego · 2h ago
> all the investments (collectively hundreds of millions USD) made by the industry for the new video codec will go up in smoke and AOM’s royalty free model will spread to other business segments as well.

https://blog.chiariglione.org/a-crisis-the-causes-and-a-solu...

He is not a coder, not a researcher; he is only part of the worst game there is in this industry: making money from patents and "standards" you need to pay for in order to use, implement, or claim compatibility with.

cnst · 1h ago
His argument is blatantly invalid.

He first points out that a royalty-free format was actually better than the patent-pending alternative that he was responsible for pushing.

In the end, he concludes that the progress of video compression would stop if developers can't make money from patents, providing a comparison table of codec improvements that conveniently omits the aforementioned royalty-free codec being better than the commercial alternatives pushed by his group.

Besides the above fallacy, the article is simply full of boasting about his own self-importance and religious connotations.

DragonStrength · 1h ago
You missed the first part of that quote:

> At long last everybody realises that the old MPEG business model is now broke

And the entire post is about how dysfunctional MPEG is and how AOM rose to deal with it. It is tragic to waste so much time and money only to produce nothing. He's criticizing the MPEG group and their infighting. He's literally criticizing MPEG's licensing model and the leadership of the companies in MPEG. He's an MPEG member saying MPEG's business model is broken yet no one has a desire to fix it, so it will be beaten by a competitor. Would you not want to see your own organization reform rather than die?

Reminder AOM is a bunch of megacorps with profit motive too, which is why he thinks this ultimately leads to stalled innovation:

> My concerns are at a different level and have to do with the way industry at large will be able to access innovation. AOM will certainly give much needed stability to the video codec market but this will come at the cost of reduced if not entirely halted technical progress. There will simply be no incentive for companies to develop new video compression technologies, at very significant cost because of the sophistication of the field, knowing that their assets will be thankfully – and nothing more – accepted and used by AOM in their video codecs.

> Companies will slash their video compression technology investments, thousands of jobs will go and millions of USD of funding to universities will be cut. A successful “access technology at no cost” model will spread to other fields.

Money is the motivator. Figuring out how to reward investment in pushing the technology forward is his concern. It sounds like he is open to suggestions.

marcodiego · 1h ago
A business model that was always a force slowing down development, implementation and adoption is not something that should be "fixed". MPEG dying is something to celebrate, not whine about.
DragonStrength · 59m ago
Could you please point to the whining? He says MPEG is broken, but AOM will stagnate. You’re mad at the messenger.
dostick · 4h ago
The article does not give much beyond what you already read in the title. What obscure forces, and how? Isn't it an open standards non-profit organisation? Then what could possibly hinder it? Maybe because technologically closed standards became better and a nonprofit project has no resources to compete with commercial standards? The USB Alliance has been able to work things out, so maybe compression standards should be developed in a similar way?
baobun · 4h ago
Supposedly the whole story is told in their linked book.
eggspurt · 3h ago
From Leonardo, who founded MPEG, on the page linked: "Even before it has ceased to exists, the MPEG engine had run out of steam – technology- and business wise. The same obscure forces that have hijacked MPEG had kept it hostage to their interests impeding its technical development and keeping it locked to outmoded Intellectual Property licensing models delaying market adoption of MPEG standards. Industry has been strangled and consumers have been deprived of the benefits of new technologies. From facilitators of new opportunities and experiences, MPEG standards have morphed from into roadblocks."
mananaysiempre · 1h ago
One detail for context: when "closing" MPEG, he also deleted all of its pages and materials and redirected them to the AI stuff.
karel-3d · 4h ago
I... don't understand how AI relates to video codecs. Maybe because I don't understand either video codecs or AI on a deeper level.
tdullien · 3h ago
Every predictor is a compressor, every compressor is a predictor.

If you're interested in this, it's a good idea to read about the Hutter Prize (https://en.wikipedia.org/wiki/Hutter_Prize) and go from there.

In general, lossless compression works by predicting the next (letter/token/frame) and then encoding the difference from the prediction in the data stream succinctly. The better you predict, the less you need to encode, the better you compress.

The flip side of this is that all fields of compression have a lot to gain from progress in AI.
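
As a toy illustration of that predict-then-encode idea (my own sketch, not anything from the article): predict each sample from the previous one, keep only the prediction errors, and let a generic entropy coder handle the rest. The better the predictor fits the data, the smaller the residual stream compresses.

    import zlib
    import numpy as np

    # A smooth signal that a "next sample equals the previous one" predictor models well.
    signal = (128 + 100 * np.sin(np.linspace(0, 20, 10_000))).astype(np.uint8)

    prediction = np.roll(signal, 1)
    prediction[0] = 0                          # nothing to predict from yet
    residual = signal - prediction             # uint8 arithmetic wraps mod 256

    print(len(zlib.compress(signal.tobytes())))    # compressing the raw samples
    print(len(zlib.compress(residual.tobytes())))  # compressing the prediction errors: far smaller

    # Lossless: the decoder reruns the predictor and adds the errors back.
    assert np.array_equal(np.cumsum(residual, dtype=np.uint8), signal)

Video codecs do the same thing with much better predictors (motion compensation, intra prediction) and real entropy coders instead of zlib, which is also why better learned predictors translate directly into better compression.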

rahimnathwani · 17m ago
Also check out this contest: https://www.mattmahoney.net/dc/text.html

Fabrice Bellard's nncp (mentioned in a different comment) leads.

bjoli · 3h ago
It is like upscaling. If you could train AI to "upscale" your audio or video you could get away with sending a lot less data. It is already being done with quite amazing results for audio.
jl6 · 3h ago
It has long been recognised that the state of the art in data compression has much in common with the state of the art in AI, for example:

http://prize.hutter1.net/

https://bellard.org/nncp/

ddtaylor · 3h ago
Some view these as so interconnected that they will say LLMs are "just" compression.
pjc50 · 2h ago
Which is an interesting view when applied to the IP. I think it's relatively uncontroversial that an MP4 file which "predicts" a Disney movie which it was "trained on" is a derived work. Suppose you have an LLM which was trained on a fairly small set of movies and you could produce any one on demand; would that be treated as a derived work?

If you have a predictor/compressor LLM which was trained on all the movies in the world, would that not also be infringement?

mr_toad · 1h ago
MP4s are compressed data, not a compression algorithm. An MP4 (or any compressed data) is not a “prediction”, it is the difference between what was predicted and what you’re trying to compress.

An LLM is (or can be used as) a compression algorithm, but it is not compressed data. It is possible to have an overfit algorithm exactly predict (or reproduce) an output, but it's not possible for one to reproduce all the outputs, due to the pigeonhole principle.

To reiterate - LLMs are not compressed data.

Retr0id · 3h ago
AI and data compression are the same problem, rephrased.
oblio · 3h ago
Which makes Silicon Valley, the TV show, even funnier.
chisleu · 2h ago
holy shit it does. The scene with him inventing the new compression algorithm basically foreshadowed the gooning to follow local LLM availability.
selvan · 2h ago
Maybe we are a couple of years away from experiencing patent-free video codecs based on deep learning.

DCVC-RT (https://github.com/microsoft/DCVC), a deep learning based video codec, claims to deliver 21% more compression than H.266.

One of the compelling edge AI use cases is to create deep learning based audio/video codecs on consumer hardware.

One of the large/enterprise AI use cases is to create a coding model that generates deep learning based audio/video codecs for consumer hardware.

gcr · 17m ago
Goodbye MPEG group, and to be frank, good riddance I think. I'm glad that open codecs are now taking over on the frontier of SOTA encoding.

Maybe these sorts of handshake agreements and industry collaboration were necessary to get things rolling in 198x. If so, then I thank the MPEG group for starting that work. But by 2005 or so when DivX and XviD and h264 were heating up, it was time to move beyond that model towards open interoperability.

_bent · 1h ago
https://mpai.community/standards/mpai-spg

This makes zero sense, right? Even if this was applicable, why would it need a standard? There is no interoperability between game servers of different games

scotty79 · 2h ago
I think if IP rights holders were mandated to pay property tax it would make the system much healthier.
londons_explore · 2h ago
This. You should have to declare the value of a patent and pay 1% of that value every year to the government. Anyone else can force-purchase it for that value, while leaving you with a free perpetual license.
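
To make the incentive concrete, a tiny sketch with made-up numbers (the 1% rate is from this proposal, the rest is illustrative, and calling it "Harberger-style" is my label): declare too low and a rival can cheaply buy the patent out from under you, declare too high and the yearly tax eats you, so the honest move is to declare roughly what it's actually worth to you.

    # Declared-value ("Harberger-style") patent tax as proposed above: the holder
    # names a value V, pays TAX_RATE * V per year, and anyone may buy it for V.
    TAX_RATE = 0.01
    true_worth = 50_000_000   # made-up: what the patent is really worth to its holder

    for declared in (1_000_000, 50_000_000, 500_000_000):
        yearly_tax = TAX_RATE * declared
        buyout_risk = declared < true_worth   # a rival could profitably force-purchase it
        print(f"declare ${declared:>11,}: tax ${yearly_tax:>9,.0f}/yr, buy-out risk: {buyout_risk}")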
LeafItAlone · 1h ago
Wouldn’t that only help the “big guys” who can afford to pay the tax?
MyOutfitIsVague · 1h ago
Presumably the tax would be based on some estimated value of the property, and affordability would therefore scale.
scotty79 · 2h ago
> The same obscure forces that have hijacked MPEG had kept it hostage to their interests impeding its technical development and keeping it locked to outmoded Intellectual Property licensing models delaying market adoption of MPEG standards. Industry has been strangled and consumers have been deprived of the benefits of new technologies.

Copyright is cancer. The faster the AI industry runs it into the ground, the better.

rurban · 13m ago
Does he talk about Fraunhofer there? The guys who, subsidized by German taxpayers, started charging license or patent fees.

Or is it MPEG LA? https://wiki.endsoftwarepatents.org/wiki/MPEG_LA

knome · 2h ago
This has nothing to do with copyright. It is an issue of patents.
dathinab · 3h ago
sure, it being a 6-digit code which has potential for social engineering can be an issue

similar to getting a "your login" yes/no prompt in an authentication app, but a bit less easy to social engineer, and in turn also susceptible to brute-force attacks (similar to how TOTP is)

though on the other hand

- some stuff has such a low need for security that it's fine (like a configuration site for email newsletters or similar, where you have to have a mail-only based unlock anyway)

- if someone has your email they can do a password reset

- if you replace the email code with a login link you get some cross-device hurdles but fix some of the social engineering vectors (i.e. it's like a password reset on every login)

- you can still combine it with 2FA, which, if combined with a link instead of a PIN, is basically the password reset flow => should be reasonably secure

=> either way, that login was designed for very-low-security use cases where you also wouldn't ever bother with 2FA, as losing the account doesn't matter; IMHO don't use it for something else :smh:

cpcallen · 3h ago
Did you mean to post this comment at https://news.ycombinator.com/item?id=44819917 ?
dathinab · 1h ago
yes, that is embarrassing
mschuster91 · 3h ago
I think you misplaced this comment and it belongs here: https://news.ycombinator.com/item?id=44819917