The related post on performance optimization is extremely interesting, in particular, the considerations drawn while moving from the unsafe ported code to safe¹:
> The first performance issue we hit was dynamic dispatch to assembly, as these calls are very hot. We then began adding inner mutability when necessary but had to carefully avoid contention. We found as we removed pointers and transitioned to safe Rust types that bounds checks increasingly became a larger factor. Buffer and structure initialization was also an issue as we migrated to safe, owned Rust types.
Based on their conclusions², each of those issues amounts to a few percentage points (total: 11%).
Based on the article, it seems that with highly complex logic, safety does come at the cost of raw performance, and it can be very hard to compensate (within the safety requirements).
> Based on the article, it seems that with highly complex logic, safety does come at the cost of raw performance, and it can be very hard to compensate (within the safety requirements).
In Rust. These are Rust issues, not issues with safety in general.
The issue with bounds checks, for example, is entirely avoidable if you prove that all your accesses are within bounds before compiling; the same goes for partial initialization.
The core issue is that the strategies Rust adopts to ensure memory safety are neither a panacea nor necessarily the right solution in every case. That being said, I think it's a very nice idea to try to write a decoder in Rust and have a bounty for optimization. Rust is popular so work on producing fast and safe Rust is good.
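To make the bounds-check point concrete, here is a minimal sketch (hypothetical functions, not code from rav1d or dav1d) of the difference between paying a check on every access and giving the optimizer a single up-front proof it can reuse:

```rust
// `sum_checked` indexes with a loop variable the compiler cannot relate to
// the slice length, so each `data[i]` carries a bounds check.
fn sum_checked(data: &[u32], n: usize) -> u32 {
    let mut total = 0u32;
    for i in 0..n {
        total = total.wrapping_add(data[i]); // checked on every iteration
    }
    total
}

// `sum_hinted` does one up-front check: `&data[..n]` panics if n is out of
// range, and afterwards the optimizer knows the slice length is exactly n,
// so the loop body needs no further checks.
fn sum_hinted(data: &[u32], n: usize) -> u32 {
    let slice = &data[..n]; // single bounds check here
    let mut total = 0u32;
    for &x in slice {
        total = total.wrapping_add(x);
    }
    total
}

fn main() {
    let data = [1u32, 2, 3, 4, 5];
    assert_eq!(sum_checked(&data, 5), 15);
    assert_eq!(sum_hinted(&data, 5), 15);
    println!("ok");
}
```

Both versions are safe Rust; the difference is only in how much the optimizer can prove locally, which is exactly the distinction the bounds-check discussion turns on.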
HPsquared · 16m ago
If there isn't already a name for this compile-time safety checking in Rust, I vote we call it "passivation".
pizza234 · 22m ago
> The issue with bounds checks, for example, is entirely avoidable if you prove that all your accesses are within bounds before compiling; the same goes for partial initialization.
The situation is more nuanced. The article dedicates a section to it:
> The general idea in eliding unnecessary bounds checks was that we needed to expose as much information about indices and slice bounds to the compiler as possible. We found many cases where we knew, from global context, that indices were guaranteed to be in range, but the compiler could not infer this only from local information (even with inlining). Most of our effort to elide bounds checks went into exposing additional context to buffer accesses.
(extensive information given in that section)
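One shape that "exposing additional context to buffer accesses" can take, sketched with a hypothetical `Frame` type (illustrative names, not from the rav1d codebase): validate an index once at an API boundary and hand callers a slice whose length the compiler already knows, so inner loops over it need no further checks.

```rust
// Illustrative sketch: check row bounds once at the boundary so downstream
// code operates on a slice the compiler can see is in range.
struct Frame {
    pixels: Vec<u8>,
    width: usize,
}

impl Frame {
    // One checked computation; callers get a slice of exactly `width`
    // bytes, so iterating over it incurs no per-pixel bounds checks.
    fn row(&self, y: usize) -> Option<&[u8]> {
        let start = y.checked_mul(self.width)?;
        let end = start.checked_add(self.width)?;
        self.pixels.get(start..end)
    }
}

fn main() {
    let frame = Frame { pixels: vec![7u8; 8], width: 4 };
    let row = frame.row(1).expect("row 1 is in range");
    // Summing the returned slice needs no further checks.
    let sum: u32 = row.iter().map(|&p| u32::from(p)).sum();
    assert_eq!(sum, 28);
    assert!(frame.row(2).is_none()); // 8..12 exceeds the 8-byte buffer
    println!("ok");
}
```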
IshKebab · 45m ago
> Our Rust-based rav1d decoder is currently about 5% slower than the C-based dav1d decoder (the exact amount differs a bit depending on the benchmark, input, and platform). This is enough of a difference to be a problem for potential adopters
I'm really surprised that a 5% performance degradation would lead people to choose C over Rust, especially for something like a video codec. I wonder if they really care or if this is one of those "we don't want to use Rust for silly reasons, and here's a reasonable-sounding but actually irrelevant technical justification" cases...
topspin · 25m ago
Developers fight tooth and nail to get every bit of performance from video codecs because it goes directly to battery life and heat on a scale of billions of devices. You can't handwave away a 5% performance drop as if this is some recipe app. People pore over microamp power analyzers and high-resolution thermographs because they "really care."
bArray · 25m ago
I can think of a few use cases:
1. Desktop - If both implementations behave the same but one is faster, you run the faster one to stop the decode stuttering in those borderline cases.
2. Embedded - Where resources are limited, you still go for the faster one, even if it might one day lead to a zero-day, because you've weighed up the risk: reducing the BOM is an instant win, while some unknown future code flaw is hard to factor in.
3. Server - You accept media from unknown sources, so you are sandboxed anyway. Losing 5% of computing resources adds up to big $ over a year at enough scale. At YouTube, for example, decoding and then re-encoding could cost millions of dollars a year in compute.
Some other resistances:
1. Energy - If you have software being used in many places over the world, that cost saving is significant in terms of energy usage.
2. Already used - If the C implementation is working without issue, there would be high resistance to spending engineering time putting a slower implementation in.
3. Already C/C++ - If you already have a codebase using the same language, why would you now include Rust into your codebase?
4. Bindings - Commonly used libraries use the C version and are slow to change. The default may remain the C version in the likes of ffmpeg.
Benjamin_Dobell · 16m ago
> especially for something like a video codec
Why especially video decoders?
> I wonder if they really care or if this is one of those "we don't want to use Rust because of silly reasons and here's are reasonable-sounding but actually irrelevant technical justification"...
I would have thought video decoders are specifically one of the few cases where performance really is important enough to trump language-guaranteed security. They're widely deployed and need to work in a variety of environments: everything from low-power mobile devices to high-throughput cloud infrastructure. They also need to be low latency for live broadcast/streaming.
That's not to say security isn't a concern. It absolutely is, especially given the wide variety of deployment targets. However, video decoders aren't something that necessarily needs to continually evolve over time. If you prioritize secure coding practices and pair that with some formal/static analysis, then you ought to be able to squeeze out more performance than Rust. For example, Rust may be inserting bounds checks on repeated access, whereas a C program could potentially validate this sort of information just once up front and pass the "pre-validated" data structure around (maybe even across threads), "knowing" that it's valid data. Yes, there's a security risk involved, but it may be worth it.
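The "validate once, trust thereafter" approach described above can be expressed even in Rust, at the cost of an `unsafe` block; this is an illustrative sketch (hypothetical function, not code from any real decoder), and it mirrors what the equivalent C would do implicitly:

```rust
// Sketch: one up-front validation, then unchecked accesses in the hot
// loop. This trades a compiler-enforced guarantee for a programmer-
// enforced one, which is exactly the C-style bargain under discussion.
fn sum_prevalidated(data: &[u8], n: usize) -> u32 {
    assert!(n <= data.len()); // the single up-front validation
    let mut total = 0u32;
    for i in 0..n {
        // SAFETY: i < n <= data.len(), established by the assert above.
        total += u32::from(unsafe { *data.get_unchecked(i) });
    }
    total
}

fn main() {
    assert_eq!(sum_prevalidated(&[1, 2, 3, 4], 4), 10);
    println!("ok");
}
```

In practice the optimizer can often elide the checks in the safe version anyway, so unchecked access like this only pays off where the compiler genuinely cannot see the invariant.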
lgl · 34m ago
I may be wrong but if you're one of the "big guys" doing video then a 5% performance difference probably translates into millions of $ in the CPU/GPU bill
> I'm really surprised that a 5% performance degradation would lead people to choose C over Rust
I'm really surprised that because something is in Rust and not in C, it would lead people to ignore a 5% performance degradation.
Seriously... when you get something that's 5% faster especially in the video codec space, why would you dismiss it just because it's not in your favorite language... That does sound like a silly reason to dismiss a faster implementation.
edude03 · 13m ago
> just because it's not in your favorite language.
Kind of a strawman argument, though. The question is: is the 5% difference (today) worth the memory safety guarantees? I.e., would you be OK if your browser used 5% more power displaying video, if it meant you couldn't be hacked via a memory safety bug?
mvanbaak · 6m ago
No, it wouldn't
Because it also means your battery drains 5% faster, it gets hotter, you will need to upgrade your media player device etc etc etc.
Seen on the scale of the actual deployment, this is HUGE.
VWWHFSfQ · 38m ago
I think latency sensitive applications will usually prefer better performance and deal with safety issues, as opposed to better safety and deal with performance issues.
So I doubt it's any religious thing between C and Rust.
Ragnarork · 15m ago
$20K sounds very low for the effort and expertise that are demanded here in my opinion. It would be quite a steal to bring this to the same level as the state of the art (which, correct me if I'm wrong, but I believe is dav1d?) for only that sum.
jebarker · 14m ago
I assume they're hoping to nerd snipe someone
viraptor · 8m ago
And they did it well. If I was still a student without a family, I'd sink an unreasonable number of nights into this.
wslh · 10m ago
Yes, but you left out many excellent candidates.
jebarker · 1h ago
I'd love it if someone started Kaggle for software optimization
degurechaff · 3h ago
Just curious, why are Asian people not eligible?
rationably · 1h ago
It says nothing about "Asian people". Verbatim quote, in full:
> The contest is open to individuals or teams of individuals who are legal residents or citizens of the United States, United Kingdom, European Union, Canada, New Zealand, or Australia.
greggsy · 30m ago
Interestingly, if you follow through to the full T&Cs [1], they add exclusions:
> ...not located in the following jurisdictions: Cuba, Iran, North Korea, Russia, Syria, and the following areas of Ukraine: Donetsk, Luhansk, and Crimea.
Showing that the only explicit exclusions are aimed at the usual gang of comprehensively sanctioned states.
Still doesn't explain why the rest of the world isn't in the inclusions list. Maybe they don't want to deal with a language barrier by sticking to the Anglosphere... plus EU?
bpicolo · 21m ago
Maybe because the laws for giving away money are complicated? There's a tax and reporting burden.
washadjeffmad · 1h ago
Nationality, not ethnicity.
Turks are Asian. Russians are Asian. Indians are Asian. Etc.
They were probably just wondering why it's limited to Five Eyes + EU.
bArray · 42m ago
It'll likely be to do with financial responsibility due to where the funding comes from. They have an obligation to check that they are not sending funds to a terrorist group to solve code bounties, etc.
qalmakka · 1h ago
It's not that simple to run a bounty program; my uninformed guess is that they are almost definitely targeting a number of jurisdictions whose laws they are familiar with and/or where they have some kind of representative.
GolDDranks · 1h ago
As a resident of Japan, I thought the exact same thing. (I'm also a citizen of an EU country, which would permit me to participate, but most of my colleagues couldn't.)
cess11 · 1h ago
Probably legal and possibly political reasons that come with the source of money.
bluGill · 48m ago
This pays for at most a week of work. I doubt it is worth anyone's time unless they would do it for free anyway. Between the risk that someone else does it first and gets the reward, and the fact that anyone trying to make a living must spend time finding the next thing, it just isn't much.
If you can fund someone for at least 6 months of work, it becomes reasonable to work for these.
afavour · 42m ago
Man, I’d have gotten into Rust earlier if I knew the salaries were a minimum of $960,000
bluGill · 2m ago
You didn't account for overhead. Your take-home pay from projects like this is around $120,000; if you are any good, you can do much better elsewhere just by getting a full-time developer job in the Midwest. (The Bay or senior-level positions pay more.)
Sure, when you work you make a lot of money, but you end up spending the vast majority of your time looking for the next gig, and that is all unpaid.
Mashimo · 45m ago
> This pays for at most a week of work.
Does that mean that Rust devs earn 60k monthly minimum?
demarq · 14m ago
I hear Steve Klabnik doesn’t go to the gas station, he just drives into the Mercedes dealership.
BetaMechazawa · 41m ago
> This pays for at most a week of work
What kind of lunatic would pay that kind of money for a dev for a single week
leonheld · 38m ago
What are you talking about? 25~30k is almost a year's salary for a dev in Eastern Europe/South America.
sam_lowry_ · 11m ago
My first-hand Eastern European experience tells me that you should refresh your expectations. €50-60k is barely within the acceptable range for a mid-senior Rust developer. You'd have to throw in quite a few perks, like 100% remote work, to lure someone to work for this money.
Hamuko · 45m ago
Is this yet another "every developer on the globe makes around the same as I do" post?
[¹]: https://www.memorysafety.org/blog/rav1d-performance-optimiza...
[²]: https://www.memorysafety.org/blog/rav1d-performance-optimiza...
[1] https://www.memorysafety.org/rav1d-bounty-official-rules/