I don't know, arguing that HTTP/2 is safer overall is a... bold claim. It is sufficiently complex that there is no implementation in the Python standard library, and even third-party library support is all over the place: requests doesn't support it, and httpx has experimental, partial, pre-1.0 support. Python HTTP/2 servers are virtually nonexistent. And it's not just Python - I remember battling memory leaks, catastrophic deadlocks, and more in the grpc-go implementation of HTTP/2, in its early days.
HTTP 1.1 connection reuse is indeed more subtle than it first appears. But http/2 is so hard to get right.
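For what it's worth, the client-side story in Python today looks roughly like this - a minimal sketch using httpx's optional http2 extra (the URL is just a placeholder):

    # Requires: pip install 'httpx[http2]'  (pulls in the h2 package)
    import httpx

    # http2=True enables ALPN negotiation of HTTP/2 over TLS; the client
    # silently falls back to HTTP/1.1 if the server doesn't offer h2.
    with httpx.Client(http2=True) as client:
        resp = client.get("https://example.org/")  # placeholder URL
        print(resp.http_version)  # "HTTP/2" if negotiated, else "HTTP/1.1"

The point stands that this is an optional extra rather than something the standard library gives you.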
jcdentonn · 40m ago
Not sure about servers, but we've had HTTP/2 clients in Java for a very long time.
jiehong · 18m ago
nghttp2 is a C lib that can be used to build an HTTP/2 server in many cases. Rust has the h2 crate.
Perhaps it isn't that easy, but the work could be pooled into a common implementation and reused just about everywhere.
cyberax · 32m ago
An HTTP/2 client is pretty easy to implement. Built-in framing removes a lot of the parsing complexity, and if you don't need multiple streams, you can simplify the overall state machine.
Perhaps something like an "HTTP/2-Lite" profile is in order? A minimal profile with just one connection, no compression, and so on.
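To illustrate the framing point: every HTTP/2 frame starts with a fixed 9-byte header, so the receiver never has to guess where a message ends the way HTTP/1.1 parsers do with Content-Length vs. chunked bodies. A rough sketch of parsing that header (field layout per RFC 9113; the function name is just for illustration):

    import struct

    def parse_frame_header(buf: bytes):
        """Parse the fixed 9-byte HTTP/2 frame header (RFC 9113, section 4.1)."""
        if len(buf) < 9:
            raise ValueError("need at least 9 bytes")
        length = int.from_bytes(buf[0:3], "big")     # 24-bit payload length
        frame_type = buf[3]                          # 8-bit type (0x4 = SETTINGS)
        flags = buf[4]                               # 8-bit flags
        stream_id = struct.unpack(">I", buf[5:9])[0] & 0x7FFFFFFF  # clear reserved bit
        return length, frame_type, flags, stream_id

    # e.g. the header of an empty SETTINGS frame on stream 0:
    print(parse_frame_header(bytes.fromhex("000000040000000000")))  # (0, 4, 0, 0)

A single-stream profile with the HPACK dynamic table sized to zero would plausibly get you most of the way to such a "lite" state machine.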
spenczar5 · 29m ago
Isn't the original post about servers? A minimal client doesn't help with server security.
I would endorse your idea, though, speaking more broadly! That does sound useful.
I'll note articles about HTTP/2 vulnerabilities have been posted with some regularity here: https://news.ycombinator.com/item?id=44909416
The section "How secure is HTTP/2 compared to HTTP/1?" (https://portswigger.net/research/http1-must-die#how-secure-i...) responds to this. In short, there's an entire known class of vulnerabilities that affects HTTP/1 but not HTTP/2, and it's not feasible for HTTP/1 to close the entire vulnerability class (rather than playing whack-a-mole with bugs in individual implementations) because of backwards compatibility. The reverse isn't true; most known HTTP/2 vulnerabilities have been the kind of thing that could also have happened to HTTP/1.
Is there a reason you don't find this persuasive?
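To make the vulnerability class concrete: the article's attacks hinge on two HTTP/1.1 parsers disagreeing about where a request body ends. A sketch of the classic ambiguity, for illustration only (placeholder host, not a working exploit):

    # The same raw bytes, read by two different HTTP/1.1 parsers.
    # A front-end honoring Content-Length sees one request with the 6-byte
    # body b"0\r\n\r\nX". A back-end honoring Transfer-Encoding sees an
    # empty chunked body and leaves "X" in its buffer as the prefix of the
    # *next* request on the reused connection. HTTP/2's binary framing has
    # no equivalent ambiguity: every DATA frame carries an explicit length.
    ambiguous = (
        b"POST / HTTP/1.1\r\n"
        b"Host: example.com\r\n"            # placeholder host
        b"Content-Length: 6\r\n"
        b"Transfer-Encoding: chunked\r\n"
        b"\r\n"
        b"0\r\n\r\n"
        b"X"
    )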
mittensc · 22m ago
The article is a nice read on request smuggling.
It over-reaches with its argument for disallowing HTTP/1.1.
Parsers should be better.
Moving to another protocol won't solve the issue: the new implementations will be written by the same careless engineers, so the same companies will have the same issues or worse...
We just lose readability/debuggability/accessibility.
superkuh · 57m ago
> If we want a secure web, HTTP/1.1 must die.
Yes, the corporations and institutions and their economic transactions must be the highest and only priority. I hear that a lot from commercial people with commercial blinders on.
They simply cannot see beyond their context and realize that the web, and HTTP/1.1, is used by human people who don't have the same use cases or incredibly stringent identity-verification needs. Human use cases don't matter to them because they are not profitable.
Also, this "attack" only works on commercial-style complex CDN setups. It wouldn't affect human-hosted webservers at all. So yeah, commercial companies: abandon HTTP, go to your HTTP/3 with its UDP-only, CA-TLS-only, no-self-signing, no-cleartext world. And leave the actual web, HTTP/1.1 over HTTP and HTTPS, alone.
GuB-42 · 23m ago
Yes!
Let's get real, online security is mostly a commercial thing. Why do you think Google pushed so hard for HTTPS? Do you really think it is to protect your political opinions? No one cares about them, but a lot of people care about your credit card.
That's where I disagree with the people who made Gemini, a "small web" protocol for people who want to escape the modern web with its ads, tracking and bloat. They made TLS a requirement. Personally, I would have banned encryption. There is a cost, but it is a good way to keep commercial activity out.
I am not saying that the commercial web is bad - it may be the best thing that has happened in the 21st century so far - but if you want to escape from it for a bit, I'd say plain HTTP is the way to go.
Note: of course, if you need encryption and security in general for non-commercial reasons, use it, and be thankful to the commercial web for helping you with that.
jsnell · 30m ago
The author is only arguing against HTTP/1.1 for use between proxies and backends. Explicitly so:
> Note that disabling HTTP/1 between the browser and the front-end is not required
layer8 · 18m ago
It requires rather careful reading to understand that. Most of the site sounds like they want to eliminate HTTP/1.1 wholesale.
cyberax · 40m ago
> Also, this "attack" only works on commercial-style complex CDN setups. It wouldn't affect human-hosted webservers at all.
All you need is a faulty caching proxy in front of your PHP server. Or maybe that nice anti-bot protection layer.
It really, really is easy to get bitten by this.