15,000 lines of verified cryptography now in Python

332 todsacerdoti 86 4/18/2025, 7:28:44 PM jonathan.protzenko.fr ↗

Comments (86)

chrisrodrigue · 9h ago
There's no mention of what Python version this is actually in.

After some digging, it looks like the answer is 3.14 [0], so we won't be seeing this until October [1].

One could argue that this is a security fix (just read the first sentence of the blog post) and should be included in all the currently supported versions (>=3.9) of Python [2].

[0] https://github.com/python/cpython/blob/main/Doc/whatsnew/3.1...

[1] https://peps.python.org/pep-0745/

[2] https://devguide.python.org/versions/

ashishb · 7h ago
Many famous libraries like spaCy do not support Python 3.13 and are stuck on Python 3.12 (Oct 2023).

So, even if this comes out in Python 3.14, any non-trivial project will have to wait till Oct 2026 (or Oct 2027) to be able to use it.

1 - https://github.com/explosion/spaCy/issues/13658

arp242 · 7h ago
Good grief that issue is a clusterfuck of bozos.

Sometimes I wish there was a GitHub with an entry exam. "A library you use has a bug, and you find a 3 month old bug report for your exact issue. Do you 1) add a comment with "me too", 2) express your frustration this issue hasn't been fixed yet, 3) demand the maintainers fix it as soon as possible as it's causing issues for you, or 4) do nothing".

Only half joking.

Hasnep · 1h ago
The Refined GitHub extension [1] automatically hides comments that add nothing to the discussion. [2]

[1] https://github.com/refined-github/refined-github [2] https://github-production-user-asset-6210df.s3.amazonaws.com...

rowanG077 · 6h ago
I actually find it hugely helpful that so many people are actively expressing that they're hitting this. It's not easy to get an idea of what issues people are actually hitting with anything you have made. An issue being bumped with essentially "me too" is a highly valuable signal.
arp242 · 6h ago
Do you really think the maintainers don't understand that "doesn't work with Python 3.13" isn't going to affect tons of people?

There's some bozo asking "any news? I cant downgrade because another lib requirement" just two days after the maintainer wrote several paragraphs explaining how difficult it is to make it work with Python 3.13. This adds no value for anyone and is just noise. Anyone interested in actual useful information (workarounds, pointers on how to help) has to wade through a firehose of useless nonsense to get at anything remotely useful. Any serious discussion among maintainers is constantly interrupted by the seagulls from Finding Nemo: "fix? fix? fix? fix? fix?"

Never mind the demanding nature of some people in that thread.

Just upvote. That's why this entire feature was added.

throwaway519 · 9m ago
To the non-technical founder, this is doing something.

They will not move to ~~mocking up~~ sketching a wireframe of something.

madars · 2h ago
After seeing the "Jigar Kumar" cognitive exploits on the xz mailing list, a maintainer would be excused (and I'd even say, encouraged) to just ignore pressure tactics altogether. It's an open source project: if it works, great; if it breaks, you get to keep both pieces.
hkt · 6h ago
Reactions, though.
stingraycharles · 6h ago
It’s not just that, it’s people commenting “this is unacceptable” and “I hate this library” that add very little value.

Also, you can upvote issues as well / leave reactions without leaving comments. It ensures a better signal:noise ratio in the actual discussion.

bagels · 4h ago
You're missing

5) Do 1, 2, and 3

6) Submit a tested pull request with a fix

efitz · 3h ago
Except when you do 6 the maintainer rejects the request and says that it wasn’t a bug at all, and everyone is stuck with the bug.
matheusmoreira · 3h ago
Pull requests should not be normalized. The guy in charge of the code base has complete knowledge of it and can always do a better job than randoms making ad hoc changes to quick fix their own problems. Complaining about bugs and missing features is much easier. It's less of a headache and requires less effort and time investment.

Truth is developers don't want to end up maintaining other people's code anyways. It's gotten to the point I roll my eyes whenever I see people saying "patches welcome". Chances are they're not as welcome as you would think. "Show us the code", and when you do... Nothing. Submitting a pull request is no guarantee that they'll accept it. Hell it's no guarantee they'll even give you the time of day. You can put in a whole lot of effort and still end up getting refused or even straight up ignored.

Even if it's an unambiguous improvement on the status quo. They'll commit code that literally returns "not implemented" directly to master and actually ship it out to users. Your pull request will certainly be held to much higher standards though. And that's if they engage with you in an attempt to get your patches merged. They might just decide to ignore your patch and redo all the work themselves, which is exactly what all the complainers wanted them to do in the first place! If they're really nice they'll even credit you in the commit message!

If you're gonna put in effort into someone else's project, just fork it straight up and make it your own project. Don't waste time with upstream. Don't wait for their blessing. Just start making your own version better for you. Don't even submit it back, they can pull it themselves if they want to.

bagels · 43m ago
Boo to you. I've up-streamed patches to scipy and a few other python packages. Nobody ever said no to genuine high quality bug reports that were coupled with good fixes. Not saying that will happen every time, but if you're unsure, ask how receptive they are.
LtWorf · 6m ago
We should normalise GOOD pull requests. And possibly avoid making garbage ones that fail 100% of the testsuite.
throwaway519 · 6m ago
Truth.
rtpg · 3h ago
I was going to write something glib about getting things fixed but that thread looks gnarly!

To be honest I know so many people who use Pydantic and so many people who seem to get stuck because of Pydantic 2. I’m glad I have minimal exposure to that lib, personally.

I suppose the biggest issue really is type annotation usage by libs being intractable

pbronez · 7h ago
Just ran into something similar with Great Expectations. Python 3.12 is the newest I can run.
ashishb · 2h ago
Exactly, many compiled languages like Java and Go do not suffer from this issue.
fiddlerwoaroof · 1h ago
Tell that to all the companies stuck on Java 8 with old versions of Spring and Hibernate. This is the cost of an ecosystem where major libraries make breaking changes.
lmm · 48m ago
Spring and Hibernate broke the rules of the language and paid the price, and nevertheless all the companies I'm aware of managed to migrate the best part of a decade ago.
LtWorf · 4m ago
It's not about being compiled or not compiled. Python is now making breaking changes on every release instead of piling up a bunch of them and making python 4.

So what we get is a mini python2/3 situation on every single release instead.

0cf8612b2e1e · 7h ago
uv seems to have reverted to defaulting to 3.12 instead of 3.13. Which I fully endorse owing to how many packages are not yet compatible.
devrandoom · 6h ago
> One could argue

How?

chrisrodrigue · 5h ago
From https://github.com/python/cpython/issues/99108#issue-1436673...:

> As evidenced by the recent SHA3 buffer overflow, cryptographic primitives are tricky to implement correctly. There might be issues with memory management, exceeding lengths, incorrect buffer management, or worse, incorrect implementations in corner cases.

This is a proactive fix for zero days that may be lurking in the wild.

bgwalter · 9h ago
Reading the article, they took a verified C library generated from F* from Microsoft, vendored the code in CPython and wrote a C extension.

And during the process they discovered that the original library did not handle allocation failures?

What is the big deal about Python here? It's just another wrapped C library.

quantumgarbage · 8h ago
A seamless, drop-in replacement for all Python users is actually a pretty good feat tbh
anon6362 · 9h ago
In general, it's good practice to use the best components available, regardless of source. Cryptographic components especially shouldn't be DIY.

It would be cool if Ruby did something similar for stdlib/openssl, but it could also be done in a gem.

nine_k · 8h ago
> just another wrapped C library.

I suppose you mean "just another top-notch library available in the Python ecosystem"? :)

bolognafairy · 8h ago
What part of how this is being conveyed do you disagree with? Specifically.

This isn’t a language war.

neuroelectron · 7h ago
The headline makes it sound like it was written in Python, which would be a lot more readable than 15,000 lines of C.
frumplestlatz · 6h ago
The 15k lines of C are generated. The implementations are written in (and formally verified using) F*, and then the C is generated from the F* code using KaRaMeL.

One should be able to trust the proofs of correctness and never look at (or maintain) the generated C directly.

- https://fstar-lang.org

- https://github.com/FStarLang/karamel

- https://github.com/hacl-star/hacl-star

myko · 4h ago
This is fantastic. Somewhere my CS prof is smiling
Retr0id · 9h ago
Will this bring support for "streaming" output from SHAKE?

Here's the (recently closed!) issue for pyca/cryptography regarding this feature, although I can't find an equivalent issue for the python stdlib: https://github.com/pyca/cryptography/issues/9185

Edit: Found a relevant issue, "closed as not planned": https://github.com/python/cpython/issues/82198
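For context, a minimal sketch of what the stdlib currently exposes for SHAKE: `digest(n)` always squeezes from the beginning, so you can only ever get prefixes of one output stream, never resume squeezing where a previous call left off (which is what "streaming" output would mean):

```python
import hashlib

h = hashlib.shake_256()
h.update(b"some input")

# Each digest(n) call re-squeezes from the start, so shorter
# outputs are prefixes of longer ones. There is no API to
# continue squeezing from where a previous call stopped.
first_16 = h.digest(16)
first_32 = h.digest(32)
assert first_32[:16] == first_16
```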

IlikeKitties · 9h ago
Modern, ubiquitous cryptography is now practically unbreakable (even for the NSA) and widely used. The crypto wars of the 90s seem rather quaint. Any thoughts on the effects on society this now has?
xyzzy123 · 9h ago
It's cool that we can largely "strike out" link level wiretapping from our threat models but it just means attackers move on to the endpoints. I have a wonderfully private link to google, my bank and my crypto exchange but all of those are coerced to report everything I do.
TacticalCoder · 9h ago
> ... but it just means attackers move on to the endpoints.

Yup but this doesn't scale anywhere near as well for the attackers.

FabHK · 3h ago
Sorry, if there are N clients, and M servers, then there are N+M endpoints, but N*M links, which is a lot more.
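Spelled out with made-up numbers (purely illustrative, not real traffic figures):

```python
# Hypothetical scale, just to illustrate the asymmetry:
n_clients = 1_000_000
m_servers = 10_000

endpoints = n_clients + m_servers  # N + M machines to compromise
links = n_clients * m_servers      # N * M possible wiretap points

# The link count dwarfs the endpoint count.
assert links > endpoints
```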
pyfon · 7h ago
Except for countries that host companies that hold all the endpoints.
artursapek · 6h ago
Most internet traffic is consolidated through a small number of providers like Cloudflare and AWS.
hkt · 6h ago
Cloudflare being, if it wants to be, an epic MITM given its control of DNS and its role as WAF. Link-level surveillance barely matters now.
anon6362 · 9h ago
This is so vague as to be meaningless, because the body of research (attacks and hw-accelerated implementations), details, implementations, uses, and adversary budgets all matter. Furthermore, the obsession with optimizing for "fast" cryptographic algorithms/implementations undermines itself when it comes to the cost of CPU-bound brute force attacks, which become cheaper and more attainable over time.
Analemma_ · 8h ago
I think modern cryptography is basically unbreakable if used correctly, but there is still a lot of work to do re: developer ergonomics and avoiding footguns in implementations. This is much better than it used to be, thanks to things like libsodium, moving away from RSA, and newer protocols that de-emphasize cipher negotiation, but there is still more to do, even with “good” crypto primitives. For example, AES used perfectly is probably unbreakable but AES-GCM has ways to bite the unwary; ideally we should think about an even newer symmetric block cipher or increasing awareness of better AES modes without GCM’s drawbacks.
gpcz · 9h ago
For now. If someone makes a practical quantum computer, pretty much every asymmetric primitive we use at the start of a cryptographic protocol to make a shared secret for symmetric encryption will be breakable.
dist-epoch · 8h ago
The switch to post-quantum encryption already started - Signal, Chrome, iMessage
matheusmoreira · 2h ago
> Any thoughts on the effects on society this now has?

I have observed two effects.

They are constantly trying to make it illegal for the common man to use cryptography. Constantly. Cryptography is subversive. It has the power to defeat judges, armies, nations. Authorities don't want normal people to have access to this. It will increasingly become a tool of corporations and governments.

Authorities are bypassing cryptography instead by attacking the end points. It's easier to crack other parts of the system and exfiltrate data after it's been decrypted than to break the cryptography. Easier to hack a phone and get the stored messages than to intercept end-to-end encrypted ciphertext and decrypt it.

jakeogh · 8h ago
I just got bit by device attestation. Tried to install the latest eBay app version via the Aurora Store (on GrapheneOS), and instead of presenting me with the option of using my eBay login, it wanted a Google account and a Play Store login with no way to bypass. I was able to downgrade to the previous version, which does not require the Integrity API, but the walls are closing in. Only took 7 months: https://news.ycombinator.com/item?id=41517159 (I did get eBay on the phone and filed an issue; hopefully others do the same)
imiric · 9h ago
Why are you so certain of this? The NSA has a long history of attempting to insert backdoors in cryptographic systems. Most recently they bribed RSA to make their compromised PRNG the default, which shipped in software as late as 2014, possibly even later.

And these are just the attempts we know about. What makes you think that they haven't succeeded and we just don't know about it?

IlikeKitties · 8h ago
We know from the Snowden leaks that they couldn't crack PGP at the time. There have been some talks about cracking "export grade" cryptography and how that is done. I'm pretty confident that the newer hardened crypto libraries are pretty secure. And since even the NSA now recommends Signal (because the US infrastructure has so many holes that the Chinese are in the entire mobile internet infrastructure), that's a pretty strong indication that it's very hard, if not impossible, to crack Signal, at least for the Chinese.

It's also very likely that even if they can attack crypto in ways we don't know about, they can at best reduce the required time it takes to crack a given key. Choosing extra long keys and changing them frequently should protect you from lots of attacks.

mmooss · 8h ago
> It's also very likely that even if they can attack crypto in ways we don't know about, they can at best reduce the required time it takes to crack a given key.

Why do you say that? Very often the vulnerabilities are not in the mathematics but in the implementation.

If Signal works, as was pointed out during the recent scandal, it only protects messages in transit. The application on your phone is not especially secure and anyone who can access your phone can access your Signal messages (and thus any of your correspondents' messages that they share with you).

> that's a pretty strong lead that it's very hard if not impossible to crack signal, at least for the Chinese.

The NSA does not recommend Signal for classified communication.

The NSA thinks use of Signal is in the best interest of the US government, as the NSA perceives those interests (every institution will have its own bias). It could be that Signal is the least insecure of the easily available options, or that the NSA believes that only they can crack Signal.

commandersaki · 7h ago
> If Signal works, as was pointed out during the recent scandal, it only protects messages in transit. The application on your phone is not especially secure and anyone who can access your phone can access your Signal messages (and thus any of your correspondents' messages that they share with you).

Device compromise is outside the threat model for Signal or any software for that matter.

> Why do you say that? Very often the vulnerabilities are not in the mathematics but in the implementation.

This is why we use primitives that are well understood and have straightforward implementations. I'm no expert, but you can look at the design of, say, Salsa20 or Curve25519 -- it was designed for straightforward constant time implementation. Yes, the NIST suite has implementation subtleties such as constant time requirements, but those ward off issues that are of pretty low concern / infeasible to attack (I've yet to see a timing channel or cache side channel be exploited in the wild, except for the Xbox 360 MAC verification). Also, CFRG has been publishing guidance, so we know how to properly implement KEMs, (hybrid) encryption, PAKE, hash to curve, signatures, ECDH, ECDSA, etc. Compound this with a lot of footgun-free crypto libraries such as Go crypto, libsodium, monocypher, BoringSSL, Tink, etc., and these days you'd be hard pressed to make a misstep cryptography-wise.

In my opinion, the NSA's advantage is not that it has a virtually unlimited budget; it's that they have a better vantage point to carry out multi-user attacks. So do the likes of FAANG, Fastly, Cloudflare, etc.

mmooss · 6h ago
> Device compromise is outside the threat model for Signal or any software for that matter.

I agree about Signal - that's what they say iirc. Some software does take it into account. The point here is about security depending on much more than cryptography mathematics; Signal is just an example.

IlikeKitties · 8h ago
> Why do you say that? Very often the vulnerabilities are not in the mathematics but in the implementation.

I recommend this talk: https://www.youtube.com/watch?v=v8Pma5Bdvoo This gives you a good example how practical attacks and intentional weakening of crypto works.

And especially for common cryptos like AES and RSA you can easily compare the outputs of different implementations. If one is different, that one is suspect. And especially for open source crypto like the OP, the implementation can easily be verified.
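One concrete way to do that comparison is checking an implementation against published test vectors; a minimal stdlib sketch using the FIPS 180-2 vector for SHA-256 of "abc":

```python
import hashlib

# Published FIPS 180-2 test vector: SHA-256("abc")
KNOWN = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"

# Any conforming implementation must reproduce this exactly;
# a mismatch means the implementation under test is suspect.
assert hashlib.sha256(b"abc").hexdigest() == KNOWN
```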

mmooss · 7h ago
You describe implementation as easy, but in practice it takes hard-to-find expertise and lots of resources. In the case of the OP, look what it took to get a secure implementation into Python, as late as the year 2025.
mmooss · 8h ago
> What makes you think that they haven't succeeded and we just don't know about it?

Yes, afaik they also have a history of not revealing exploits they discover. With a budget in the tens of billions and tens of thousands of employees, they have the resources to discover quite a bit.

badmonster · 1h ago
how reusable is the generic streaming verification framework beyond cryptographic hashes?
pluto_modadic · 4h ago
does this mean anything that imports the cryptography modules needs to include G++ or something weird, or is it compiled into cpython itself?
__s · 3h ago
The latter. Most of CPython is C; you don't need a compiler to run it.
hall0ween · 10h ago
I'm cryptographically ignorant. What does this mean for python?
otterley · 10h ago
https://en.wikipedia.org/wiki/Formal_verification

Provable correctness is important in many different application contexts, especially cryptography. The benefit for Python (and for any other user of the library in question) is fewer bugs and thus greater security assurance.

If you look at the original bug that spawned this work (https://github.com/python/cpython/issues/99108), the poster reported:

"""As evidenced by the recent SHA3 buffer overflow, cryptographic primitives are tricky to implement correctly. There might be issues with memory management, exceeding lengths, incorrect buffer management, or worse, incorrect implementations in corner cases.

The HACL* project https://github.com/hacl-star/hacl-star provides verified implementations of cryptographic primitives. These implementations are mathematically shown to be:

    memory safe (no buffer overflows, no use-after-free)
    functionally correct (they always compute the right result)
    side-channel resistant (the most egregious variants of side-channels, such as memory and timing leaks, are ruled out by construction)."""
hathawsh · 10h ago
Here is how I read it: CPython has a hashlib module that has long been a wrapper around certain functions exported from OpenSSL, but since a SHA3 buffer overflow was discovered in OpenSSL, the CPython project has now decided to wrap a library called HACL*, which uses formal verification to ensure there are no buffer overflows and other similar bugs.
jessekv · 10h ago
aseipp · 10h ago
Extreme ELI5 TL;DR: Your Python programs using the CPython interpreter and its built-in cryptographic modules will now be safer and faster, with no need for you to do anything.
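In other words, nothing in user code has to change; a quick sketch of the same stdlib calls that now route through HACL* under the hood:

```python
import hashlib
import hmac

# Same stdlib API as before the HACL* swap; only the backend changed.
sha3 = hashlib.sha3_256(b"hello").hexdigest()
blake = hashlib.blake2b(b"hello", digest_size=32).hexdigest()
mac = hmac.new(b"secret-key", b"message", hashlib.sha256).hexdigest()
```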
kccqzy · 9h ago
Who uses its built-in cryptographic modules though? I wrote a fair bit of cryptographic code in Python more than five years ago and most people recommended the cryptography package https://pypi.org/project/cryptography/ over whatever that's built-in.
Retr0id · 9h ago
I'm a big fan of pyca/cryptography and I use it for any serious project, but if I just need hashing I tend to use the standard library hashlib - it saves a somewhat heavy dependency, and the API is less verbose.

Also, pyca/cryptography uses OpenSSL. OpenSSL is fine, but has the same "problem" as the previous python stdlib implementation. (Personally I think it's an acceptable risk. If anything, swapping in 15,000 lines of new code is the greater risk, even if it's "verified")
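To illustrate the verbosity difference (the pyca side is shown only as a comment so this sketch stays stdlib-only):

```python
import hashlib

# stdlib hashlib: a one-liner, or incremental for large inputs.
one_shot = hashlib.sha256(b"hello world").hexdigest()

h = hashlib.sha256()
h.update(b"hello ")
h.update(b"world")
incremental = h.hexdigest()

assert one_shot == incremental

# Roughly equivalent pyca/cryptography ceremony:
#   from cryptography.hazmat.primitives import hashes
#   d = hashes.Hash(hashes.SHA256())
#   d.update(b"hello world")
#   digest = d.finalize()
```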

frumplestlatz · 6h ago
I'm curious why you put "verified" in scare-quotes, and why you think adopting formally verified code is a greater risk.
Retr0id · 6h ago
I don't think formal verification is bad, I just don't think it's a silver bullet. i.e. we should not get complacent and assume that formally verified code is 100% safe.
odo1242 · 10h ago
Is it faster? I’m pretty sure the main goal of this effort is just the “safer” part.
protz · 10h ago
The performance was pretty much identical. However, as an added benefit, Blake2 got quite a bit faster due to a combination of 1) our code being slightly more optimized, and 2) Python's Blake2 integration not actually doing runtime CPU detection, meaning that unless you compiled with -march=native (like Gentoo), at the time, you were not getting the AVX2 version of Blake2 within Python -- my PR fixed that and added code for CPU autodetection

bear in mind that performance is tricky -- see my comment about NEON https://github.com/python/cpython/pull/119316#issuecomment-2...

aseipp · 10h ago
The goal is to make things safer, yes, but speed is absolutely a major priority for the project and a requirement for production deployment, because the difference in speed for optimized designs vs naive ones might be an order of magnitude or more. It's quite speedy IME. To be fair to your point, though, it's also a moving target; "which is faster" can change as improvements trickle in. Maybe "shouldn't be much slower" is a better statement, but I was speaking generally :)

(You could also make the argument that if previous implementations that were replaced were insecure that their relative performance is ultimately irrelevant, but I'm ignoring that since it hinges on a known unknown.)

drewcoo · 10h ago
And safer is often slower to avoid timing attacks.
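A classic stdlib example of that trade-off: comparing MACs with `==` can short-circuit on the first differing byte and leak timing, so `hmac.compare_digest` deliberately runs in time independent of where the inputs differ:

```python
import hashlib
import hmac

key, msg = b"secret-key", b"message"
expected = hmac.new(key, msg, hashlib.sha256).digest()
received = hmac.new(key, msg, hashlib.sha256).digest()

# `expected == received` may return early on a mismatch;
# compare_digest's runtime doesn't depend on the contents.
assert hmac.compare_digest(expected, received)
```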
aseipp · 10h ago
I mean, most if not all of the code they're replacing (e.g. the vendored and vectorized Blake2 code) was also designed and optimized with timing attacks in mind and implemented to avoid them. CVE-2022-37454 was literally found in a component that was optimized and designed with timing attack resistance in mind (XKCP, from the SHA-3 authors). "Code that is already known to be wrong" is not really a meaningful baseline to compare against.
tsecurity · 9h ago
How much of the development of this was verified, and what does that consist of?

I worry a little when I read that things are verified and were hard.

aseipp · 8h ago
https://eprint.iacr.org/2017/536.pdf is the relevant paper that introduces the project and its broad design. Figure 1 on page 3 is a good place to look.
rtkwe · 9h ago
The first two shouldn't matter, because the entirety of the code is verified and anyone can check the verification. The last is an issue with any cryptography; verification doesn't try to address it, only to ensure that the code does precisely and only what it's supposed to, i.e. it should guarantee that there are no exploits possible against that library.
thebeardisred · 10h ago
Lines of code is a pretty poor measurement. Doubly so when you're boasting a large number in the context of cryptographic code.
bbeonx · 10h ago
This isn't really a metric. It's a description to help the reader understand the magnitude of effort that went into this project. SLoC is a bad metric for plenty of things, but I think it's fine for quickly conveying the scope of a project to a blog reader.
odyssey7 · 10h ago
Lines of code is also a poor indicator for “magnitude of effort.”

Tangent: generally I’m more inclined to believe quality is improved when someone claims 1000s of lines reduced rather than 1000s of lines written.

chaitimes · 9h ago
From my experience, pre LLMs, it was a valid proxy metric for effort
xeromal · 10h ago
See: AI generating 1000s of lines
aseipp · 10h ago
It isn't boasting about anything, it's a straightforward description of the 2.5 years of effort they went through for this project, and some of the more annoying "production integration" bits.
bbstats · 10h ago
the 2nd sentence:

"As of last week, this issue is now closed, and every single hash and HMAC algorithm exposed by default in Python is now provided by HACL*, the verified cryptographic library."

it's describing coverage.

daquisu · 8h ago
Which better measurement do you propose?
titaphraz · 9h ago
> Lines of code is a pretty poor measurement

But let's say at least that it's an abused, exploited, perverted and molested measurement before we call it a poor one. It's not all that bad when you have context.

Tarucho · 10h ago
Not saying it isn't useful, but with this change Python's crypto depends on Microsoft (https://fstar-lang.org/ and https://project-everest.github.io/)
NavinF · 9h ago
> HACL* is a formally verified library of modern cryptographic algorithms, where each primitive is verified for memory safety, functional correctness, and secret independence. HACL* provides efficient, readable, standalone C code for each algorithm that can be easily integrated into any C project.

> All the code in this repository is released under an Apache 2.0 license. The generated C code from HACL* is also released under an MIT license.

You have an unusual definition of "depends on Microsoft". Anyone worried about depending on Microsoft should be able to maintain 15k lines of C that are already formally verified. Python already vendored the code so who cares who wrote that code?
