The Math Behind GANs (2020)
130 points by sebg 8/28/2025, 11:42:35 AM 26 comments (jaketae.github.io)
https://proceedings.mlr.press/v137/kavalerov20a/kavalerov20a...
It turns out that two classes is a special case. It's better to add the classes as side information rather than trying to make them part of the main objective.
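Concretely, "side information" can mean a conditional discriminator that takes the label as an extra input rather than adding a classification term to the loss. A rough sketch (my own illustration in PyTorch, not the architecture from the linked paper):

    # Conditional discriminator: the class label enters as side information (an embedding),
    # while the adversarial objective stays a single real/fake score.
    import torch
    import torch.nn as nn

    class CondDiscriminator(nn.Module):
        def __init__(self, img_dim=784, num_classes=10, emb_dim=64):
            super().__init__()
            self.label_emb = nn.Embedding(num_classes, emb_dim)
            self.net = nn.Sequential(
                nn.Linear(img_dim + emb_dim, 256),
                nn.LeakyReLU(0.2),
                nn.Linear(256, 1),  # one real/fake logit; no class head in the main objective
            )

        def forward(self, x, y):
            # concatenate the flattened image with the label embedding
            h = torch.cat([x.view(x.size(0), -1), self.label_emb(y)], dim=1)
            return self.net(h)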
Last time I used a GAN was in 2015, still interesting to see a post about GANs now and then.
Yes, you can just concentrate on the latest models, but if you want a better grounding in the field, some understanding of the past is important. In particular, reusing ideas from the past in a new way, and/or with better software/hardware/datasets, is a common source of new developments!
Though if you can rephrase the problem as a diffusion model, that seems to be preferred these days (less prone to mode collapse).
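For a sense of what "rephrasing as a diffusion" looks like on the training side, here's a rough sketch of the standard DDPM-style noise-prediction loss (illustrative only; the model and the noise schedule alpha_bar are placeholders):

    # DDPM-style training loss sketch: add noise at a random timestep, predict that noise.
    import torch
    import torch.nn.functional as F

    def diffusion_loss(model, x0, alpha_bar):
        # alpha_bar: (T,) cumulative products of the noise schedule, on the same device as x0
        t = torch.randint(0, alpha_bar.shape[0], (x0.size(0),), device=x0.device)
        eps = torch.randn_like(x0)
        a = alpha_bar[t].view(-1, *([1] * (x0.dim() - 1)))
        x_t = a.sqrt() * x0 + (1 - a).sqrt() * eps   # forward (noising) process
        return F.mse_loss(model(x_t, t), eps)        # train the model to predict the noise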
GANs are famously used for generative use cases, but they also have wide uses for creating useful latent spaces with limited data, and they show up in few-shot learning papers. (I'm actually not that up to speed on the state of the art in few-shot, so maybe there's something clever that replaces them.)
GANs were fun though. :)
In practice, when it comes down to code, even without higher-level libraries, it is surprisingly simple, concise, and intuitive; see the sketch below.
Most of the math elements used have quite straightforward properties and utility, but of course if you combine them all into big expressions with lots of single-character variables, it's really hard for anyone to understand. You kind of need to learn to squint and recognize the basic building blocks that the maths represents, but that wouldn't be necessary if it weren't obfuscated like this.
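To back up the "surprisingly simple" claim, here's roughly what a single GAN training step looks like, as a rough sketch of the usual BCE formulation (assuming PyTorch; G, D, and their optimizers are placeholders assumed to already exist):

    # Minimal GAN training step (illustrative sketch; G, D and the optimizers are assumed to exist).
    import torch
    import torch.nn.functional as F

    def gan_step(G, D, opt_G, opt_D, real, z_dim=100):
        bs = real.size(0)
        ones = torch.ones(bs, 1, device=real.device)
        zeros = torch.zeros(bs, 1, device=real.device)

        # 1) Train the discriminator to separate real samples from generated ones.
        z = torch.randn(bs, z_dim, device=real.device)
        fake = G(z).detach()
        loss_D = F.binary_cross_entropy_with_logits(D(real), ones) \
               + F.binary_cross_entropy_with_logits(D(fake), zeros)
        opt_D.zero_grad(); loss_D.backward(); opt_D.step()

        # 2) Train the generator to make the discriminator call its samples real.
        z = torch.randn(bs, z_dim, device=real.device)
        loss_G = F.binary_cross_entropy_with_logits(D(G(z)), ones)
        opt_G.zero_grad(); loss_G.backward(); opt_G.step()

        return loss_D.item(), loss_G.item()

Two losses, two optimizer steps, nothing else; the scary part is mostly in the notation.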
But my experience as a mathematician tells me another part of that story.
Certain fields are much more used to consuming (and producing) visual noise in their notation!
Some fields even have superfluous parts in their definitions and keep them around out of tradition.
It's just as with code: not everyone values writing readable code highly. Some are fine with 200-line function bodies.
And refactoring mathematics is even harder: There's no single codebase and the old papers don't disappear.
Maybe for Von Neumann math was simple...
They threw all the complex math into the paper. I could not initially understand it at all, despite having invented the damn algorithm!
Having said that, picking it apart and taking a little time with it, it actually wasn't that hard - but it sure looked scary and incomprehensible at first!
Also, there are libraries that abstract away most, if not all, of these things, so you don't have to know everything.
This statement is not true; there are counterexamples I encountered in my university studies, but I would say that intuition will get you very far. Einstein was able to come up with the special theory of relativity just by manipulating mental models, after all. Only when he tried to generalize it did he hit the limit of the claim I learned in school.
That being said, after abandoning intuition, relying on pure mathematical reasoning drives you to the desired place, and from there you can usually reason about the theorem in an intuitive way again.
The math in this paper is not that hard to learn; you just need someone to present the key idea to you.
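For what it's worth, the key idea usually boils down to the minimax game from Goodfellow et al. (2014), with generator G and discriminator D (written out here for reference):

    % The standard GAN minimax objective:
    % D is trained to maximize V, G to minimize it.
    \min_G \max_D \, V(D, G) =
        \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\bigl[\log D(x)\bigr]
      + \mathbb{E}_{z \sim p_z(z)}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]

D is just a binary classifier, and G is trained to make it misclassify; most of the remaining math is expectations and logarithms built around that one expression.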
I hope anyone who is unsure will read your comment and at least try to follow it for a while.