What Is the Fourier Transform?

398 rbanffy 177 9/4/2025, 10:11:09 PM quantamagazine.org ↗


oh_fiddlesticks · 8h ago
This video [1] from visual effects channel Captain Disillusion has an excellent visual illustration of how the Fourier transform works and how it's used in visual effects, in his video about blurring and unblurring ("ENHANCE!") images.

1. https://youtu.be/xDLxFGXuPEc?feature=shared

lock1 · 5h ago
While I like CD's works, I would say "CD / Blur" is the least informational of all the CD slash series. I guess it's a fun and more accessible option, but it certainly lacks depth compared to something like 3B1B's video on FT.
gus_massa · 2h ago
I like his videos and most of this particular video. For example, I agree with his choice not to discuss the nasty details of the FFT. Also, I think the "blur" part is fine, but the "unblur" part is too oversimplified.

I think he could have explained how the Gaussian filter almost kills all details / high-frequency features, then rounding completely destroys them, and then they cannot be magically reconstructed to "unblur". He gives some hints about this, but they are too few and too general.

PS: There are some non-linear tricks to "enhance" the "unblurred" version, like minimizing the L1 norm of the gradient (or something like that). It helps with some images. I'm not sure what the current state of the art is.

alkyon · 2h ago
I found this video from 3Blue1Brown more informative in terms of mathematics involved:

https://youtu.be/spUNpyF58BY?feature=shared

Edit: In fact it was already mentioned in the comments, but I hadn't noticed.

hybrid_study · 5h ago
The Carl Sagan bit is also an amusing tribute.
anyfoo · 18h ago
If you like Fourier, you're going to love Laplace (or its discrete counterpart, the z transform).

This took me down a very fascinating and intricate rabbit hole years ago, and is still one of my favorite hobbies. Application of Fourier, Laplace, and z transforms is (famously) useful in an incredibly wide variety of fields. I mostly use it for signal processing and analog electronics.

segfault99 · 16h ago
When I did EE, didn't have access to any kind of computer algebra system. Have 'fond' memories of taking Laplace transform transfer functions and converting to z-transform form. Expand and then re-group and factor. Used a lot of pencil, eraser and line printer fanfold paper for doing the very basic but very tedious algebra. Youngsters today don't know how lucky.. (ties onion to belt, etc., etc.)
taneq · 8h ago
Did you make sproingies from the tear-off side strips of the printer paper, though? That was the best bit. :P
segfault99 · 7h ago
Of course!
schlauerfox · 2h ago
I just recently got my Computer Engineering degree (the modern Electronics Engineering) and we had a whole class on transforms. We had to do it on paper, but that professor at Cal State LA knew what the heck she was doing. We learned it good.
zwnow · 8h ago
No worries, as a self-proclaimed youngster I didn't manage to understand Fourier in 2 days and never bothered again. Also had no prior knowledge of algebra, so maybe that's why I struggled. Never perceived algebra as useful in anything programming related, and will continue to do so, as most problems are solvable without it. I'll let the degree havers do all that stuff.
Sharlin · 5h ago
> Never perceived algebra as useful in anything programming related

Image, video, and audio processing and synthesis, compression algorithms, 2D and 3D graphics and geometry, physics simulation, not to mention the entire current AI boom that's nothing but linear algebra… yeah, definitely algebra isn't useful in anything programming related.

zwnow · 3h ago
Yea that's not what I work with my guy
Sharlin · 3h ago
You said "anything programming related", not "anything related to my work", "my guy".
hnuser123456 · 2h ago
So you're a programmer but you've never assigned a number to a variable or written any math operations? Do you just do string translations or something?
zwnow · 1h ago
I'm talking algebra you need a degree for. Well algebra u learn while getting one that is.
perching_aix · 7h ago
You might find LLMs to be a useful crutch for this to an extent, although it's very easy to take the wrong turn and go off into the deep end. But as long as you keep forcefully connecting it back to practical reality, you can get progress out of it. And of course, never actually make it calculate.
dsego · 5h ago
Have I got a video for you.

gingerBill – Tools of the Trade – BSC 2025 https://youtu.be/YNtoDGS4uak

IAmBroom · 1h ago
"Book learnin' didn't do me no good no how!"
armanj · 15h ago
Years ago, I often struggled to choose between Amazon products with high ratings from a few reviews and those with slightly lower ratings but a large volume of reviews. I used the Laplace Rule of Succession to code a browser extension to calculate Laplacian scores for products, helping to make better decisions by balancing high ratings with low review counts. https://greasyfork.org/en/scripts/443773-amazon-ranking-lapl...
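(To make the rule concrete: it amounts to adding one imaginary positive and one imaginary negative review before averaging. A minimal sketch — the mapping from star ratings to "successes" is my assumption, not necessarily what the extension does:)

```python
def laplace_score(avg_rating, n_reviews, max_rating=5):
    """Rule-of-succession smoothing: pulls sparsely reviewed products
    toward the midpoint by adding one pseudo-success and one pseudo-failure."""
    successes = (avg_rating / max_rating) * n_reviews
    return (successes + 1) / (n_reviews + 2) * max_rating

# A perfect 5.0 average from 3 reviews now scores below a 4.6 from 500 reviews:
few = laplace_score(5.0, 3)      # 4.0
many = laplace_score(4.6, 500)   # ~4.59
```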
CuriouslyC · 6h ago
Just for reference, in case you find yourself in an optimization-under-uncertainty situation again: the decision-theoretic way to do this is to form a Bayesian posterior over the true rating given the observed ratings, their count, and a prior on true ratings; add a loss function (for simplicity, it can just be the difference between the true rating of the selected item and that of the non-selected item); then choose the option that minimizes the expected loss. This produces exactly the correct answer.
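(A minimal Monte Carlo sketch of that procedure, assuming binary up/down ratings and a uniform Beta(1, 1) prior; the item numbers are made up:)

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_samples(ups, downs, n=100_000):
    # Beta posterior over the item's true approval rate, Beta(1,1) prior.
    return rng.beta(1 + ups, 1 + downs, size=n)

a = posterior_samples(ups=9, downs=1)      # 90% positive from 10 ratings
b = posterior_samples(ups=870, downs=130)  # 87% positive from 1000 ratings

# Loss of picking an item = how much worse its true rate is than the other's.
loss_pick_a = np.maximum(b - a, 0).mean()
loss_pick_b = np.maximum(a - b, 0).mean()
choice = "A" if loss_pick_a < loss_pick_b else "B"  # minimize expected loss
```

Here the large review count wins: B's posterior is tight around 0.87, while A's is wide, so the expected regret of picking A is larger.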
Shadowmist · 6h ago
I always assume that all the ratings are fake when there is a low count of ratings, since it is easy for the seller to place a bunch of orders to game the ratings when they are starting out.
anentropic · 33m ago
A bigger problem I find is many Amazon listings having a large number of genuine positive reviews, but for a completely different product than the one currently for sale.

Recently I was buying a chromecast dongle thing and one of the listings had some kind of "Amazon recommends" badge on it, from the platform. It had hundreds of 5 star reviews, but if you read them they were all for a jar of guava jam from Mexico.

I'm baffled why Amazon permits and even seemingly endorses this kind of rating farming

kragen · 12h ago
While this is a good idea, I think it's unrelated to the Laplace transform except that they're named after the same dude?
arethuza · 10h ago
When I think of Laplace Transforms I always think of control theory - poles, zeros etc.
kmarc · 10h ago
Probably that's why we are learning about it in the "Control Theory" classes at university. :-)

Jokes aside, I graduated as a "Computer Engineer" (BSc) and then also did a "Master in Computer Science". I was (young and) angry at the universe about the soooo many classical engineering classes and theory I had to sit through (control theory, electrical engineering, physics) while we never learned about the cool design patterns etc etc.

Today I see that those formative years helped me a lot with how I develop intuition when looking at large (software) systems, and I also understand that those ever-changing best design patterns I can just look up, learn, and practice in my free time (and could have back then, too).

I wish a today-me would have told my yesterday-me all this.

arethuza · 9h ago
I learned about it after I graduated with a CS degree - I mean in true university degree fashion we'd been taught about Laplace and Z transforms (and related things) but with no practical applications.

After graduating I joined an academic research team based mainly in an EE department who were mainly Control Engineers - we were mostly doing stuff around qualitative reasoning and using it for fault diagnosis, training etc.

arethuza · 3h ago
To be fair (and because I've just remembered - it was ~40 years ago) we did get some practical stuff covered in the maths part of my CS degree in the application of group theory (groups, rings & fields) to coding theory.
analog31 · 5h ago
My control theory professor (who was also my physics advisor -- it was a small college) explained it like this: Physicists like Fourier transforms because they go from minus to plus infinity, like the universe. Control engineers like Laplace transforms because they start at zero, and a control system also has a starting point.
zozbot234 · 8h ago
The so-called "Z transform" for discrete sequences is really just a misnomer for the actual method of generating functions (and formal power-series/Laurent-series). You just write a discrete sequence as a power series in z^(-1).
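(The parent's point is easy to check numerically — my own illustration: the Z transform of a[n] = (1/2)^n is literally the generating function evaluated in z^(-1), with geometric-series closed form z / (z - 1/2):)

```python
import numpy as np

def z_transform(a, z, terms=200):
    # Direct power series in z^(-1): sum over n of a[n] * z^(-n)
    n = np.arange(terms)
    return np.sum(a(n) * (1 / z) ** n)

z = 2.0 + 0.5j                       # any point with |z| > 1/2 (region of convergence)
series = z_transform(lambda n: 0.5 ** n, z)
closed_form = z / (z - 0.5)          # geometric series: 1 / (1 - 0.5 / z)
```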
srean · 1h ago
> really just a misnomer

No. Things acquire different names if they are independently discovered by different communities.

Native Americans were called Indians. Lol! What was that?

segfault99 · 7h ago
True dat. But you see there's this thing called 'Engineering Maths'. Apparently it's really bad for real mathematicians' blood pressure.
zozbot234 · 5h ago
Analytic combinatorics (the rubric where mathematicians would place all the region-of-convergence, zeros-poles, etc. analysis of generating functions / formal power and Laurent series / Z transforms that engineering often focuses on) is not exactly easy-going either. Other common methods (relating convolution to multiplication, inverting transforms, etc.) would traditionally fall under the Operational Calculus of Mikusiński.
artyom · 16h ago
Essentially that's what electrical/electronics engineering is about.
jojobas · 9h ago
Then there's the whole mindfuck of fractional order Fourier (and other) transforms.
kragen · 11h ago
This is maybe a good first thing to read if you've never heard of the Fourier Transform before, but it makes it sound a great deal more arbitrary and random than it actually is. It might set your understanding back by giving you the illusion that you understand things you don't actually understand, and that would be a shame, because some of those things are more beautiful than a sunrise or a hummingbird.

It would be very sad to lose those treasures by passing them by because you thought you already had them. As the song says:

> When you're young, you're always looking

> On the far side of the hill;

> You might miss the fairest flower

> Standing by your side very still.

And the flowers of Fourier analysis are as fair as the fairest flowers in this universe, or any universe.

https://news.ycombinator.com/item?id=45134843 may be a clue to the hidden beauty, for those who are puzzled as to what I might be talking about.

nerdsniper · 10h ago
As usual, 3 Blue 1 Brown delivers: https://youtu.be/spUNpyF58BY?si=nSqHf_3zbhyu9YGd
kragen · 10h ago
This is a much better introduction than the article, and it's only 20 minutes.
yshklarov · 18h ago
As everyone in this thread is sharing links, I'm gonna pitch in, too.

This lecture by Dennis Freeman from MIT 6.003 "Signals and Systems" gives an intuitive explanation of the connections between the four popular Fourier transforms (the Fourier transform, the discrete Fourier transform, the Fourier series, and the discrete-time Fourier transform):

https://ocw.mit.edu/courses/6-003-signals-and-systems-fall-2...

RachelF · 12h ago
I wonder what happened to wavelet transforms? They were very popular years ago, and now one never hears about them.
energy123 · 10h ago
The use-case is slightly different. Wavelets are suited for non-stationary signals, while the Fourier transform has no time localization, so it's more for stationary signals. Although the short-time Fourier transform exists, which can handle non-stationary signals under the assumption of local stationarity.

Also, a property of wavelets is they're non-parametric, which limits their utility in knowledge discovery applications.

For ML applications, my opinion is that they're somewhat superseded by deep learning methods that apply a less restrictive inductive bias. As data grows, the restrictive prior assumptions of wavelets will hurt, sort of like how CNNs are being abandoned for ViTs, even though CNNs can outperform in situations where data is scarce.

So overall, they have a pretty small set of usecases where they're more suited than other alternative tools.

acjohnson55 · 5h ago
They have specialized applications for sure. I think it's just not as hot an area for new applied math work as 20 years ago.
yshklarov · 12h ago
Really, do you think they've somehow fallen out of favor? If so, that's a surprise to me.

In any case, they are a bit more advanced, and out of scope for the undergraduate course I linked to.

mallowdram · 14h ago
Excellent! Thanks!
abetusk · 15h ago
I have a pet theory that the reason why the FT, and other transforms (generating functions, Mellin/Laplace/Legendre/Haar), are so useful is because many real world functions are sparse and lend themselves to compressed sensing.

The FT, like many other transforms, is 1-1, so, in theory, there's no information lost or gained. But in many real-world conditions, looking at a function in frequency space greatly simplifies the problem. Why? Pet theory: because many functions that look complex are actually composed of simpler building blocks in the transformed space.

Take the sound wave of a fly and it looks horribly complex. Pump it through the FT and you find a main driver of the wings beating at a single frequency. Take the sum of two sine waves and it looks a mess. Take the FT and you see the signal neatly broken into two peaks. Etc.

The use of the FT (or DCT or whatever) for JPEG, MP3 or the like basically exploits this fact, by noticing that the signal response of human hearing and vision isn't uniform, so the signal can be "compressed" by throwing away frequencies we don't care about.

The "magic" of the FT, and other transforms, isn't so much that it projects the signal onto a set of orthogonal basis functions, but that many signals we care about are actually formed from a small set of these basis functions, allowing the FT and its cousins to notice and separate them out more easily.
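(The two-sine-wave example is easy to reproduce — a quick numpy sketch; the frequencies and amplitudes are arbitrary choices:)

```python
import numpy as np

fs = 1000                          # sample rate in Hz
t = np.arange(fs) / fs             # one second of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# The time-domain sum looks like a mess; the spectrum is just two peaks.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
peaks = sorted(freqs[np.argsort(spectrum)[-2:]])   # the two largest bins
```

The two dominant bins land exactly at 50 Hz and 120 Hz, neatly separating the components.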

mitthrowaway2 · 14h ago
As mentioned by other commenters, a reason for the FT's dominance in particular is because sine, cosine, and complex exponentials are the eigenfunctions of the derivative operator. Since so many real-world systems are governed by differential equations, the Fourier Transform becomes a natural lens to analyze these systems. Sound waves are one (of many) examples.

And there's another good reason why so many real-world signals are sparse (as you say) in the FT domain in particular: because so many real-world systems involve periodic motion (rotating motors, fly's wings as you noted, etc). When the system is periodic, the FT will compress the signals very effectively, because every signal has to be a harmonic of the fundamental frequency.
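(The eigenfunction fact is easy to check numerically: differentiating e^(iωt) just multiplies it by iω. A small sketch using finite differences, so it's only approximate:)

```python
import numpy as np

omega = 3.0
t = np.linspace(0, 1, 10001)
f = np.exp(1j * omega * t)

# Numerical derivative; for an eigenfunction, df/dt divided by f is constant.
dfdt = np.gradient(f, t)
ratio = dfdt[1:-1] / f[1:-1]       # drop endpoints (one-sided differences there)
max_dev = np.abs(ratio - 1j * omega).max()   # ~0: d/dt acts as multiplication by i*omega
```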

abdullahkhalids · 14h ago
The question is why "so many real-world systems are governed by differential equations" and "so many real-world systems involve periodic motion".

Well, stable systems can either be stationary or oscillatory. If the world didn't contain so many stable systems, or equivalently if the laws of physics didn't allow it, then likely life would not have existed. All life is complex chemical structures, and they require stability to function. Ergo, by this anthropic argument, there must be many oscillatory systems.

kragen · 12h ago
Differential equations aren't limited to describing stable systems, though, and there are chaotic systems that are also in some sense stable.

Ordinary differential equations can describe any system with a finite number of state variables that change continuously (as opposed to instantaneously jumping from one state to another without going through states in between) and as a function of the system's current state (as opposed to nondeterministically or under the influence of the past or future or some kind of supernatural entity).

Partial differential equations extend this to systems with infinite numbers of variables as long as the variables are organized in the form of continuous "fields" whose behavior is locally determined in a certain sense—things like the temperature that Fourier was investigating, which has an infinite number of different values along the length of an iron rod, or density, or pressure, or voltage.

It turns out that a pretty large fraction of the phenomena we experience do behave this way. It might be tempting to claim that it's obvious that the universe works this way, but that's only because you've grown up with the idea and never seriously questioned it. Consider that it isn't obvious to anyone who believes in an afterlife, or to Stephen Wolfram (who thinks continuity may be an illusion), or to anyone who bets on the lottery or believes in astrology.

But it is at least an excellent approximation that covers all phenomena that can be predicted by classical physics and most of quantum mechanics as well.

As a result, the Fourier and Laplace transforms are extremely broadly applicable, at least with respect to the physical world. In an engineering curriculum, the class that focuses most intensively on these applications is usually given the grandiose title "Signals and Systems".

abdullahkhalids · 1h ago
I agree broadly with what you say. I didn't have time to make a more comprehensive comment.
jcgrillo · 6h ago
One amazing application of spectral theory I always harp on when this topic comes up is Chebfun[1]. Trefethen's Spectral Methods in Matlab is also wonderful.

[1] http://www.chebfun.org/

kragen · 5h ago
I haven't read it! Thanks for the recommendation!
seanhunter · 13h ago
That first question is a tautology. It’s like asking “Why is a screwdriver so perfect for turning screws?”

We have discovered a method (calculus) to mathematically describe continuous functions of various sorts, and within calculus there is a particular toolbox (differential and partial differential equations) we have built to mathematically describe systems that are changing by describing that change.

The fact that systems which change are well-described by the thing we have made to describe systems which change shouldn’t be at all surprising. We have been working on this since the 18th century and Euler and many other of the smartest humans ever devoted considerable effort to making it this good.

When you look at things like the chaotic behaviour of a double pendulum, you see how the real world is extremely difficult to capture precisely and as good as our system is, it still has shortcomings even in very simple cases.

EMIRELADERO · 12h ago
As an aside, here's a relevant video about the (sometimes not) chaotic nature of double pendulums: https://www.youtube.com/watch?v=dtjb2OhEQcU
gsf_emergency_2 · 12h ago
What ought to be surprising is that the "thing" itself doesn't change.

A learning that describes chaos well enough may not want to be associated with "calculus", or even "math" (ask a friendly reverse mathematician about that)

https://www.johndcook.com/blog/2021/04/09/period-three-impli...

Somewhat tangentially, if Ptolemy I had responded (to Euclid) with anything less specific ---but much more personal--- than "check your postulate", we wouldn't have had to wait one millennium.

(Fermat did the best he could given margin & ego, so that took only a century or so (for that country to come up with a workable strategy))

Less tangentially, I'd generalize Quigley by mentioning that groups of hominids stymie themselves with a kind of emergent narcissism. After all, heuristics, rules and even values informed by experience & intuition are a sort of arrogance. "Tautology" should be outlawed in favour of "Narcissism" as a prosocial gaslighting term :)

cycomanic · 13h ago
> The question is why "so many real-world systems are governed by differential equations" and "so many real-world systems involve periodic motion".

> Well, stable systems can either be stationary or oscillatory. If the world didn't contain so many stable systems, or equivalently if the laws of physics didn't allow it, then likely life would not have existed. All life is complex chemical structures, and they require stability to function. Ergo, by this anthropic argument there must be many oscillatory systems.

I would say that it's very difficult to imagine a world that would not be governed by differential equations. So it's not just that life wouldn't exist, it's that there wouldn't be anything like the laws of physics.

As a side note, chaotic systems are often better analysed in the FT domain, so even in a world of chaotic systems (and there are many in our world; I'd argue that if there weren't, life would not exist either) the FT remains a powerful tool.

im3w1l · 12h ago
> Well, stable systems can either be stationary or oscillatory.

In practice this is probably true, but I can see another possibility. The system could follow a trajectory that bounces around endlessly in some box without ever repeating or escaping the box.

abdullahkhalids · 1h ago
You can treat that, and scientists often do treat it, as a stationary system with some error bounds.

For example, the concept of homeostasis in biology is like this. Lots of things are happening inside the living body, but it's still effectively at a functional equilibrium.

Similarly, lots of dynamic things are happening inside the Sun (or any star), but from the perspective of Earth, it is more or less stationary, because the behavior of the sun won't escape some bounds for billions of years.

srean · 48m ago
If this box was of a bounded size then that trajectory would have interesting property - there are chunks of time you can edit out such that what remains will look as if they are converging on a point.

I suspect you will find ergodicity interesting.

AIPedant · 6h ago
On the simplest end of that spectrum, Taylor series are useful because many real-world dynamics can be approximated as a "primarily linear behavior" + "nonlinear effects."

(And cases where that isn't true can still be instructive - a Taylor series expansion for air resistance gives a linear term representing the viscosity of the air and a quadratic term representing displacement of volumes of air. For ordinary air the linear component will have a small coefficient compared to the quadratic component.)

yatopifo · 14h ago
As you noted, it’s about what’s important to us. The physical function may or may not be sparse, but our brain model is guaranteed to be sparse. A note played on a violin is anything but a sine function, yet our brains associate it with a single idealized tone. Our world model is super compressed.
schlauerfox · 2h ago
All these comments and no Steinmetz mention. The original papers he wrote applying the transforms to electrical engineering work are so genius and simplifying for the craft, and he's such a nice person for one of the founders of the field of EE. Also the origin of that old story about $10 for hitting the machine with a hammer and $9,990 for knowing where to hit it, from what I can gather.
IAmBroom · 1h ago
Steinmetz is an unsung hero of tech, except to well-educated EEs.

Was he a dwarf or a hunchback? I seem to recall a physical setback.

trgn · 2h ago
Can somebody eli5, I'm an amateur. How does the transform know the frequencies of the output? Do you have to specify a number n, and then it decomposes the signal into n frequencies? Or do you give it a list of frequencies, and then it computes the coefficient or amplitude or something for each?

I guess what I want to know is: in the examples it always shows like 3 or 4 constituent frequencies as output, but why not hundreds or a million? Is that decided upfront / parametrizable?

The article isn't helpful, it just says something like "all possible frequencies".

IAmBroom · 1h ago
I'll try.

Suppose you're blind, and are on one side of a black picket fence. Let's say the fence uprights are 1" wide, and 1" apart.

If you aim a light meter at the fence, it will sum up all the light of the visible slices of the background. (This meter reads its values out loud!)

Suppose the background is all white - the meter sees white slices, and gets a 50% reading (because the fence stripes are black). That's as high as you can get.

Now move the picket fence half an inch to the left - still all white, still a 50% reading. Now you know the background is 100% white, no other colors.

But if, when you move it, black stripes are exposed, then the meter reads 0%, and you know that the background is 50/50 white and black, in stripes.

Congratulations! You've just done a Fourier transform that can detect the difference between 0-frequency and a 1"-striped frequency pattern. In reality, you're going to continuously move that fence to the left, watching all the time, but this is simple.

But instead, imagine you got readings of 40% and 10%. What kind of pattern is that? Gray stripes? White stripes that aren't as wide as the fence posts? You'll have to build another fence, with another spacing - say 1/2" fence posts 1/2" apart. Repeat.

woopsn · 1h ago
If you take N samples of a real signal you will get N/2+1 bins of information from the DFT, covering 0Hz out to about half the sampling rate.

The bins do not actually measure a specific frequency, more like an average of the power around that frequency.

As you take more and more samples, the bin spacing gets finer and the range of frequencies going into the average becomes tighter (kind of). By collecting enough samples (at an appropriate rate), you can get as precise a measurement as you need around particular frequencies. And by employing other tricks (signal processing).

If you graph the magnitude of the DFT, signals that are a combination of power at just a few frequencies show just a few peaks, around the related bins. Eg a major chord would show 3 fundamental peaks corresponding to the 3 tones (then a bunch of harmonics)

https://en.wikipedia.org/wiki/Fourier_transform#/media/File:...

So you detect these peaks to find out what frequency components are present. (though peak detection itself is complicated)
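(The bin arithmetic above is easy to verify with numpy; the sample rate here is my arbitrary choice:)

```python
import numpy as np

fs = 8000                                   # assumed sample rate, Hz
small = np.fft.rfftfreq(256, d=1 / fs)      # N = 256 samples
large = np.fft.rfftfreq(1024, d=1 / fs)     # N = 1024 samples

# N real samples -> N//2 + 1 bins, spaced fs/N apart, so 4x the samples
# gives 4x finer bins: 129 bins 31.25 Hz apart vs 513 bins 7.8125 Hz apart.
```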

slickytail · 1h ago
In a discrete Fourier transform, the number of frequencies you get out is the number of datapoints you have as input. This is because any frequencies higher than that are impossible to know (i.e., they are above the Nyquist limit of half the sampling frequency).

But in the continuous Fourier transform, the output you get is a continuous function that's defined on the entire real line.

stared · 7h ago
Speaking of the Fourier transform, I recommend these visual explorable explanations: https://injuly.in/blog/fourier-series/index.html and https://www.jezzamon.com/fourier/
seanhunter · 12h ago
One thing I find fascinating about Fourier analysis is the way the trigonometric Fourier series played such a central role in “breaking mathematics”[1] and the crisis that led to providing a rigorous basis for limits and continuity and all the other stuff that is now called real and complex analysis.

Cauchy had just proved that the limit of the sum of an infinite set of continuous functions was itself continuous, and then along came Fourier with "are you sure about that bro?" and showed that you could take the infinite sum of very clearly continuous functions (just sine and cosine) and approximate something like a sawtooth function (which is very obviously discontinuous) as closely as you like.

[1] by which I mean making obvious the fact that they had been proceeding for 100+ years using calculus without a rigorous basis.
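(Fourier's construction is easy to replay numerically. A sketch of the partial sums for the sawtooth f(t) = t on (−π, π), evaluated at t = π/2, where the series reduces to the alternating series 2(1 − 1/3 + 1/5 − …) = π/2:)

```python
import numpy as np

def sawtooth_partial_sum(t, n_terms):
    # Fourier series of f(t) = t on (-pi, pi): sum of 2*(-1)^(k+1) * sin(k t) / k
    k = np.arange(1, n_terms + 1)
    return np.sum(2.0 * (-1.0) ** (k + 1) * np.sin(k * t) / k)

# Smooth sines summing toward a discontinuous sawtooth: away from the jump
# at t = pi, the partial sums converge to the function value.
err = abs(sawtooth_partial_sum(np.pi / 2, 1000) - np.pi / 2)
```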

chamomeal · 16h ago
So weird, I was just reading this article yesterday. I did an undergrad in physics and really miss this stuff. Ended up getting nostalgic and watching 3 blue 1 brown videos while drinking tequila.
stackbutterflow · 10h ago
> Fourier argued that the distribution of heat through the rod could be written as a sum of simple waves.

How do you even begin to think of such things? Some people are wired differently.

AIPedant · 2h ago
It didn't come completely out of nowhere, Euler and Bernoulli had looked at trigonometric series for studying the elastic motion of a deformed beam or rod. In that case, physical intuition about adding together sine waves is much more obvious. https://en.wikipedia.org/wiki/Euler%E2%80%93Bernoulli_beam_t...

Other mathematicians before Fourier had used trigonometric series to study waves, and physicists already understood harmonic superposition on eg a vibrating string. I don't have the source but I believe Gauss even noted that trigonometric series were a solution to the heat equation. Fourier's contribution was discovering that almost any function, including the general solution to the heat equation, could be modelled this way, and he provided machinery that let mathematicians apply the idea to an enormous range of problems.

HelloNurse · 9h ago
I think he was very familiar with differential equations and series expansions and the "wild west" stage of calculus in general. The frontier of cool and interesting mathematics has moved a lot in 200 years.
jaccola · 9h ago
Also these stories are "storified" over time, the reality is always messier (same with startup founding stories etc..).

A common mistake I see in people reading mathematics (or even computer science papers) is to think the proof set out in the paper is the thought process that led to the interesting insight. It is almost always an ex post facto formalisation.

laszlokorte · 18h ago
Shameless plug: If you are interested in Fourier Transform and signal processing you might enjoy my somewhat artistic 3D visualisation of the fourier transform as well as the fractional fourier transform [1]

(Fractional fourier transform on the top face of the cube)

And for the short-time Fourier transform, showing how a filter kernel is shifted across the signal. [2]

[1]: https://static.laszlokorte.de/frft-cube/

[2]: https://static.laszlokorte.de/time-frequency/

hovden · 17h ago
If I might also plug 'the Atlas of Fourier Transforms'. If you're interested in building intuition about symmetry and phase in Fourier space, the book illustrates many structures.
laszlokorte · 16h ago
Looks amazing! Thank you
nblgbg · 16h ago
Thanks a lot for all of this ! https://tools.laszlokorte.de/
laszlokorte · 16h ago
I am glad you are enjoying it! :)
yshklarov · 18h ago
I love the visualization! Thanks for sharing.

How do you compute the fractional FT? My guess is by interpolating the DFT matrix (via matrix logarithm & exponential) -- is that right, or do you use some other method?

laszlokorte · 18h ago
I am glad you like it!

Yes, the simplest way to think of it is to exponentiate the DFT matrix to an exponent between 0 and 1 (1 being the classic DFT). But then the runtime complexity is O(n^2) (vector multiplied with a precomputed matrix) or O(n^3), as opposed to the O(n log n) of the fast Fourier transform. There are tricks to do a fast fractional Fourier transform by multiplying and convolving with a chirp signal. My implementation is in Rust [1] compiled to WebAssembly, but it is based on the Matlab code of [2], whose author gladly answered all my mails asking many questions despite already being retired.

[1]: https://github.com/laszlokorte/svelte-rust-fft/tree/master/s...

[2]: https://nalag.cs.kuleuven.be/research/software/FRFT/
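(The "exponentiate the DFT matrix" idea can be sketched in a few lines via eigendecomposition. Caveat: this naive branch choice is not the canonical fractional FT, which pins the eigenvectors down as Hermite-function samples; it only shows the matrix-power mechanics:)

```python
import numpy as np

n = 8
F = np.fft.fft(np.eye(n), norm="ortho")      # unitary DFT matrix

# F^a = V diag(w**a) V^{-1}, for eigenvalues w and eigenvectors V.
w, V = np.linalg.eig(F)

def frft_matrix(a):
    return V @ np.diag(w ** a) @ np.linalg.inv(V)

# Applying "half a transform" twice reproduces the full DFT.
F_half = frft_matrix(0.5)
err = np.abs(F_half @ F_half - F).max()
```

This is the O(n^2)-per-transform route the comment mentions; the chirp-based algorithms avoid forming the matrix at all.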

xphos · 17h ago
I made a cool rust fft tui a long time ago too

https://github.com/lquinn2015/FFT-tui

laszlokorte · 16h ago
Looks great! I was already thinking about reimplementing mine as a TUI
mallowdram · 14h ago
Fantastic! Thanks!
eric-burel · 6h ago
What always bothered me when trying to "feel" Fourier transforms is that to compute the oscillations, you need to wait some time. Mathematically, the transformation involves computing integrals over the whole signal. So it's tricky to understand how you compute the Fourier decomposition of a stream. Illustrations always show the whole signal over time, but in real life you get the signal progressively. I'd be eager to read more on this.
thyristan · 6h ago
For a stream, you compute the FFT over a sliding window. The size of the window limits the lowest frequency you can 'see'; the sampling rate of the digital data likewise limits the highest. So there will be an upper and a lower frequency limit, and beyond those limits the results are meaningless.

And the window of course creates a latency, which is sometimes relevant for realtime audio filtering by FFT.

acjohnson55 · 5h ago
Check out https://en.m.wikipedia.org/wiki/Short-time_Fourier_transform

This is the "Fourier transform" that is most often used in practice in computing.

A longer window length gives lower frequency resolution, but longer latency.

jeremyscanvic · 5h ago
The keyword you're looking for is time-frequency analysis and the main associated tool is the short-time Fourier transform(s). This is the theory underlying spectrograms and all those niceties!
yobbo · 6h ago
As I remember intuitively, it's a convolution over a time window. The size of the time window limits the frequencies that can be detected.
dsego · 6h ago
You do it in short windows, so you take e.g. 512 samples and then run a short FFT. Or you can do longer windows that overlap, e.g. hop by 512 but take 1024; more samples give you more accurate results.
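A rough numpy sketch of that windowed scheme (Hann window, 1024-sample frames, hop of 512; the tone and sample rate here are made up for illustration):

```python
import numpy as np

def stft(x, frame=1024, hop=512):
    """Short-time FFT: slide a Hann window over the signal and take
    one spectrum per (overlapping) frame."""
    win = np.hanning(frame)
    starts = range(0, len(x) - frame + 1, hop)
    return np.array([np.fft.rfft(x[i:i + frame] * win) for i in starts])

fs = 8000
t = np.arange(fs) / fs                  # one second of "audio"
x = np.sin(2 * np.pi * 440 * t)         # a steady 440 Hz tone
S = stft(x)                             # one row of 513 bins per frame
```

The bin spacing is fs/frame, which is exactly the frequency-resolution limit the window size imposes.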
tzury · 14h ago
This is a great playlist by 3b1b about the subject:

https://www.youtube.com/watch?v=spUNpyF58BY&list=PL4VT47y1w7...

zkmon · 10h ago
The most shocking thing for me was the realization that frequency and wavelength are kind of dual forms. Of course, every concept and thing that comes into existence does so only in the context of its opposite or dual form. But recognizing that form is genius. The 3b1b video goes a bit deeper.

Also a related fact is that the rate of change of a sine wave is itself shifted by pi/2, while the exponential curves need no shifting. I'm not a professional in these matters, but I guess there is a deeper connection between the rate of change and the static world.

chrisweekly · 6h ago
As usual, BetterExplained.com has it covered: https://betterexplained.com/articles/an-interactive-guide-to...
snake42 · 4h ago
I was going to post this link as well. That site has really helped me understand many things.
prameshbajra · 9h ago
I love the content here but the website hijacks my default mouse highlight and it does not allow me to right click and save highlights which is mildly annoying. Only me?
Koshkin · 2h ago
> The waves that oscillate more quickly — meaning they have more energy

Isn't this essentially a quantum-mechanical concept?

IAmBroom · 1h ago
? It's true for all waves in all media.
SillyUsername · 6h ago
Amazing.

Until I read this article I didn't properly understand Fourier transforms (I didn't know how image compression bitmaps were derived), now it's opened a whole new world - toying with my own compression and anything continuous that can be represented as its constituent parts.

I can use it for colour quantisation too possibly to determine main and averaged RGB constituents with respect to hue, allowing colour reduction akin to dithering, spreading the error over the wave instead and removing the less frequent elements.

It may not work but it'll be fun trying and learning!

niemandhier · 13h ago
The Fourier transform is a change of basis in infinite-dimensional space.

That sounds complicated, but if you look at the integral, the exponential kernel is essentially a continuous "matrix" and the function you are integrating against that kernel is a continuous vector.

This observation can be a guide to better understanding infinite-dimensional Hilbert spaces (inner product space + stuff) and is one of the core observations in quantum mechanics, where it's part of the particle-wave concept as it transforms location space -> momentum space.

astrange · 16h ago
> A compression algorithm can then remove high-frequency information, which corresponds to small details, without drastically changing how the image looks to the human eye.

I slightly object to this. Removing small details = blurring the image, which is actually quite noticeable.

For some reason everyone really wants to assume this is true, so for the longest time people would invent new codecs that were prone to this (in particular wavelet-based ones like JPEG-2000 and Dirac) and then nobody would use them because they were blurry. I think this is because it's easy to give up on actually looking at the results of your work and instead use a statistic like PSNR, which turns out to be easy to cheat.

HarHarVeryFunny · 3h ago
Well, there are use cases for lossy compression as well as non-lossy, and nobody is saying they are the same. If you really need to heavily compress to reduce file size or transmission bandwidth then you'll likely need to use a lossy CODEC, so the question then becomes how can you minimize the reduction in perceived quality of whatever you are compressing (photos, video, audio), which comes down to how these various human sensory/perceptual systems work.

For vision we are much more sensitive to large scale detail (corresponding to low frequency FFT components) than fine scale detail (corresponding to high frequency components), so given the goal of minimizing reduction in perceived quality this is an obvious place to start - throw away some of that fine detail (highest frequency FFT components), and it may not even be noticeable at all if you are throwing away detail at a higher level of resolution than we are able to perceive.

It turns out that human vision is more sensitive to brightness than color (due to numbers of retinal rods vs cones, etc), so compression can also take advantage of that to minimize perceptual degradation, which is what JPEG does - first convert the image from RGB to YUV color space, where the Y component corresponds to brightness and the U,V components carry the color information, then more heavily compress the color information than brightness by separately applying FFT (actually DCT) to each of the Y,U,V components and throwing away more high frequency (fine detail) color information than brightness.

But, yeah, there is no magic and lossy compression is certainly going to be increasingly noticeable the more heavily you compress.

goalieca · 1h ago
You might be surprised to learn that videos and many images are broken down into brightness + color vectors. Brightness is typically done near the resolution of the image (eg 4K) but color is often 1/4 of that. That’s literally blurring the picture.
astrange · 1h ago
…Why would you write a comment like this assuming I don't know what 4:2:0 is when I know what a wavelet is?

It doesn't have the same effect because the original high frequency details are correlated and so they're preserved in the Y channel.

Salgat · 19h ago
Always blew my mind that every signal can be recreated simply by adding different sine waves together.
incognito124 · 19h ago
Back in my uni days I did not get why that works. Why are sine waves special?

Turns out... they are not! You can do the same thing using a different set of functions, like Legendre polynomials, or wavelets.

MontyCarloHall · 18h ago
>Turns out... they are not! You can do the same thing using a different set of functions, like Legendre polynomials, or wavelets.

Yup, any set of orthogonal functions! The special thing about sines is that they form an exceptionally easy-to-understand orthogonal basis, with a bunch of other nice properties to boot.

nestes · 18h ago
To be maximally pedantic, sine waves (or complex exponentials through Euler's formula), ARE special because they're the eigenfunctions of linear time-invariant systems. For anybody reading this without a linear algebra background, this just means using sine waves often makes your math a lot less disgusting when representing a broad class of useful mathematical models.

Which to your point: You're absolutely correct that you can use a bunch of different sets of functions for your decomposition. Linear algebra just says that you might as well use the most convenient one!

MontyCarloHall · 18h ago
>They're eigenfunctions of linear time-invariant systems

For someone reading this with only a calculus background, an example of this is that you get back a sine (times a constant) if you differentiate it twice, i.e. d^2/dt^2 sin(nt) = -n^2 sin(nt). Put technically, sines/cosines are eigenfunctions of the second derivative operator. This turns out to be really convenient for a lot of physical problems (e.g. wave/diffusion equations).
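You can check the eigenfunction property numerically, with a central second difference standing in for d^2/dt^2 (a quick sketch):

```python
import numpy as np

n = 3.0
t = np.linspace(0, 2 * np.pi, 2001)
h = t[1] - t[0]
f = np.sin(n * t)
# central second difference approximates the second derivative
d2f = (f[2:] - 2 * f[1:-1] + f[:-2]) / h**2
# d2f should be -n^2 * sin(nt): the same function back, scaled
```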

cjbgkagh · 19h ago
Another place where functions are approximated is in machine learning, which uses a variety of non-linear activation functions, for example the ReLU f(x) = max(0, x).
kingstnap · 18h ago
It makes more sense when you approach it from linear algebra.

Like you can make any vector in R^3 `<x,y,z>` by adding together a linear combination of `<1,0,0>`, `<0,1,0>`, `<0,0,1>`; turns out you can also do it using `<exp(j2pi*0*0/3), exp(j2pi*0*1/3), exp(j2pi*0*2/3)>`, `<exp(j2pi*1*0/3), exp(j2pi*1*1/3), exp(j2pi*1*2/3)>`, and `<exp(j2pi*2*0/3), exp(j2pi*2*1/3), exp(j2pi*2*2/3)>`.

You can actually do it with a lot of different bases. You just need them to be linearly independent.

For the continuous case, it isn't all that different from how you can use a linear combination of polynomials 1,x,x^2,x^3,... to approximate functions (like Taylor series).
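The N=3 case is small enough to verify directly (a sketch; `B`'s rows are the basis vectors exp(j2pi*k*n/3)):

```python
import numpy as np

N = 3
k = np.arange(N)
# rows of B are the DFT basis vectors b_k[n] = exp(j 2pi k n / N)
B = np.exp(2j * np.pi * np.outer(k, k) / N)
v = np.array([1.0, 2.0, 3.0])      # an arbitrary vector to decompose
coeffs = B.conj() @ v / N          # project v onto each basis vector
recon = B.T @ coeffs               # linear combination of the basis vectors
```

The rows aren't just linearly independent, they're orthogonal, which is what makes the projection formula so simple.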

nomel · 19h ago
Taking the concept to a 2d path: https://www.myfourierepicycles.com

And, with your own drawing: https://gofigure.impara.ai

spaceywilly · 18h ago
I swear I read in a textbook once that Fourier discovered this while on a boat. He looked out at the waves on the ocean and saw how there were many different sized waves combining to make up the larger waves. I could never find it again, but that visual helped me understand how the Fourier transform works.
esafak · 19h ago
Only if it is band-limited.
anvuong · 18h ago
You are confusing this with the number of samples needed for perfect reconstruction, i.e. the Nyquist sampling frequency. Fourier series/transforms work regardless of the bandwidth of the signal, as long as the integral exists, i.e. it must vanish at infinity.

Essentially it's just projection in infinite-dimensional vector spaces.

esafak · 18h ago
That's what is commonly understood by reconstruction: perfect reconstruction. And for that you need a band-limited signal. Otherwise he would have said approximate- or lossy reconstruction.
ajross · 19h ago
No, even functions with a non-finite frequency representation can be recreated. You just need a non-finite number of sines. Nyquist speaks only to a finite number of samples.
CamperBob2 · 19h ago
And of infinite duration, if you want to split hairs.
femto · 19h ago
Same thing! :-) In the purest sense, finite bandwidth requires infinite duration and finite duration requires infinite bandwidth.

The real world is somewhere in between. It must involve quantum mechanics (in a way I don't really understand), as maximum bandwidth/minimum wavelength bump up against limits such as the Planck length and virtual particles in a vacuum.

ndriscoll · 18h ago
The Heisenberg uncertainty principle in quantum mechanics comes about precisely because position and momentum are a Fourier transform pair.
cycomanic · 12h ago
And you can essentially "observe" the Heisenberg principle when looking at a moving object. If you are observing a plane, for example, then to more accurately know its velocity you will need to observe it over a longer time, but in doing so you lose accuracy about its position. This affects radar systems, which can either send short pulses to accurately pin down the position or long pulses to measure the speed.
femto · 16h ago
In that vein, I've always wondered whether the fact that we live in a fundamentally quantum universe is just a mathematically consistent side effect of the Fourier transform and the Universe's limited extent in space-time. Is the Planck time just the point at which we can't determine the difference in frequency of two signals because the wavelength of their beat frequency has to fit inside the Universe? It's fun to think about.
CamperBob2 · 17h ago
Related, but not quite the same thing. Band-limiting is needed to avoid aliasing. The infinite-duration part (or perfect phase continuity at the boundaries, or a window function...) is needed to avoid Gibbs ringing.

An interesting anecdote from Lanczos[1] claims that Michelson (of interferometer fame) observed Gibbs ringing when he tried to reconstruct a square wave on what amounted to a steampunk Fourier analyzer [2]. He reportedly blamed the hardware for lacking the necessary precision.

1: https://math.univ-lyon1.fr/wikis/rouge/lib/exe/fetch.php?med...

2: https://engineerguy.com/fourier/pdfs/albert-michelsons-harmo...

femto · 16h ago
Can we agree that a fun thing about the Fourier Transform is the many different ways such a simple idea (a linear combination of basis functions) can be viewed and the many subtle implications it has?

For example, one viewpoint is that "Gibbs ringing" is always present if the bandwidth is limited, just that in the "non-aliased" case the sampling points have been chosen to coincide with the zero-crossings of the Gibbs ringing.

I find that my brain explodes each time I pick up the Fourier Transform, and it takes a few days of exposure to simultaneously get all the subtle details back into my head.

CamperBob2 · 16h ago
For sure. As I understand it, though, the Gibbs phenomenon arises due to the sinc kernel's infinite support (sinc in Fourier domain = rectangular window in time domain, equivalent to no window at all.)

No amount of precision, no number of coefficients, no degree of lowpass filtering can get around the fact that sin(x)/x never decays all the way to zero. So if you don't have an infinitely-long (or seamlessly repeating) input signal, you must apply something besides a rectangular window to it or you will get Gibbs ringing.

There is always more than one way to look at these phenomena, of course. But I don't think the case can be made that bandlimiting has anything to do with Gibbs.
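The overshoot is easy to reproduce: partial Fourier sums of a ±1 square wave overshoot the jump by roughly 9% of the jump size (about 0.18 above the +1 level), and adding more terms doesn't shrink it (a sketch):

```python
import numpy as np

def square_partial_sum(t, n_terms):
    """First n_terms odd harmonics of a unit (+/-1) square wave."""
    s = np.zeros_like(t)
    for k in range(1, 2 * n_terms, 2):
        s += (4 / np.pi) * np.sin(2 * np.pi * k * t) / k
    return s

t = np.linspace(0, 1, 20001)
# overshoot above the square wave's +1 level, near the jump
overshoot_100 = square_partial_sum(t, 100).max() - 1.0
overshoot_400 = square_partial_sum(t, 400).max() - 1.0
```

The peak just moves closer to the discontinuity as terms are added; its height converges to the Wilbraham-Gibbs constant rather than to zero.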

Blackthorn · 18h ago
Are the Dirac or Kronecker delta functions infinite duration? I guess it depends on the proof whether you can shorten them or not.
lovelearning · 13h ago
Applying FT to images is a great way to see the world very differently, as if through an alien's eyes (or Geordi's visor from ST TNG!). A kind of stimulated out-of-the-box thinking. Dense features like fur or hair manifest as high-frequency components and eventually you start to develop an intuition for their patterns in the magnitude / power spectrums.
_kb · 14h ago
For those interested in an extension from this article, https://howthefouriertransformworks.com/ has a beautifully kitsch set of videos on the topic.

They do a really good job at breaking down the fundamental knowledge needed to build an understanding.

teleforce · 17h ago
>one man’s mathematical obsession gave way to a calculation that now underpins much of mathematics and physics

Underpins much of mathematics, science and engineering

anyfoo · 16h ago
That is a very true comment. Electrical Engineering for example would be nothing without Laplace (which is more or less an even more general Fourier).
pcfwik · 18h ago
To add another suggestion for understanding the Fourier transform: personally, the first explanation that ever clicked with me was in the Aho/Hopcroft/Ullman algorithms textbook.

Rather than talking about sine and cosine waves, they motivate the Fourier transform entirely in terms of polynomials. Imagine you want to multiply two polynomials (p(x) and q(x)). The key is to recognize that there are two ways to represent each polynomial:

1. "Coefficient form," as a set of coefficients [p_0, p_1, p_2, ..., p_d] where p(x) = p_0 + p_1x + p_2x^2 + ... + p_dx^d, OR

2. "Sample form," as a set of sampled points from each polynomial, like [(0, p(0)), (1, p(1)), (2, p(2)), ..., (d, p(d))]

Now, naive multiplication of p(x) and q(x) in coefficient form takes O(d^2) scalar multiplications to get the coefficients of p(x)q(x). But if you have p(x) and q(x) in sample form, it's clear that the sample form of p(x)q(x) is just [(0, p(0)q(0)), (1, p(1)q(1)), ...], which requires only O(d) multiplications!

As long as you have enough sample points relative to the degree, these two representations are equivalent (two points uniquely define a line, three a quadratic, four a cubic, etc.). The (inverse) Fourier transform is just a function that witnesses this equivalence, i.e., maps from representation (1) to representation (2) (and vice-versa). If the sample points are chosen cleverly (not just 1/2/3/...) it actually becomes possible to compute the Fourier transform in O(d log d) time with a DP-style algorithm (the FFT).

So, long story short, if you want to multiply p(x) and q(x), it's best to first convert them to "sample" form (O(d log d) time using the FFT), then multiply the sample forms pointwise to get the sample form of p(x)q(x) (O(d) time), and then finally convert them back to the "coefficient" form (O(d log d) using the inverse FFT).
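That coefficients -> samples -> pointwise product -> coefficients pipeline is a few lines with numpy's FFT (a sketch for integer-coefficient polynomials; the rounding at the end assumes the inputs are integers):

```python
import numpy as np

def poly_mul(p, q):
    """Multiply polynomials given as coefficient lists (lowest degree
    first) by converting to "sample form" with the FFT."""
    n = len(p) + len(q) - 1           # coefficients in the product
    size = 1 << (n - 1).bit_length()  # next power of two for the FFT
    # coefficient form -> sample form (samples at the roots of unity)
    P = np.fft.fft(p, size)
    Q = np.fft.fft(q, size)
    # pointwise multiply, then sample form -> coefficient form
    r = np.fft.ifft(P * Q)[:n]
    return np.round(r.real).astype(int)

# (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
product = poly_mul([1, 2], [3, 4])
```

The "cleverly chosen" sample points are the complex roots of unity, which is exactly what makes the FFT's divide-and-conquer step work.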

dghughes · 6h ago
Bill Hammack The Engineer Guy has a video series on this subject.

https://www.youtube.com/watch?v=6dW6VYXp9HM

antimora · 17h ago
I think the best explanation and understanding I received about Fourier Transform was from studying linear algebra.

The Fourier Transform equation essentially maps a signal from the time domain onto orthogonal complex sinusoidal basis functions through projection.

And the article does not even mention this. =)

chamomeal · 16h ago
Best explanation I got was from 3 blue 1 brown! It’s what you said, but with nice visuals
esalman · 17h ago
Almost every paragraph of this article says the same thing but in different ways.
cycomanic · 12h ago
While we are talking about fun facts about the Fourier transform: the frequency response of each DFT bin is a sinc. That's how OFDM works; it performs an FFT on parallel input symbol streams. If you look at the spectrum of your signal, it's sinc-shaped channels (approximately, because they are truncated) spaced at 1/symbol rate, which don't exhibit (theoretically, if they weren't truncated) any inter-channel interference.
drmpeg · 4h ago
It's going from the frequency domain (parallel symbol streams) to the time domain (multi-carrier RF), so it's using the inverse FFT.
fracus · 15h ago
Learning about the FT in engineering and how it can represent pretty much any repeating signal was mind blowing. It was the culmination of so much learning in mathematics that brought me to that wow moment.
esalman · 17h ago
Reading the comments, it seems there are very few electrical engineers in hacker news.
bee_rider · 16h ago
Maybe they are EE’s, who just happen to still find some joy in the stuff from their intro classes. Sometimes little things can still make us happy, after all.
Terr_ · 15h ago
IANAMathematician, but I've tried to come up with a metaphor which I hope isn't too wrong:

1. You start with a signal fluctuating up and down, and it's on a strip of little LEDs labeled from -1 to +1.

2. You mount that strip to a motor, and spin it at a certain rate. After a while the afterimages make a blob-shape.

3. For each rotation rate, measure how much the shape appears off-center.

In this way you can figure out how much the underlying signal does (or doesn't) harmonize with a given rotation hertz.
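That metaphor maps almost directly onto code: "spinning" the signal at a trial rate and measuring how far the blob's centroid sits from the origin is the magnitude of the Fourier coefficient at that rate (a sketch with a made-up two-tone signal):

```python
import numpy as np

fs = 1000
t = np.arange(fs) / fs                    # one second of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

def off_center(x, t, hz):
    """Spin the signal around the origin at `hz` rotations/sec and
    measure how far the resulting blob's centroid is from the origin."""
    return abs(np.mean(x * np.exp(-2j * np.pi * hz * t)))

# large only at rates actually present in the signal (50 and 120 Hz)
```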

svantana · 8h ago
An interesting mechanical analogy! In this example, the "integration" is done by the eye and perhaps by the LEDs depending on their type. But it's more akin to the Gabor Transform than Fourier, as there is locality in time.
dsego · 11h ago
Sounds like a variation on the mechanical strobe tuner.
ptzz · 14h ago
For lossy compression, it turns out a sinusoidal (typically DCT) decomposition maximizes energy compaction and compressibility. A proof that this is true for AR processes was a key realization for me. That you get a nice and intuitive domain to work with (modify frequencies) is a nice bonus on top :)
ThrowawayR2 · 17h ago
Kind of a tangent but why are there so many articles and videos popularizing the Fourier transform and practically none for the Laplace transform? The first person to do a well done 3Blue1Brown style video that focuses on intuition and visualization would probably be an overnight sensation (well, among engineers at least).
artyom · 16h ago
My take on this is that Fourier is easier to understand, has more immediate and relatable practical applications, and can be explained to "common folk" with just-enough sophistication and some tangible analogies (like the wool picture from the article).

Explaining Laplace to solve differential equations with analogies is a bit more... complicated.

alphazard · 18h ago
There's a lot of great Fourier visualizations out there (3Blue1Brown has a series on it as well). A great intuition pump would be a game where the player has to recreate a waveform given to them, by toggling on/off switches for each fundamental frequency (and phase).

Has anyone ever seen something like that?

dsego · 10h ago
I created a small piece trying to build intuition, my approach was to think of it as probing with test signals, so instead of recreating a waveform, it's more poking at it with sines and seeing how well the test sine matches the target signal.

https://dsego.github.io/demystifying-fourier/

esafak · 19h ago
seam_carver · 16h ago
Fourier transforms are useful in applications like avoiding moire when downscaling manga and reducing rainbow artifacts in color eink manga. All in kindle comic converter.
ecoled_ame · 17h ago
I didn't have time to read the article, but it's cool that the position operator and the momentum operator in quantum mechanics are Fourier transforms of one another.


noncoml · 19h ago
3Blue1Brown made a video with great visualizations: https://www.youtube.com/watch?v=spUNpyF58BY
mananaysiempre · 19h ago
I’ve dabbled in explaining the FT some, and I think there’s an important trait of that video that needs to be highlighted: it’s a superb demonstration of what the Fourier transform does, mechanically, but it does not at all try to explain why it works in the places we usually apply it or how (having, admittedly, a thoroughly anachronistic mathematical background) you could have invented it.

To be extra clear: it’s a very good video and you should watch it if you don’t have a feel for the Fourier transform. I’m just trying to proactively instil a tiny bit of dissatisfaction with what you will know at the end of it, so that you will then go looking for more.

hbarka · 18h ago
Isn’t the Fourier series fundamentals generally a required course in undergraduate college EE field?
sizzzzlerz · 17h ago
Of course. It's taught in both mathematics courses as well as engineering. The Fourier transform and its digital domain cousin, the discrete Fourier transform, play such a fundamental role across nearly every engineering discipline, as well as physics and many other scientific areas, that you cannot get through school without learning about them.
dotnet00 · 16h ago
Yes, but how many software engineers remember any of that? Most aren't using it.
Joel_Mckay · 14h ago
Indeed, because a Golomb ruler optimized DFT is performant... and thus actually useful. lol =3
lutusp · 15h ago
Such a shame. In an otherwise well-written article, the author mentions Cooley and Tukey's discovery of the FFT, but without mentioning that Gauss discovered it first, among others, each of whom approached the same idea from different directions.

The Wikipedia FFT article (https://en.wikipedia.org/wiki/Fast_Fourier_transform) credits Gauss with originating the FFT idea later expanded on by others, and correctly describes Cooley and Tukey's work as a "rediscovery."

acjohnson55 · 5h ago
As they say, everything in modern math is named for the 2nd person to discover it after Gauss.
olq_plo · 11h ago
Yes, bad article to omit that. It is such a cool fun fact. Gauss was unreal.
meindnoch · 9h ago
A change of basis.
photochemsyn · 13h ago
Fourier's fundamental discovery was that the most natural building blocks for periodic functions are complex exponentials. These in turn were based on Euler's identity, linking complex exponentials to sines and cosines, and which is algebraic (easy to differentiate, integrate, manipulate) and also encodes rotations and waves. So, a good reason to study complex analysis and include it in the engineering maths program.

This setup then allowed Fourier to take the limit of the Fourier series using a complex exponential expression. But keep in mind, this is all in the context of 19th century deterministic thinking (see Fourier's analysis of the heat equation, 1807, and Maxwell's treatise 'Theory of Heat', c. 1870s), and it ran into real-world limits in the late 19th century, first with Poincare and later with sensitive dependence on initial conditions. Poincare showed that just because you have a deterministic system, you don't automatically get predictability. Regardless, this Fourier transform approach worked well in astronomy (at least at solar-system scale) because the underlying system really was at least quasiperiodic - essentially this led to the prediction of new planets like Neptune.

But what if you apply Fourier transform analytics to data that is essentially chaotic? This applies to certain aspects of climate science too, e.g. efforts to predict the next big El Nino based on the historical record - since the underlying system is significantly chaotic, not strictly harmonic, prediction is poor (tides, in contrast, are predictable, as they are mostly harmonic). How to treat such systems is an ongoing question, but Fourier transforms aren't abandoned, more like modified.

Also, the time-energy quantum mechanics relation is interesting though not really a pure QM uncertainty principle, more like a classical example of a Fourier bandwidth relation, squeeze it on one axis and it spreads out on the other - a nice quote on this is "nothing that lives only for a while can be monochromatic in energy." Which sort of leads on to virtual particles and quantum tunneling. (which places a size limit of about 1 nm on chip circuitry).

The bottom line is that if you're applying elegant and complex mathematical treatments to real-world physical problems, don't forget that nature doesn't necessarily follow in the footsteps of your mathematical model.

jtfrench · 12h ago
One of my favorites.
idiotsecant · 15h ago
Everyone loves the fourier transform because it's easy to understand but everyone ignores the laplace transform, which is much more beautiful, imo, and quite related.
blamestross · 16h ago
My favorite algorithm (just for the name) is [Lomb-Scargle Periodogram](https://docs.astropy.org/en/latest/timeseries/lombscargle.ht...) which is a method for performing a Fourier transform on irregularly sampled data.
carabiner · 18h ago
What really gets me going though is quaternions...