Author here for if you have questions on this chip...
magnat · 4h ago
The separate noise source is a bit of a surprise here. Why is it necessary? Wouldn't RF noise produce the same results?
kens · 3h ago
I'm not sure what the FM demodulator produces when it's mistuned, but I'm guessing that you'd get pretty much no output, rather than white noise (since there's no frequency for the demodulator to lock onto). The problem for the user is that you wouldn't know if your batteries are dead or if you just haven't found the station. By adding a "hiss" between stations, the radio is easier to use.
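Something like this toy sketch, in Python (the lock flag and the noise level are made up purely for illustration):

    import numpy as np

    def radio_output(demod_audio, correlator_locked, noise_level=0.1):
        # Tuned to a station: pass the demodulated audio through.
        if correlator_locked:
            return demod_audio
        # Between stations: substitute artificial hiss rather than silence,
        # so the user can tell the radio is alive but not tuned to anything.
        rng = np.random.default_rng()
        return noise_level * rng.standard_normal(len(demod_audio))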
magnat · 2h ago
If the RTL-SDR is a good reference, when demodulating FM it produces pretty much the same noise you'd expect from a mistuned old-school radio.
rep_lodsb · 1h ago
I wondered about this too, but from the linked articles it seems to be designed that way in order to make it more user friendly: when not correctly tuned to a station, it outputs the artificial white noise instead of a possibly distorted signal from a nearby frequency (or just silence if the demodulator can't lock on to anything).
wkat4242 · 3h ago
It depends: if the RF frequency you use has a signal on it, then it won't be random, so it's not really noise. I wonder why they need a noise generator in a receiver chip, though. They're usually used for crypto stuff.
CamperBob2 · 2h ago
It's to provide "comfort noise" when the correlator indicates a missing or mistuned signal.
Muting the audio would make more sense -- and would certainly have been familiar to the CB[1] radio operators of the day in the form of a squelch effect -- but this chip was targeted at consumers who expected it to behave like a conventional FM radio.
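In its simplest form, squelch is just a gate on signal strength. A hypothetical sketch in Python (the threshold value is invented):

    import numpy as np

    def squelch(audio, signal_strength, threshold=0.2):
        # Classic CB-style squelch: hard-mute the audio whenever the
        # received signal strength falls below the threshold.
        return audio if signal_strength > threshold else np.zeros_like(audio)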
1: An early incarnation of social media, for better and worse
CamperBob2 · 2h ago
In a conventional radio, yes, but I'll bet this approach would sound incredibly awful if mistuned.
contingencies · 2h ago
Hey Ken, great read as always. I wonder if in the future you would consider doing an overview of the various early radio chips and their evolution. I recall recently reading some ham radio projects and learning that a lot of the later radio chips were clones of earlier designs. Given your suggestion that this earlier period of integrated radio innovation is "low-hanging fruit" in terms of reverse-engineering friendliness, it should be an interesting read, and I'm sure a very large number of radio enthusiasts would love to see your insights.
CamperBob2 · 2h ago
The correlator is interesting. I don't see how it works. In the perfectly-tuned case, how does delaying the signal by half an (IF?) period and inverting it yield a match for the original signal? Inversion isn't the same as a delay.
I guess the idea is that the 70 kHz IF is effectively sampled at twice the Nyquist rate needed for 15 kHz baseband audio. So the signal content at half the period can be relied upon to match after an inversion and delay, assuming it was (a) band-limited at the source (or by the clever deviation-reduction scheme), which it would be, and (b) tuned correctly.
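(Checking that with a pure tone at the nominal IF: -sin(2*pi*f*(t - 1/(2f))) = -sin(2*pi*f*t - pi) = sin(2*pi*f*t), so the half-period delay plus the inversion reproduces the original exactly -- but only when f is actually at or near the nominal IF.)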
kens · 1h ago
The application note gives more details [1], but I find it a bit confusing. The idea is that as long as you are within about +/- 100 kHz of the station (a wide range), the radio will lock onto the right frequency (because of the frequency-locked loop), giving the nominally 70 kHz IF. Since the 70 kHz signal doesn't vary much over a half-period (as you said), the correlator will be happy. The correlator will still stay locked as the IF varies +/- 15 kHz with the audio signal. (The correlator doesn't require a perfect match, just a mostly matching one.)
The problem is that if you mis-tune the radio by 100 kHz or so, the FM detector will give you an output, but it will be distorted. The issue is that the FM detector is linear over a small range, but outside that range, you get non-linear side lobes. So if you tune to a side-lobe frequency, the radio will lock onto the frequency, but the output will have harmonic distortion. In this case, the IF frequency is way off from 70 kHz, enough that the delayed signal and the inverted signal don't match at all, so the correlation fails and mutes the audio. Then you'd re-tune and find the right frequency.
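A rough numerical illustration of that behavior (the sample rate and the off-frequency values here are just for demonstration, not taken from the chip):

    import numpy as np

    fs = 10e6                          # sample rate, arbitrary but >> IF
    t = np.arange(0, 2e-3, 1 / fs)
    d = int(round(fs / (2 * 70e3)))    # half a period of the 70 kHz IF, in samples

    def correlation(f_if):
        x = np.sin(2 * np.pi * f_if * t)
        y = -np.roll(x, d)             # delay by half a nominal period, then invert
        return np.mean(x[d:] * y[d:]) / np.mean(x[d:] ** 2)

    print(correlation(70e3))    # ~ +1.0: on frequency, correlator is happy
    print(correlation(85e3))    # ~ +0.8: IF swung +15 kHz by the audio, still mostly matching
    print(correlation(170e3))   # ~ -0.27: way off 70 kHz, correlation fails, audio mutes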
[1] See Figures 8-12. Link: https://www.tel.uva.es/personales/tri/radio_TDA7000.pdf