Many thanks to SWLing Post contributor, 13dka, who shares the following guest post:
“It’s usually hard to assess whether or not a sync detector helped with a particular dip in the signal or not, unless you have 2 samples of the same radio to record their output simultaneously and compare.”
That’s what I wrote about the “pseudo sync detector” in my review of the Belka DSP last year.
Since I recently upgraded to the Belka DX in order to pass my Belka DSP on to a friend, I briefly had two examples of almost the same radio on the table at the dike. I tuned them to the same stations and recorded some audio clips with one radio on its sync detector and the other in regular AM mode, to answer the question whether or not sync has “helped with a particular dip in the signal”. Then I thought that demonstration would be a good opportunity to attempt an explanation of what exactly (I think) sync detectors are all about anyway, hoping to find a middle ground between “technical” and “dumbed down beyond recognition”.
The trouble with sync detectors
Perhaps no component of a shortwave receiver is surrounded by as much misconception and confusion as the sync detector. Full disclosure: until quite recently, I myself had an at best vague concept of what they do. It seems it’s not so much that people don’t know how they work; what they actually do when they work is where the ideas often diverge. Contributing factors might be that explanations of what problem sync detectors are supposed to solve are usually highly theoretical and just as hard to digest as the explanations of how they solve that problem, and a few old magazine reviews and now internet buzz probably did their part to mislead or inflate expectations instead of explaining when to turn them on.
The outcome is that sync detectors tend to disappoint: I have often read things like “I turned it on but I couldn’t hear a difference”, because people turn them on when there’s nothing for the circuit to do and then conclude that it does nothing, or they believe its purpose is “magically recovering weak signals” and get disappointed by the mixed results. Contrary to what some YouTube videos state, sync detectors are not “great when a signal fades a lot” (that’s what the automatic gain control is meant to keep in check), and no, they were not “invented by Sony”.
A very brief history of sync detectors
From the very beginning of radio broadcasting, people noticed that their favorite non-local stations usually sounded just fine during the day (if they came in at all) and in the early evening, but not so great anymore when the family gathered around the radio on long, dark winter evenings or at night (on the soon-to-become-popular shortwave bands, even during the day). It didn’t take long until the sheer wonder of wireless entertainment wore off enough that listeners and engineers started asking why the tone of these stations was changing continuously and, even worse, why some of them exhibited not only fading but recurring and sometimes very nasty distortion. That’s why the first concepts for addressing these issues by somehow “exalting the carrier” began to emerge as early as 1922, and it took another 30-50 years until these early ideas evolved into the sync detectors we know today.
In this historical context alone, it seems obvious that dealing with the well-known old problem outlined above is (or was) basically all they were ever supposed to do. But by the time the theories could materialize in practical and affordable circuits, FM and TV were all the rage and the consumer market didn’t care that much about AM broadcasting anymore. Apart from a few professional receivers and ham radio projects, sync detectors didn’t catch on in the consumer radio world until the 1970s, when they started to be marketed to demanding shortwave listeners – in usually quite technical or vague and hence confusing ways – and somehow the focus started shifting:
Because they do occasionally recover DX stations buried in the mud, to a very varying extent and under some pretty elusive circumstances, this (dare I say) byproduct of the concept understandably became a sales pitch, and often also the one benchmark used to rate sync detectors in reviews – understandably, because rating the elusive thing they were actually meant to do is difficult.
A “selective fading” crash course
Most of us have learned that “multipath propagation” is the source of the trouble, because it causes… “selective fading”.
The difference between fading and “selective fading” is that the latter affects only portions of the signal within the part of the spectrum a station occupies, hence the name. The somewhat unlucky term may have contributed to the confusion, too, because (frequency-)selective fading and regular fading are two completely independent phenomena that can happen both separately and simultaneously! Still, a lot of explanations on the net and even very technical articles often lump them together… Anyway, still very abstract, right?
Why not just have a radio show us what selective fading is? Here is a “waterfall” plot of a signal, showing the history of a station’s (HF) spectrum over a span of several seconds; the station is subject to selective fading (and, in the middle of the clip, regular fading as well):
For comparison, here is a waterfall plot of the same kind of signal that is not harmed by selective fading, but exhibits regular fading:
The slanted blue streaks wandering through the modulation spectrum in the first video are simply notches in the signal level, or, if you will, narrow slices of the spectrum “fading” while the neighboring parts remain unharmed. These notches come into existence because portions of the waves take different paths (hence “multipath propagation”) and travel different distances, so they all hit your antenna at very slightly different times. When all these signal portions sum up in your radio, they cause “phase cancellations”, resulting in the notches and hence colorations and distortions.
Bonus fact: Due to the ionosphere moving at its own pace and changing its shape and altitude (among many more variables) all the time, while the planet underneath stubbornly insists on doing the same thing every day, all non-groundwave paths are also subject to a minuscule Doppler shift. That’s why these notches are not static but keep moving through the spectrum of your station’s channel at all kinds of speeds, in all directions and amounts.
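For the technically inclined, this mechanism is easy to reproduce in a few lines of code. Here’s a minimal sketch (all parameters are my own assumptions, and the delay drift is exaggerated for visibility) of a two-path channel whose echo delay slowly changes, producing exactly those moving notches:

```python
# Minimal sketch of selective fading: direct wave + delayed echo,
# with the echo delay slowly drifting (all parameters assumed).
import numpy as np

fs = 8000                            # sample rate in Hz
t = np.arange(fs * 5) / fs           # 5 seconds
x = np.random.randn(t.size)          # broadband stand-in for a STANAG-like signal

tau0, drift = 1e-3, 5e-5             # 1 ms base delay, drifting (exaggerated)
y = np.empty_like(x)
for n in range(x.size):
    tau = tau0 + drift * t[n]        # the slowly changing path difference
    d = n - int(tau * fs)            # sample index of the delayed echo
    y[n] = x[n] + 0.9 * (x[d] if d >= 0 else 0.0)

# The channel response is H(f) = 1 + 0.9*exp(-2j*pi*f*tau): deep notches
# appear every 1/tau Hz and slide through the spectrum as tau drifts.
# A spectrogram of y (e.g. matplotlib's plt.specgram(y, Fs=fs)) reproduces
# the slanted blue streaks from the first waterfall video.
```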
I used the (“STANAG”) radioteletype signals because they have a very homogeneous spectrum that makes the effect very obvious (it can also be seen very well on OTH radar or DRM stations). Now that you know what to look for, you will likely recognize the pattern on this AM broadcast station:
Please note how the signal is not subject to much regular fading but suffers a lot from selective fading; you can also faintly hear it distort when the notches pass the carrier peak in the middle of the spectrum. What, you’re still here? 🙂 Then let’s look in some more detail at how “selective fading” affects signal quality:
1. “Phase distortion”
The notches wandering through the sidebands (the modulation part) of the spectrum cause the usually slowly undulating, “phasing” sound, as if someone were constantly playing with the tone controls… you probably know what I’m talking about; it’s the trademark “shortwave sound” some of us actually love. It’s not a very destructive effect yet: electric guitar players use effect pedals called “phasers” to recreate a very similar sound in a somewhat similar way, and indeed this kind of “distortion” becomes most obvious when listening to music. But since it alters the tonal quality of the original signal, it’s technically considered a distortion.
A sync detector can’t help much with that specific sound. Why I’m even mentioning it: It’s the very same mechanism that periodically wreaks havoc on AM modulation:
2. Carrier attenuation
The notches will sooner or later pass the carrier peak in the middle of the AM station’s spectrum and attenuate the carrier to some degree. When that happens, depending on the depth or intensity of these notches, the signal may sound “overmodulated” in shades between “a little scratchy” and “just plain nasty”. “Overmodulation” is indeed what’s happening: attenuating the carrier has the same effect as increasing the modulation level. And since it’s the ratio between carrier and modulation rather than overall signal strength that causes this nasty effect, it can render even strong stations pretty much unlistenable.
This is where the ideas of “exalting” and even replacing the carrier come from, and that’s also the source of the aforementioned misunderstanding: the “replacement carrier” is not static, it follows the overall signal level, and therefore it can’t do anything about regular fading.
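The arithmetic behind that “overmodulation” effect is simple enough to show in a few lines. Here’s a minimal sketch (tone frequency, modulation index and fade depth are my own assumptions) of what an ideal envelope detector makes of a signal whose carrier alone has been notched out:

```python
# Why a faded carrier sounds overmodulated at an envelope detector
# (all parameters assumed for illustration).
import numpy as np

fs = 8000
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)   # program audio: a 440 Hz tone
m = 0.8                               # healthy 80 % modulation index

def envelope(g):
    """g = carrier attenuation from a selective-fading notch (1.0 = none)."""
    # Complex-baseband AM with only the carrier term scaled by g:
    return np.abs(g + m * audio)      # ideal envelope detector output

clean = envelope(1.0)   # envelope faithfully tracks the audio
nasty = envelope(0.5)   # effective modulation is now m/g = 1.6 (> 100 %):
                        # the envelope "folds over" wherever g + m*audio < 0,
                        # which is the scratchy distortion in the clips below
```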
What it sounds like and how sync detectors help with that
Here’s a clip with a musical intro (that’s Radio Thailand on 9920 kHz). I duplicated the first few notes of the music so they play twice in a row, first recorded in AM mode and then with the Belka’s sync detector. Note how prickly the piano sounds in the first example and how the distortion disappears in the “sync” part of the clip (0:04s):
This is also very noticeable on speech: here’s another clip from the same program. Note the sound of the word “architecture” and the scratchy-sounding sibilants on the following words, then the music part with the last hit of the snare drum (0:06s) sounding like the drummer threw his sticks into a china store. The sync detector smooths all that out (0:08s). Isn’t that awesome?
I think this is pretty much the one simple thing that sync detectors were meant to do, and what most sync detectors likely do just fine: they subtly remove this specific kind of distortion, ideally without adding their own colorations or artifacts. So when you hear a station with some sweet music that turns into heavy metal at intervals, that’s when you definitely want to turn these things on.
Weak signal recovery
While the described improvement can be achieved very consistently on signals with fair to strong levels, recovering weak stations or even “extreme DX” is where much of the “sometimes it works, sometimes it doesn’t” talk and the disappointment seem to come from. Any sync detector reaches a point where it stops working, of course; where that point lies depends on a number of factors:
– The type and performance of the sync detector. Duh! Even sync detectors of the same type can perform very differently due to their individual implementation. The now most common type is the “selectable sideband” style, as found on the ICF-2001D/2010 or the NRD-525/535, and now e.g. on any Tecsun radio with sync. The Sony’s sync detector performance is said to be “legendary”, the implementation confusingly called “ECSS” on the JRC radios ended up, let’s say, “controversial” (which likely has to do with their price tags), and the results of the various Tecsun implementations are genuinely all over the place. But even the Sony’s legendary sync mode failed almost as often as it succeeded on weak signals, in my personal recollection, and I’m only now beginning to understand why.
– Dynamic range and sensitivity. How “weak” a signal is depends in part on the sensitivity and noise figure of the radio, and a sync detector relies on what the radio in front of it can deliver. Food for thought: the aforementioned Sony ICF-2010/2001D was not only known to have a great sync detector, it also had a reputation of being particularly sensitive and quiet by the standards of the time. The next point is a closely related matter:
– Whether or not additional interference is deteriorating reception. Besides the now-ubiquitous wideband noise simply reducing practical sensitivity, this could be co-channel interference (another station or deliberate jamming) or local narrowband noise. The sync detector may then try to improve reception of both the station and the interference, and make the interference pop out more than the station. Here’s a pretty fatal example – this was captured with the sync detector of the PL-660, but the effect is not specific to that radio:
Speaking of which, I would’ve loved to demonstrate the virtues of “true” sync detectors and the difference to “pseudo” ones. Unfortunately I only have an old (2015) PL-660, and pitting the two radios against each other simply did not work at all, which mostly has to do with the two points described above; the radios are just too different in performance:
– How much the weak station is actually suffering from selective fading. First off, the same is true for strong stations: if there is no selective fading, the sync detector will (ideally) do nothing. When signals are just weak, noisy and subject to regular fading, a sync detector can’t help with that per se. Case in point: this is a moderately weak signal from VOA on the 19m band, affected mostly by regular fading and very little by selective fading. The sync detector just removes the residual bit of distortion, and it’s pretty much impossible to tell the difference:
A better way of telling the difference is a stereo file, with the radio in regular AM on the left channel and the “sync” one on the right – the signal stays mono in the middle for the most part, and the differences stick out to the sides:
An even better way is the “null test” – summing both channels to mono with the polarity flipped on one – which “nulls out” (cancels) all components common to both signals and leaves only the difference:
Obviously, showing that something does nothing can be more complicated than demonstrating the opposite! For comparison, here’s a strong station (Radio Korea in German) with some moderate selective fading, as a stereo file and then “nulled out”. The smaller the difference between the two radios’ demodulation products, the quieter the result will be. Obviously, the sync detector has something to do here:
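If you want to try this null test on your own recordings, it takes only a few lines; the file names below are hypothetical placeholders for a pair of sample-synchronous mono captures:

```python
# Minimal null-test sketch: subtract the sync recording from the AM one
# (subtracting = adding with flipped polarity). Everything both detectors
# demodulated identically cancels; only the difference remains.
import soundfile as sf   # assumed available; any WAV reader works

left, fs = sf.read("am_detector.wav")    # hypothetical mono recording, AM mode
right, _ = sf.read("sync_detector.wav")  # hypothetical mono recording, sync mode
n = min(left.size, right.size)
difference = left[:n] - right[:n]
sf.write("null_test.wav", difference, fs)
# Near silence here means the sync detector had nothing to do on this signal.
```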
Can sync detectors recover weak signals at all?
First off, all sync detectors can improve the signal-to-noise ratio a little. The “selectable sideband” variety often switches to a narrower filter, and less bandwidth means better SNR; on top of that, they let you select the sideband that’s least affected by noise or adjacent-channel splatter, so the combined effect of filtering and sideband selection can already improve weak-signal reception. Removing selective-fading distortion on top of that can indeed make for a “magic button”! Except… since everything changes constantly on shortwave, it can turn into the opposite seconds or minutes later, when the station decides to deliver less than the minimum signal required to keep the sync detector locked onto the carrier. In theory, a “true” sync detector doesn’t even need a carrier, but so far I haven’t met an actual implementation capable of riding out a “carrierless” period without wandering off frequency. So moderately weak signals that are actually affected by selective fading can reliably benefit from a sync detector. Absolute grassroots signals: not so much.
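To put a rough number on the filtering part alone: noise power scales with bandwidth, so halving the audio bandwidth is worth about 3 dB. A quick sanity check (the filter widths are assumptions):

```python
# Rough SNR gain from a narrower "selectable sideband" filter alone:
# noise power is proportional to bandwidth (filter widths assumed).
import math

full_bw = 6000   # double-sideband AM filter, Hz
ssb_bw = 3000    # one sideband only, Hz

gain_db = 10 * math.log10(full_bw / ssb_bw)
print(f"{gain_db:.1f} dB")   # ~3.0 dB, before even counting the freedom
                             # to pick the cleaner of the two sidebands
```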
Back to the Belka: the Belka’s sync detector is different in that it uses both sidebands, like (AFAIK) the circuits in e.g. the AOR 7030 or the IC-R75, and it doesn’t use a phase-locked loop (PLL) to sync an internal carrier. Instead, it filters the original carrier and sends it through a limiting amplifier to stabilize it, in the tradition of venerable radios made by Harris, Racal and Drake (R-7). The British rather called this a “quasi-” than a “pseudo-”synchronous detector, and by the way, as I just found out, it’s also what the NRD-525/535 receivers do (permanently!) when they are in AM mode. While this quasi-synchronous detector doesn’t let you select sidebands, it does help a little with weak signals too, if all factors fall into place, as the clips further below show.
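In code, the principle boils down to surprisingly little. Here’s a minimal complex-baseband sketch (all parameters are my own assumptions; this illustrates the idea, not the Belka’s actual firmware):

```python
# Quasi-synchronous detection sketch: filter out the received carrier,
# strip its level variations with a limiter, then use it as the phase
# reference for product-detecting BOTH sidebands (parameters assumed).
import numpy as np
from scipy.signal import butter, lfilter

fs = 8000
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)                  # program audio stand-in
fc = 2000                                            # IF carrier (assumption)
am = (1 + 0.8 * audio) * np.cos(2 * np.pi * fc * t)  # the received AM signal

bb = am * np.exp(-2j * np.pi * fc * t)               # downconvert: carrier at 0 Hz

# 1. "Filter the original carrier": a very narrow low-pass around 0 Hz.
b, a = butter(2, 30 / (fs / 2))
carrier = lfilter(b, a, bb)

# 2. "Limiting amplifier": keep the carrier's phase, discard its amplitude.
ref = carrier / np.maximum(np.abs(carrier), 1e-12)

# 3. Product detection of both sidebands against the cleaned-up carrier:
b2, a2 = butter(4, 3000 / (fs / 2))
audio_out = lfilter(b2, a2, np.real(bb * np.conj(ref)))
```

Because the limiter holds the reference at constant amplitude, a selective-fading dip in the carrier no longer mimics overmodulation – exactly the fix described above. And now for the clips: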
This is a weak signal from “Radio ZP30” (HCJB) on 3995 kHz. In the regular AM example, playing first, you may notice the station dipping into the noise for a second; the “sync” example has only a very brief dropout, and the signal appears a little less noisy and more solid in general:
Here’s what it sounds like when it fails as much as it helps – and that’s exactly the point I’m trying to make – on the same station, a few minutes earlier or later. When the carrier drops below the noise, nothing can be recovered; the Belka goes silent where a “true” sync detector would lose lock and howl, and some interference messes with the signal as well:
Here’s an example of how a sync detector can salvage intelligibility on a weak station (here: Woofferton on 17700 kHz in a closing band). The first part of the clip sounded to me like “…and the next poll? Paul? pole? cold?, versions 29 and 34…”, and I was a bit surprised to hear what was actually being said in the second part (0:04s):
If you have interference on one sideband, you can still use “zero-beating” in SSB (aka “ECSS”) with any SSB-capable radio. On the Belka this is particularly easy: if the station is well-maintained, no fine-tuning is necessary thanks to the very precise and stable (<0.5 ppm!) oscillator (yes, it boggles the mind again, but it actually has a TCXO inside). The Belkas come well-calibrated and generally spot-on zero over the entire coverage range to begin with, and the calibration of my one-year-old -DSP and the new -DX model on my table was approximately 1/4 of one Hertz apart! So on a radio like this, it’s usually “switch to LSB/USB” and that’s it. The technical difference from a sideband-selecting “true” sync detector is that the latter zero-beats the station automatically and very precisely, but that automation is usually also its Achilles heel. ECSS doesn’t have this problem, the signal doesn’t actually need a carrier, and the result is the same. Here’s a rare example where the distortion periods are so long that switching back and forth between AM and SSB in the middle of one reveals the effect very clearly (Absolute Radio 1215 kHz on the Icom IC-705):
Sync detectors mostly address one very specific problem with AM reception: the ugly audio distortion caused by selective fading reducing the carrier to the equivalent of an overmodulated signal.
Their real value is in keeping programs enjoyable for the program listener down to fairly low signal levels, and very few sync detectors fail at doing this. Recovering truly weak stations is a bonus with a high variance in the results, which is only partially owed to the quality or sophistication of the sync detector circuits. It also depends on external factors, and it’s generally limited to mitigating the detrimental effects of selective fading (skipping all the little side benefits like “removing phase noise from the whatnot thingamajig…yada” here). For the casual broadcast listener, this is probably all there really is to the “magic button”.
Final thoughts about the Belka’s “pseudo sync detector”
After recording these clips and doing some overdue research on sync detectors, I found that the little Belka has tricked me once again into underrating its inner qualities. The “pseudo” prefix in its name may do its own part to cause doubts about its effectiveness – quite unjustly! It just silently and very effectively keeps the distortion away and doesn’t change the sound much otherwise, where other sync detectors make it clear that they’re on by changing bandwidth/filter shape or volume, turning on a light, or malfunctioning right away.
I think the “quasi-” or “pseudo-synchronous” principle it is based on is not as inferior to “true” sync detectors as the theory suggests: where PLL sync detectors would start howling while trying to re-sync, the Belka just mutes the audio briefly, and thanks to its outstanding sensitivity, its sync detector likely keeps working further down the signal meter scale than those of many other radios, particularly in this price range.
The sync detector doesn’t have selectable sidebands, but the Belka has all the precision needed for (more often than not) hassle-free zero-beating in SSB without the need to actually beat any zeroes, so that’s just as good to me. Overall, it does these jobs with the deceptively big portion of understatement that’s so typical of this tiny communications receiver.