[Sticky] Television Receiver Intermediate Frequencies

Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
To add to the previous post, it may be noted that in the subject RCA TV receivers, the double-conversion KRK-212 tuner was used only for the CATV channels.  Broadcast channels were always processed by one or the other of the conventional single-conversion tuners.  With dual-cable systems, one cable was processed double-conversion by the KRK-212, and the other single-conversion by the KRK-211.
 
Applicable references are:
 
RCA Plain Talk and Technical Tips, 1973 September-October: “MATV and CATV Provisions in RCA Color TV”.
 
NCTA Paper, 1973:  “A Television Receiver Especially Designed for CATV”; G.W. Bricker and G.C. Hermeling (both RCA).
 
Electronic Servicing, 1973 October, p.18ff: “Details of the new RCA cable receivers”.
 
 
Cheers,
 
Steve
 
Posted : 02/12/2023 9:33 pm
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
The next datapoint on double conversion comes from a 1977 IEEE paper about European TV tuner practice (1).
 
Therein it was stated that [European] cable TV tuners used either a single conversion mode like conventional VHF tuners or a double conversion system.  A double conversion circuit was shown, thus:
 
Double Conversion Cable TV Tuner
 
 
 
It was accompanied by the following commentary:
 
“The PIN AGC input section is followed by a mixer circuit which converts the input frequency of 47 to 300 MHz up to a first I.F. of 600 MHz. The signal then passes through a selective MOS FET amplifier.  The second mixer output is tuned to 36 MHz.  The RF amplifier has been omitted because the IF frequency is an octave higher than the highest input frequency.  Therefore a highly selective input band pass filter is employed in order to reduce the oscillator radiation to the required level.”
 
In a general sense, this was not materially different to the RCA double-conversion cable tuner previously discussed.  The input frequency range was somewhat greater, but the 1st IF was in the same vicinity.  The 36 MHz number quoted for the 2nd IF appears to have been more-or-less the VIF/SIF midpoint of the standard European IF channels: for systems B/C/G/H, 38.9/33.4 MHz, midpoint 36.15 MHz; for system I, 39.5/33.5 MHz, midpoint 36.5 MHz; and for system L, 39.2/32.7 MHz, midpoint 35.95 MHz.  Possibly the 600 MHz 1st IF was likewise a midpoint number rather than, say, the VIF number.  600 MHz was in the 585 to 610 MHz gap between Bands IV and V, and at the time was probably not used for broadcasting.  So it was likely a relatively “quiet” zone in RF terms, and hence a logical choice for the 1st IF.
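For anyone wishing to check the arithmetic, here is a small Python sketch of the frequency plan as I read it from the paper.  The oscillator figures are my own inference from the stated IFs, not numbers given by Schuermann:

# Schuermann double-conversion cable tuner plan, as described above.
# LO figures are inferred from the stated IFs, not quoted from the paper.
IF1 = 600.0   # MHz, 1st IF (upconversion)
IF2 = 36.0    # MHz, 2nd IF, roughly the European VIF/SIF midpoint

f_low, f_high = 47.0, 300.0   # MHz, cable input range
print("1st LO:", f_low + IF1, "to", f_high + IF1, "MHz")   # 647 to 900 MHz (supradyne)
print("2nd LO:", IF1 - IF2, "MHz")                         # 564 MHz, one possible (infradyne) choice

# Midpoints of the standard European IF channels, for comparison with 36 MHz:
for system, vif, sif in [("B/C/G/H", 38.9, 33.4), ("I", 39.5, 33.5), ("L", 39.2, 32.7)]:
    print(f"System {system}: VIF/SIF midpoint {(vif + sif) / 2:.2f} MHz")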
 
The paper noted that broadcast TV tuners were and were expected to remain of the single-conversion type.  Thus double-conversion was still a CATV specialty.
 
 
(1) IEEE paper, 1977 May: “Past, Present and Future Trends of TV Tuner Design in Europe”; Joe H. Schuermann (TI Germany).
 
 
 
Cheers,
 
Steve
 
Posted : 02/12/2023 9:51 pm
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
The double conversion TV tuner examples given so far have been specific to cable TV (CATV) reception.  RCA proposed the use of double conversion in broadcast receivers, in its US patent 4408348 of 1983 October 04, filed 1981 August.  One objective was to be able to cover all of the broadcast and cable channels with a single tuner of relatively low cost, rather than the three hitherto required for cable-ready receivers.
 
RCA delineated the required frequency coverage as follows:
 
RCA Frequency Coverage
 
 
 
Clearly, the choice of 1st IF was more constrained than it had been for the CATV-only case.  By this time, CATV channel allocations had been expanded upwards to 402 MHz.
 
RCA chose a 1st VIF of 415.75 MHz.  This was in the 402 to 470 MHz gap between the SB-CATV and UHF bands, and also avoided the 420 to 450 MHz radar band.  The 2nd VIF was the standard 45.75 MHz.
 
Incoming broadcast signal levels were expected to vary over a much wider range, 10 µV to 100 mV, than CATV signals, which were typically in the range 1 to 6 mV.  Inter alia this required the use of input tuning and RF amplification, neither of which was required in the CATV-only case.  Given that the overall tuning range of 54 to 890 MHz was beyond the reach of the varactors of the time, it was split into three ranges, namely 54 to 150, 150 to 402 and 470 to 890 MHz.  The lower and middle ranges encompassed both broadcast and CATV channels, whereas the upper, UHF range was broadcast only, as shown in this block schematic.
 
RCA Multiband TV Tuner
 
 
 
The range of oscillator frequencies required was shown in this table:
 
RCA TV Tuner Oscillator Frequencies
 
 
 
The first conversion was necessarily supradyne.  It involved upconversion for incoming frequencies below 402 MHz, but downconversion for frequencies above 470 MHz.  This combination in a common 1st mixer was probably new to TV practice, although long-established in radio receiver practice.  (For example, European AM broadcast receivers were typically upconversion for LF but downconversion for MF.)  As the supradyne conversion inverted the channel, the 1st SIF was below the 1st VIF, at 411.25 MHz.
 
The second conversion was infradyne, with a 370 MHz local oscillator, to produce the standard 45.75 MHz VIF, 41.25 MHz SIF combination.  There did not appear to be good reason to do otherwise for the 2nd IF: it was a “known quantity” with a very wide range of available filters and active devices to suit it.
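As a cross-check on the above numbers, here is a short Python sketch of the RCA plan.  The band split and IFs are from the patent as described above; the oscillator ranges are simply computed from them (using channel-edge frequencies for brevity), not copied from the patent table:

# RCA US patent 4408348 frequency plan, as described above.
VIF1, SIF1 = 415.75, 411.25   # MHz, 1st IFs (supradyne conversion inverts the channel)
VIF2, SIF2 = 45.75, 41.25     # MHz, standard US 2nd IFs
LO2 = 370.0                   # MHz, fixed 2nd local oscillator

# Supradyne 1st conversion: LO1 sits above the signal by IF1 in all three bands.
for band, lo, hi in [("low", 54.0, 150.0), ("mid", 150.0, 402.0), ("UHF", 470.0, 890.0)]:
    print(f"{band} band: LO1 approx {lo + VIF1:.2f} to {hi + VIF1:.2f} MHz")

# Infradyne 2nd conversion preserves the (already inverted) channel orientation:
assert VIF1 - LO2 == VIF2 and SIF1 - LO2 == SIF2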
 
 
As best I can determine, this RCA idea did not have much influence on actual practice at the time.  It did though establish that the all-channel, broadcast and CATV double-conversion tuner needed to be somewhat different to the CATV-only type.  A 1st IF between the top of the highest CATV band and the bottom of the UHF broadcast band was feasible only if the two had reasonable separation.  Upward movement of the top of the highest CATV band would eventually render such an IF unusable.
 
 
 
Cheers,
 
Steve
 
 
Posted : 02/12/2023 11:30 pm
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 

Posted by: @nuvistor

@synchrodyne Photo of one, photo from UKVRRR.  They were also used when the communal system put BBC1 and ITV 625 onto the VHF channels; it allowed us to install UHF-only sets for customers in the flats.

-- attachment is not available --

 

 
I have found a datasheet for that Labgear VHF-to-UHF converter – see:  http://www.wrightsaerials.tv/albertsattic/Televerta_CM6022RA.pdf.
 
The input frequency range was 40 to 220 MHz, so it covered all of the VHF channels.
 
The nominal oscillator frequency was 550 MHz, giving an output range of 590 to 770 MHz.
 
But the oscillator was adjustable over the range 500 to 600 MHz.  That would have extended the output range to 540 to 820 MHz.  So there was a good probability of finding a “clear” channel free of any detectable broadcast.
 
Thus conversion was supradyne with the sum product being used, in order to preserve channel orientation.
 
In combination with a TV receiver it thus formed a double conversion system, with initial upconversion to a 1st IF in the range 540 to 820 MHz, followed by downconversion in the receiver itself to 39.5 MHz.
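The frequency arithmetic is simple enough to set out explicitly.  A minimal Python sketch, using only the datasheet figures quoted above:

# Labgear Televerta CM6022RA: supradyne sum product, f_out = f_in + f_lo,
# which preserves the channel orientation.
f_in_low, f_in_high = 40.0, 220.0   # MHz, VHF input range
lo_nominal = 550.0                  # MHz, nominal oscillator frequency
lo_min, lo_max = 500.0, 600.0       # MHz, oscillator adjustment range

print("Nominal output:", f_in_low + lo_nominal, "to", f_in_high + lo_nominal, "MHz")  # 590 to 770
print("Extreme output:", f_in_low + lo_min, "to", f_in_high + lo_max, "MHz")          # 540 to 820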
 
The gain was adjustable over the range -5 to +4 dB.  As the oscillator feedback to the input was limited to 2 mV, possibly there was an input buffer ahead of the mixer, with a variable gain stage following.  Maybe a balanced mixer was used, as well.
 
 
Cheers,
 
Steve
 

 

 
Posted : 03/12/2023 4:51 am
Nuvistor
(@nuvistor)
Posts: 4703
Famed Member Registered
 

@synchrodyne I would have seen that many years ago but forgotten the details. 

Steve, there is a full stop at the end of the link which makes it not resolve.

Working link.

http://www.wrightsaerials.tv/albertsattic/Televerta_CM6022RA.pdf

 

 

Frank

 
Posted : 03/12/2023 9:23 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 

Thanks for the link correction! 

 
Posted : 04/12/2023 1:56 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
More-or-less contemporary with the RCA double conversion broadcast and CATV tuner described upthread was another proposal by RF Monolithics (RFM), described in a 1982 November IEEE paper (1).
 
In contrast to RCA’s objective of a simpler and lower-cost all-channel tuner, in this case the goal was much higher performance in respect of susceptibility to various forms of interference.  The development was done under an FCC contract.  When the FCC made its geographical UHF channel assignments in the early 1950s, it assumed the use of the then new standard 45.75 MHz VIF, along with certain typical receiver parameters respecting interference rejection performance.  This resulted in the so-called “UHF taboos”, a list that detailed channels that could not be assigned in adjacent or alternate service areas in order to protect prior assignments.  (A similar list was part of the European UHF channel assignment plan developed at the 1961 ITU Stockholm meeting.)  The FCC wanted to know whether receiver design could be improved to the point where susceptibility to spurious responses was reduced enough to significantly shorten the taboo list, and so obtain greater utilization of the UHF band.
 
RF Monolithics noted the key points of concern that dictated receiver design as being:
 
a) eliminate image frequency problems
b) move the IF beat and half-IF frequencies far from the desired signal
c) greatly increase adjacent channel capability
d) greatly increase CM and IM performance
e) keep the UHF noise figure <10 dB
 
In particular it noted that items (a) and (b) suggested greatly increasing the IF over the current 45 MHz, thus:
 
“Changing the IF from 45 MHz to a much higher IF such as 450 MHz moves the IF-related mixer spurious responses far enough from the desired signal frequency to make it a simple matter to filter them out in the RF front end. For example, a 450-MHz IF would have an image response at 900 MHz above the desired RF signal and an IF beat response 450 MHz from the desired signal.”
 
Higher order mixer spurs were also considered when it came to 1st IF selection.  Whilst it was thought that 1 GHz would be an even better choice, an IF filter at that frequency was considered to be a difficult proposition, and it would not have been able to provide adequate adjacent channel rejection.  Thus with 450 MHz in mind as a broad destination, the following 1st IF boundaries were defined:
 
1) Less than 470 MHz to avoid the UHF-TV band.
2) Greater than 431 MHz to avoid VHF second harmonics.
3) Greater than 442.5 MHz to avoid third-order mixer spurs at channel 83.
4) Twice the 2nd local oscillator frequency to fall between channels.
5) To minimize spurious responses produced by 1st local oscillator mixing with 2nd local oscillator in the second mixer (internal spurious responses).
 
All of these criteria were met by a 448.725 MHz 1st IF choice.  A 2nd local oscillator frequency of 402.975 MHz was required to yield a 45.75 MHz 2nd IF.  The second harmonic of 402.975 MHz is 805.95 MHz, which is 200 kHz above the sound carrier of channel 69, thus avoiding an internal spurious response at that channel or any other.  However, the use of a 448.725 MHz IF did necessitate significant filtering to avoid interference from high-powered government systems and amateur transmitters in that band.  Recall that RCA chose to avoid the 420 to 450 MHz area to avoid any such problems.  But RF Monolithics had placed its emphasis on avoiding spurs of both lower and higher orders.
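The stated criteria can be checked numerically.  A hedged Python sketch (my own arithmetic, reproducing the figures quoted in the paper; the channel 69 aural carrier of 805.75 MHz follows from its 800 to 806 MHz channel):

# Checking the RF Monolithics 1st IF choice against the stated criteria.
IF1 = 448.725    # MHz, 1st IF
IF2 = 45.75      # MHz, 2nd IF
LO2 = IF1 - IF2  # 402.975 MHz, fixed 2nd local oscillator

assert 431.0 < IF1 < 470.0   # criteria 1 and 2: above VHF 2nd harmonics, below the UHF band
assert IF1 > 442.5           # criterion 3: clear of 3rd-order spurs at channel 83

# Criteria 4/5: the 2nd LO's 2nd harmonic should fall clear of any carrier.
ch69_sound = 805.75          # MHz, channel 69 aural carrier
offset_khz = 1000 * (2 * LO2 - ch69_sound)
print(f"2nd LO 2nd harmonic: {2 * LO2} MHz, {offset_khz:.0f} kHz above channel 69 sound")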
 
It may also be noted that whereas the RCA tuner covered all broadcast and CATV channels, the RF Monolithics tuner was designed for the broadcast channels only.  It did not cover CATV channels, other than incidentally those that corresponded with broadcast channels.
 
Here is the receiver block schematic:
 
 
Improved High Performance TV Block Schematic
 
 
As an aside, whilst the front end of the receiver was high-performance, the “back-end” was quite ordinary.  It used quasi-synchronous vision demodulation without an anti-Nyquist filter, and conventional intercarrier sound.  The shortcomings of this approach had already been articulated, particularly in the Fockens & Eilers 1981 IEEE paper.  This was discussed in detail in the “Intercarrier Sound” thread ( https://www.radios-tv.co.uk/community/black-white-tvs/intercarrier-sound/).
 
Much of the RFM paper was about the special devices required for the RF and IF sections of the receiver, so is somewhat out-of-scope here.  A performance comparison chart set the spurious response of the RFM double-conversion receiver against that of a conventional receiver:
 
Improved High Performance TV Measurements
 
 
 
It would appear that the original objectives in respect of spurious response were achieved.  And it did demonstrate that double conversion, carefully executed, was a valid pathway to better performance.  Nonetheless, this receiver did not appear to have had much effect on general TV receiver practice at the time, which mostly continued along the established single-conversion pathway.  Cost was likely a factor.
 
At this stage it could be said that double conversion was customary for CATV converters, in part because it was an easier way to cover the wide and near-continuous frequency spectrum involved.  On the other hand, although the feasibility and benefits of double conversion for broadcast reception had been demonstrated, the generally prevailing receiver performance level was easily achieved with the established single-conversion system and its standard IFs.
 
The above comparison chart also referred to a TI (Texas Instruments) receiver, which was an earlier attempt to realize the high performance double conversion concept.  This was briefly described in an appendix to the above-mentioned IEEE paper, and more fully described in an earlier IEEE paper (2) which I have not seen.
 
The TI receiver used a 346.125 MHz 1st VIF, a 303.975 MHz 2nd local oscillator, and a somewhat non-standard 2nd VIF of 42.15 MHz.
 
TI Receiver Block Schematic
 
 
 
 
It was said that the image frequency, IF-beat, half-IF, and LO radiation problems were addressed by using a high 1st IF of 346 MHz to move the frequency of those responses far enough away from the desired UHF channel to facilitate filtering them out.  In the event though, the choice of 346 MHz for the 1st IF was not optimum, since third-order mixer spurious responses were encountered in the middle of the UHF-TV band (channels 40 through 57).  The worst case encountered was with channel 48 as the desired and channel 49 as the undesired signal.  With a −55 dBm desired level, an undesired level of −28 dBm resulted in barely perceptible interference.  Thus there was a learning curve in respect of the optimum 1st IF placement.
 
 
 
(1) IEEE Paper 1982 November:  “An Improved High-Performance TV Receiver”; Darrell L. Ash (RFM).
(2) IEEE Paper 1978 February:  “High performance receiver”; D.L. Ash (TI)
 
 
 
Cheers,
 
Steve
 
 
Posted : 04/12/2023 2:04 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
By the early 1980s, it appeared that although double conversion had been proposed for the reception of off-air signals in TV receivers, it had little or no use in practice.  On the other hand, double conversion had been the norm for American set-top CATV converters from the start.  This was confirmed in a 1984 IEEE paper by Scientific Atlanta (1).  To quote:
 
“Since set-top converters were first developed the architecture employed has been as shown in figure 1.  Signals from the cable were applied through a low pass filter (not shown) to a balanced mixer.  Here they were mixed with signals from a local oscillator whose frequency was higher than that of the incoming signal.  The resultant first IF was at some conveniently high frequency, usually in the lower portion of the UHF spectrum (which officially extends from 300 MHz to 3 GHz).  This high IF frequency was chosen in preference to the much lower IF used in TV receivers, in order to simplify design of the input filter and in order to prevent the possibility of L.O. radiation in-band.  The first IF signal, after amplification and filtering, was mixed with a fixed tuned second local oscillator to produce a second IF frequency which was the same as the frequency of channel 2, 3, or 4.  In recent times, many people have stopped using channel 2 as an output because that is the second harmonic of the citizens band.”
 
The figure 1 referred to is shown here:
 
Set Top Converter
 
Not provided was a timeline for when this type of unit was introduced, nor information on the early 1st IFs used.  Late 1960s is probably a reasonable estimate for the timing, though.
 
When coupled with a conventional (single-conversion) TV receiver, the combination was of the triple-conversion type.
 
By the time that the paper was written, there had been a move to PLL-based tuning for the 1st local oscillator, as shown in figure 2.  By that time, the device name had migrated to “Set-Top Terminal”.  Also, an AFT loop, working on the 2nd IF, had been added in order to interpolate between the PLL steps of the 1st conversion.
 
Set Top Terminal
 
In respect of the IF choice, it was said:  “A typical terminal which tunes from 54 to 450 MHz will have its first local oscillator frequency range from 668 to 1058 MHz (an IF of 608-614 MHz is used, as this band is reserved for radio astronomy, precluding licensing of UHF transmitters at channel 37).”
 
Within that 608 to 614 MHz IF channel, the VIF was 612.75 MHz and the SIF was 608.25 MHz.  Thus it was inverted as compared with UHF channel 37, for which the vision carrier was at 609.25 MHz and the sound carrier at 611.75 MHz.  The inversion was a consequence of the initial supradyne conversion.
 
The second conversion to channel 2/3/4 required another inversion, so was also necessarily supradyne.  One may calculate the required 2nd local oscillator frequencies as 668.0 MHz (channel 2, vision at 55.25 MHz), 674.0 MHz (channel 3, vision at 61.25 MHz) and 680.0 MHz (channel 4, vision at 67.25 MHz).
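Both conversions can be verified with a little Python; a sketch only, using the figures stated in the paper:

# Scientific-Atlanta set-top converter frequency plan, as described above.
IF1_VIF, IF1_SIF = 612.75, 608.25   # MHz, inverted 1st IF channel (608 to 614 MHz)

# 1st conversion, supradyne: LO1 = input frequency + IF frequency, so the
# 54 to 450 MHz tuning range needs LO1 from 54 + 614 to 450 + 608 MHz:
print("LO1 range:", 54 + 614, "to", 450 + 608, "MHz")   # 668 to 1058

# 2nd conversion to channel 2/3/4 needs a further inversion, hence supradyne:
for ch, vision in [(2, 55.25), (3, 61.25), (4, 67.25)]:
    print(f"Channel {ch}: LO2 = {IF1_VIF + vision} MHz")   # 668.0, 674.0, 680.0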
 
It seems likely that the 612.75 MHz VIF was a modal 1st IF number for CATV converters in the 1980s, although lower numbers had been used previously.
 
Also, it may be seen that the previously discussed RCA KRK-212 CATV tuner of 1973, used as part of cable-ready TV receivers, had followed CATV converter architecture except that the 2nd conversion was to the standard US TV IF channel, and so was necessarily infradyne in order to retain the 1st conversion channel inversion.  Whether or not the RCA 587.75 MHz 1st VIF was drawn from CATV converter practice of the time is unknown.  Possibly the infradyne 2nd conversion was a factor in choosing the optimum 1st VIF, insofar as placement of the 2nd local oscillator frequency would need to be considered.
 
 
(1) IEEE Paper 1984 August:  “Operational Characteristics of Modern Set-Top Terminals”; James O. Farmer (Scientific-Atlanta)
 
 
Cheers,
 
Steve
 
 
Posted : 06/12/2023 2:37 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
Further insights were provided in a 1990 Scientific Atlanta paper (1).
 
To quote from that paper:
 
“Modern cable systems provide up to 78 channels of television, between 54 and 550 MHZ .  A trend toward even higher frequencies has begun, with several manufacturers announcing equipment operational to 750 or 860 MHz.  The joint EIA and NCTA (National Cable Television Association) engineering committee is currently expanding it's [sic] channelization specification (IS-6) to a maximum frequency of 1,000 MHz, even though systems operating that high are not expected for a few years yet. (This effort has been delayed pending information as to the architecture of next generation TV tuners.)”
 
At the same time, it was noted that there was a trend away from set-top converters to cable-ready TV receivers.  That placed a bigger burden on TV receiver tuners.
 
TV tuners and CATV converter tuners had evolved along different pathways, as best suited their respective functions.  TV tuners were single-conversion, whereas CATV tuners were double-conversion, with an initial upconversion to a 1st IF that was above-band.  From that one may infer that the use of double conversion for TV tuners was still minimal in 1990.
 
Tuner Topologies TV & CATV
 
The 1st IF shown for the CATV case was still the 608 to 614 MHz range referred to in the 1984 Scientific Atlanta paper.
 
Again quoting from the paper:
 
“Traditionally TV manufacturers have eschewed the set-top type architecture for TV tuners because of it's cost and the TV's lack of a frequency response requirement.  Some manufacturers are now considering such an architecture for TV tuners as a way to provide a quality tuner capable of tuning continuously from 54 MHz to the top of the UHF band.  Several Japanese manufacturers are considering double conversion TV tuners with a first IF at 965.25 MHz.  This will meet the requirement for a continuously tuned product, but may introduce it's own problem if cable systems begin using frequencies as high as 1 GHz.  The joint EIA/NCTA Channelization specification, IS-6, is being delayed while the committee understands the proposed use of this frequency and what impact this will have on channelization.  Other significant advantages of the set-top tuner architecture include no in-band images and no in-band local oscillator emission.”
 
It is reasonable to assume that the Japanese setmakers involved had done their research, and had chosen 965.25 MHz as an optimum number.  This would have required a 2nd local oscillator frequency of 919.5 MHz – clearly above the 890 MHz top of the UHF band – for downconversion to the standard 45.75 MHz.  It also differed from the previous double-conversion broadcast TV proposals in that the 1st IF was above, not below, the UHF TV band.  Thus it was also suitable for CATV reception, where there was a near-continuous spectrum of channels, and so no gap suitable for a between-bands 1st IF.
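A quick Python check of the implied oscillator numbers (the Japanese 58.75 MHz case is my own speculation, as noted below):

# Japanese setmakers' proposed 1st IF, per the quotation above.
IF1 = 965.25   # MHz, 1st VIF
for market, vif2 in [("US, 45.75 MHz 2nd VIF", 45.75), ("Japan, 58.75 MHz 2nd VIF", 58.75)]:
    lo2 = IF1 - vif2   # infradyne, preserving the inverted channel from the 1st conversion
    print(f"{market}: LO2 = {lo2} MHz")   # 919.5 and 906.5, both above the UHF band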
 
I have not been able to ascertain whether or not the proposed 965.25 MHz 1st IF was actually used in domestic TV receivers.  But I do recall that in the early 1990s, Mitsubishi USA was advertising double-conversion tuners as a feature of its TV receivers and VCRs.  There was thus some chance that it had chosen the 965.25 MHz 1st IF.
 
Be that as it may, the certainties are that by the early 1990s:
 
- The Japanese setmakers at least had given serious consideration to the double-conversion case for broadcast TV reception and had derived a suitable 1st IF.
 
- At least one TV (and VCR) maker was using double-conversion tuners and was advertising the fact.
 
Whether the 965.25 MHz 1st IF proposal was specific to system M and American channelling is unknown.  It might also have been developed with Japanese domestic use in mind, with the standard 58.75 MHz VIF as 2nd IF.
 
Also, the 612.75 MHz 1st VIF, if not standardized, was evidently a commonly used number for North American CATV converters.
 
 
The paper also noted that baseband CATV converters had grown in popularity.  In these, the second conversion was not to channels 2/3/4, but to the standard 45.75 MHz VIF.  With a 612.75 MHz 1st VIF, that would have required a 2nd local oscillator of 567.0 MHz, infradyne so as to retain the 1st conversion channel inversion.  A (re)modulator provided a channel 3 or 4 output where required.
 
A baseband converter feeding into the video and audio inputs of a domestic receiver-monitor would constitute a double conversion system overall.  But such a unit feeding its channel 3/4 output into a conventional TV receiver would be a multiple conversion system: two conversions to baseband, then one up to channel 3 or 4, then one down to baseband in the receiver, four in total.
 
 
 
(1) IEEE Paper 1990 August:  “Specifications for Tuner Design for use in Cable Ready Television Receivers and VCRs”; James O. Farmer (Scientific-Atlanta).
 
 
 
Cheers,
 
Steve
 
 
Posted : 06/12/2023 2:42 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
A brief 1988 IEEE paper on cable-ready TV receivers (1) included the following interesting commentary:
 
“Where single conversion tuners are subjected to many contiguous cable channels, as is the case for a cable system, second and third order distortions will be difficult to suppress to a level not visible in the television pictures.
 
“Cable Television converters utilize a double conversion tuner to resolve this problem.  It has been reported by at least one tuner manufacturer that the incremental cost [of double conversion] is approximately $3.50 over that of a single conversion tuner.”
 
A reasonable take from that is that double conversion tuners were certainly worthwhile for “cable ready” receivers that met that description in terms of performance as well as facilities, and that single-conversion tuners, even where providing very good performance on broadcast channels, might not do so on cable channels.  But the cost increment of double conversion might have been an impediment in what might be called the “mass market”, where margins were thin.
 
 
Against this, one may see some logic in RCA’s previously described 1973 approach, which was to use a double-conversion tuner for the CATV channels, but conventional single-conversion tuners for the VHF and UHF broadcast channels.  For dual cable systems the VHF tuner also doubled as the second CATV tuner.  That aspect would seem to have been contra-indicated on the issue of second and third order distortions.  But the VHF tuner in that receiver was of the conventional, mechanically switch-tuned type, even though the CATV and UHF tuners were of the varactor type, and even though RCA did have varactor VHF tuners in its inventory.  One may wonder whether the mechanical VHF tuner was chosen because it had better spurious response rejection than the varactor types of the time.
 
 
 
(1) IEEE Paper 1988 March: “Cableready Television Sets – The Myth Continues”; Earl Langenberg (American Television and Communications)
 
 
Cheers,
 
Steve
 
 
Posted : 07/12/2023 7:36 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
A potential drawback in respect of the use of double conversion tuners in broadcast TV receivers was the relatively high level of phase noise introduced by wide frequency range upconversion oscillators, at least of the type typically used in CATV converters.  This made the use of intercarrier sound mandatory in order to obtain acceptable performance.  Yet that was counter to the then-recent trends in which improved sound channel performance, suitable for stereo, was obtained by moving away from the conventional intercarrier system.  Evidently adequately low phase noise could be obtained using conventional single-conversion VHF and UHF tuner architecture.
 
The situation was revealed by this commentary in a 1987 IEEE paper (1), my emphasis:
 
“RF set-top converters should not damage the BTSC signal, so long as an intercarrier detector is employed in the receiver. The oscillators in the set-top terminal tend to introduce phase noise to the picture and sound carriers. This is especially true if the oscillator is phaselocked. In wide bandwidth set-tops covering 50-550 MHz, the first LO tunes from 668 to 1166 MHz, 75% of an octave. With this wide a tuning range and consumer size dollars to spend, the PLLs usually introduce enough phase noise that direct detection of the sound signal would introduce unacceptable noise. The same can be said for many game and VCR modulators. Since the picture and sound carriers are simultaneously passed through the mixer, they will receive identical phase noise. In the TV set the picture and sound carriers are mixed to obtain the 4.5 MHz sound signal in an intercarrier detector. This process removes the phase noise from the 4.5 MHz signal, so quality detection is possible. Intercarrier detection is universal in TV receivers and in VCRs, as well as in many TV tuner/decoder units. However, we are aware of at least one case in which a manufacturer marketed a TV band audio tuner which directly detected the sound carrier. Noise from this device was totally unacceptable when a set-top terminal was used ahead of it.”
 
More comment was provided in a recent post in the “Intercarrier Sound” thread (2).
 
From that one may deduce that for broadcast reception, at least at the higher quality end, better performance was required of double conversion TV tuners than was typically provided by CATV set-top converters.  That may have incurred a non-trivial cost increment that was a barrier to the widespread adoption of this technique.  I think that it would be reasonable to assume that the Mitsubishi double conversion tuners of the early 1990s were no worse, in phase noise terms, than concurrent higher performance single-conversion tuners.  But I have yet to find any significant information about these units.
 
 
(1) IEEE Paper 1987 February: “Cable and BTSC Stereo”; James O. Farmer (Scientific-Atlanta, Inc.) & Alex B. Best (Cox Communications, Inc.)
 
(2) The “Intercarrier Sound” thread: https://www.radios-tv.co.uk/community/black-white-tvs/intercarrier-sound/
 
 
Cheers,
 
Steve
 
 
Posted : 09/12/2023 9:59 pm
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
At least based upon the information that I have found to date, it would appear that by the 1990s, double conversion TV tuners for broadcast reception had found some applications, but single conversion remained the majority choice.  On the other hand, double conversion was the long-established norm for CATV set-top boxes, where lower performance, especially in respect of oscillator phase noise, was accepted.  Possibly, for broadcast reception, the advantages offered by well-executed double conversion were offset by the higher cost of doing it at a suitably high performance level.
 
The FCC’s earlier interest in double-conversion tuners was as a way of increasing UHF band utilization, as noted upthread.  But by 1990 or so, with digital TV (DTV) on the horizon, the need to advance along that vector had fallen away.  Thus double-conversion would then stand or fall on its performance/cost merits within the TV receiver envelope.
 
For the record, in respect of UHF band utilization, here is the list of the North American UHF “taboos”.
 
FCC UHF Taboos
 
For any given UHF channel, except those near the band edges, there were 18 other UHF channels that were preferably not allocated in the same or nearby areas.  These all stemmed from the standard IF choice, 41.25 MHz sound, 45.75 MHz vision.  The use of different IFs placed in the sub-VHF low band region would simply shift the taboos around, and not remove them.  The FCC had not made the US UHF allocations until the industry had standardized on a new IF, following the realization that the 1945-46 choice of 21.25-21.9 MHz sound, 25.75-26.4 MHz vision was unsatisfactory.  The technology of the time would not have allowed other than a sub-low band IF within the economic envelope for domestic TV receivers, and in fact that remained the case for a couple of decades at least.
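As an illustration of how the IF-related taboos follow from the 45.75 MHz choice, here is a rough Python sketch.  It is my own arithmetic, covering only the LO and image mechanisms; the full FCC list also involved intermodulation, sound image and other effects not computed here:

# Which UHF channels a receiver's own LO and image responses land on,
# for a desired channel n.  US UHF: channel 14 starts at 470 MHz, 6 MHz per channel.
def channel_of(freq_mhz):
    return 14 + int((freq_mhz - 470.0) // 6.0)

IF_VISION = 45.75                        # MHz, standard vision IF
n = 30                                   # example desired channel
vision = 470.0 + (n - 14) * 6.0 + 1.25   # vision carrier of channel n

print("LO falls in channel", channel_of(vision + IF_VISION))        # n + 7
print("Image falls in channel", channel_of(vision + 2 * IF_VISION)) # n + 15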
 
But it may be seen that initial upconversion to a 1st IF above the UHF band would have obviated the conflicts.  So too would initial downconversion (of the UHF channels) to a 1st IF just below the bottom of the UHF band (and anyway above 420 MHz).
 
The 41.25/45.75 MHz IF was a “best fit” for the North American VHF TV channels, but even so was not without its interference possibilities, as shown in this chart:
 
US VHF TV Interferences
 
 
Of those, only four were IF related.
 
 
In the European case, there was also a list of UHF exclusions developed at the 1961 ITU Stockholm European VHF-UHF meeting (ST61):
 
ITU ST61 UHF Exclusions
 
That chart was based upon the expected receiver IFs as detailed in the individual country submissions.  The difference in channel numbers was in either the upward or downward direction, according to the direction of the IF channel, i.e. either VIF low or VIF high.  
 
For systems G and H, other than in Italy, the exclusions were based upon the established and standard 33.4 MHz SIF, 38.9 MHz VIF.  Thus the channels suffering from potential interference were above the reference channel.
 
In the Italian case, both the European standard IF and the Italian standard 41.25 MHz SIF, 46.75 MHz VIF were considered, hence the greater number of exclusions.  Again, the affected channels were above the reference channel.
 
For system L, France, the applicable IF was 39.2 MHz SIF, 32.7 MHz VIF.  In this case, the affected channels were below the reference channel.
 
With system I for the UK and Ireland, the exclusions covered the then possibility that the IF would be in the range 33.0-33.5 MHz SIF, 39.0-39.5 MHz VIF, before the 33.5/39.5 MHz combination was chosen, although in practice there was no difference between the two as far as the affected channels were concerned, these being above the reference channel.
 
 
For the Eastern Bloc system K, both the old 27.75 MHz SIF, 34.25 MHz VIF and the incoming 31.5 MHz SIF, 38.0 MHz VIF were accounted for.  Here the affected channels were above the reference channels.  Pertinent to the recent subset of postings, a third case was considered, this being of the double conversion type.  This was mentioned in the USSR submission to ST61:
 
USSR ST61 Submission
 
 
 
Downconversion from a UHF channel to a low VHF channel was in fact the same approach as had seen limited use in the USA in the early days of UHF.  Here the initial conversion would have been infradyne in order to preserve the channel direction.  Thus the affected channels would have been below the reference.  The channel O1, O2 and O3 vision carriers were respectively 49.75, 59.25 and 77.25 MHz, so the first conversion local oscillator would have been below the wanted channel vision carrier by those amounts.  The second conversion would have been normal for the O1, O2 and O3 channels.  The extent to which this form of double conversion was actually used is unknown.
 
 
Returning to the 1990s, the advent of DTV resulted in the development of TV receivers capable of accepting both analogue and digital transmissions.  One such – for North American use - was briefly described in the Mitsubishi Electric house magazine “Advance” 1998 December issue.  It was said to have a double conversion tuner with a 920 MHz 1st IF and a 44 MHz 2nd IF.  As far as I know, IF channels for DTV receivers were usually specified by their centre frequency.  44 MHz is the centre frequency of the standard North American IF channel, 41 to 47 MHz, so a reasonable assumption is that the established 41.25 MHz SIF, 45.75 MHz VIF combination was used for analogue reception.  Assuming that 920 MHz was the centre frequency of the 1st IF, then that implies a 917 to 923 MHz channel, with 917.25 MHz SIF and 921.75 MHz VIF for the analogue case.  That also assumes supradyne conversion, which was almost certainly the case to keep the oscillator frequency always above the UHF band.  Thus the second conversion would have been infradyne, with an 876.00 MHz local oscillator frequency, just above the 870 MHz top of the North American UHF band.
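Translating the quoted centre frequencies into carrier terms, a small Python sketch of my reading (the channel-centre interpretation is an assumption, as noted above):

# Mitsubishi analogue/DTV receiver IF plan, assuming the quoted 920 and 44 MHz
# figures are 6 MHz channel-centre frequencies.
IF1_centre = 920.0
IF1_VIF = IF1_centre + 1.75   # 921.75 MHz: in an inverted channel the vision carrier
IF1_SIF = IF1_centre - 2.75   # sits 1.75 MHz above centre, the sound 2.75 MHz below
LO2 = IF1_VIF - 45.75         # 876.0 MHz, infradyne 2nd conversion to the standard IF

print(f"1st VIF {IF1_VIF} MHz, 1st SIF {IF1_SIF} MHz, 2nd LO {LO2} MHz")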
 
Whether that “920 MHz” 1st IF was inherited from the preceding Mitsubishi analogue-only TV receivers and VCRs with double conversion tuners, I do not know, but it seems possible, if not probable.
 
 
As previously noted, the US standard IFs for radio and TV receivers were promulgated by the industry trade association known variously and sequentially as the RMA, RTMA, RETMA, EIA and CEA.  In the 20th century, the last update had been RETMA Recommendation REC-109C, in 1955.  This became EIA REC-109C, without change, when the RETMA became the EIA in 1957.  In 2003 it became CEA REC-109C, again without change.  CEA did eventually undertake a review and update, which resulted in the issue of ANSI/CEA-109-D, as a standard rather than a recommendation, in 2010 June.  The details of this standard remain stubbornly unfindable (other than by purchasing a copy).  Nevertheless, one may glean some information from the available preview:
 
ANSI CEA 109 D Contents
 
 
 
Item 5.3 indicates that there was some standardization of IFs for double conversion analogue TV receivers, and suggests that similar numbers may have been used for both the analogue and digital cases.  Apparently CEA-109-D became CTA-109-D in 2015.
 
 
The foregoing completes the planned subsection on (analogue TV receiver) double conversion.  Clearly there are significant gaps, and I’ll infill these as and when I find the requisite information.
 
 
Cheers,
 
Steve
 
 
Posted : 15/12/2023 10:53 pm
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
 
As this subsection on double conversion has included adaptors (e.g. Band I-to-Band III, UHF-to-VHF, VHF-to-UHF) as well as regular TV receivers, for the sake of completeness it seems appropriate to mention the small number of VHF-UHF communications receivers that had optional TV adaptors and which used double conversion.
 
Two such were the ICOM IC-R7000 (of 1986) and the R9000 (of 1991).  More complete descriptions of the IF systems are available in the “Radio Receiver Intermediate Frequencies” thread at:  https://www.radios-tv.co.uk/community/postid/119210/ (R7000); and:  https://www.radios-tv.co.uk/community/postid/119316/ (R9000).
 
Given that these were radio receivers with TV reception as an accessory function, the IFs were chosen to address the radio requirements, with the TV units then simply following the same pattern, without any further adjustment.  Both the R7000 and R9000 TV adaptors were available in two versions, one for systems M/N, with 4.5 MHz vision-sound carrier separation, and the other for systems B/G/H, with 5.5 MHz vision-sound carrier separation.
 
Basic VHF-UHF coverage was 25 to 1000 MHz for the R7000, and 30 to 1000 MHz for the R9000.  So, both covered all VHF and UHF terrestrial broadcasting channels, and all VHF and UHF cable channels.  (The R9000 also had HF coverage from 0.1 to 30 MHz.)
 
Both were of the double/triple conversion type, double conversion as far as the TV function was concerned.  Each had two 1st IFs according to the frequency range.
 
In the R7000 case, the 1st (sound) IFs, obtained by supradyne conversion, were 778.7 MHz for the frequency range below 512 MHz, and 266.7 MHz above 512 MHz.  The corresponding numbers for the R9000 were 778.7 MHz above 500 MHz, and 278.7 MHz for the range 30 to 500 MHz.
 
In both cases the 2nd IF, obtained by infradyne conversion, was 10.7 MHz for all input frequencies.  The feed to their respective TV adaptors was taken from the start of the 10.7 MHz 2nd IF section.  The TV sound carriers were at 10.7 MHz, putting the vision carriers at 15.2 MHz for systems M/N or 16.2 MHz for systems B/G/H.  Final TV processing was done at these unusual vision frequencies, using an otherwise conventional IF strip, although it was also very unusual for systems M/N and B/G/H in having the SIF below the VIF.  In the R7000 case, this IF strip processed intercarrier sound as well as vision.  Split sound was also available via the 10.7 MHz wide FM IF branch.  In the R9000 case, only vision was processed by the TV unit, split sound only being available via the 10.7 MHz wide FM IF branch.  Final processing of split sound at 10.7 MHz aligned with what was done in Japanese high quality outboard TV tuners in the later 1970s.
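The relationship between the final IFs is easily tabulated; a minimal Python sketch based on the figures above:

# ICOM R7000/R9000 TV IF endpoints.  After the infradyne 2nd conversion the
# sound carrier sits at 10.7 MHz with the vision carrier above it, spaced by
# the system's vision-sound separation.
SIF2 = 10.7   # MHz, common 2nd (sound) IF
for systems, spacing in [("M/N", 4.5), ("B/G/H", 5.5)]:
    print(f"Systems {systems}: VIF = {SIF2 + spacing} MHz")   # 15.2 and 16.2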
 
The full set of IFs applicable to TV reception is shown in this chart:
 
R7000, R9000 TV IF Chart
 
A contemporary of the ICOM R7000 was the Yaesu FRG-9600 VHF-UHF communications receiver, double/triple conversion on the radio side, and with an optional system M/N TV adaptor.  The latter though was single conversion on the vision side, but double conversion on the (split) sound side, as sound was processed via the wide FM 10.7 MHz IF branch.  IFs for the TV pathway were 45.754 and 10.7 MHz sound, and 50.254 MHz vision.  More information on the FRG-9600 IF system is available in the “Radio Receiver Intermediate Frequencies” thread at:  https://www.radios-tv.co.uk/community/postid/119326/
 
 
 
Cheers,
 
Steve
 
 
Posted : 30/12/2023 8:05 am
Pieter H
(@pieter-h)
Posts: 18
Eminent Member Registered
 

Hi Steve,

I haven't been in these discussions for quite a while, my apologies.  But single versus dual conversion is an interesting discussion, and one in which I was deeply involved while with Philips Tuners.  This was the late 1990s, when digital cable was being introduced.  It all brings back memories, and you will find more about it in my Tuner History books.

Dual conversion fundamentally had, and has, many drawbacks: the wideband input gives a very bad noise figure, the dual conversion with two LOs doubles the phase noise (which was especially an issue with QAM), and there was always the issue of oscillator drift and higher order beats.  These problems could partly be solved using dual (!) PLLs, while linearity requirements dictated high power LOs.  But the biggest issue was the low dynamic range, which made them unsuited to weak off-air signals.  They were, and still are, therefore unusable for broadcast reception.  Your suggestion that either Japanese or European TV makers were considering them surprises me, since this would make the TV set fit only for cable reception.  I can only see that solution as a kind of local solution in areas with close to 100% cable coverage.  But I can't remember ever having seen a TV set with a dual conversion tuner.

In practice they were therefore only used in US cable modems, not in the TV set, which continued to use standard tuners.  The cable modem output to the TV was either baseband video or a fixed VHF channel, as in VCRs.  When digital cable QAM came up we tried very hard to get our single conversion tuners into US cable modems, but failed completely.  Although we met all specs, dual conversion in the US was a kind of religion, no longer based on facts, but probably a means of market protection.  To meet specs and push integration they increasingly used GaAs MMICs for up and down mixing and the LOs, which further increased the power consumption and the cost, making them twice as expensive as single conversion tuners.  So, in the end, even with the introduction of digital reception the global tuner concept partitioning did not change: everywhere single conversion tuners for broadcast & cable, analog & digital, except for the US cable market, which used dual conversion.  US cable users thus paid a (relatively) high price for their standards "Alleingang".

As a last side note, it is interesting to know that within Philips there was also a single - but big, and very expensive - dual conversion tuner development: the infamous Weinert tuner project at Philips Research Labs during the early 1970s.  This project was an effort to develop the first integrated tuner, for which the dual conversion concept was chosen in order to eliminate tuneable filters in the architecture.  But despite major attempts, the tuner suffered from exactly the same drawbacks as mentioned above, and the project was stopped.

Cheers, Pieter

 
Posted : 07/03/2024 8:12 pm
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
Hi Pieter:
 
Many thanks for that, which fills a large gap.  I have not been able to find very much information on dual-conversion TV tuners, and very little of what I have found indicates their prevalence or otherwise.  Perhaps that lack of readily available information simply reflects the fact that they were not used very much.
 
That said, Mitsubishi appears to have been a major proponent, having used the technique in analogue TV receivers and VCRs from the early 1990s.
 
The earliest reference that I can find is this American market advertisement from Stereo Review magazine 1992 April:
 
Mitsubishi Stereo Review 199204
 
 
 
The comment “While our dual conversion tuner improves the image rejection ratio, along with various other cable signal irregularities” does suggest that cable reception was the primary reason for using dual conversion.
 
Whether Mitsubishi ever used dual conversion tuners for receivers destined for other markets, such as Japan and Europe, I do not know.
 
 
Cheers,
 
Steve
 
Posted : 09/03/2024 11:13 pm
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
From this post of 2022 September 24:
 
 
>>>>>
 
“Industry standards for IF for radio, automotive radio and FM receivers have long been in effect and are universally used by all manufacturers.  Such standardization has resulted in simplification in the design, the testing and the manufacture of radio receivers. It, also, has helped to simplify the problem of interference to and from radio receivers.
 
“The problem of standardizing an IF for TV receivers was considered by the RMA Committee on Television Receivers, R4, for the same reasons.  After a thorough study, the Committee recommended, on October 31, 1945, that a narrow band 21.25 to 21.9 mc be established as the limits of a standard IF for TV receivers.  It should be remembered that at the time TV Channel 1 occupied the frequency band 44 to 50 mc.  On May 6, 1948, Channel 1 was eliminated by FCC action.
 
“Because of the interference problems of the 21.25 to 21.9 IF, the Committee on Television Receivers, R4, again reviewed the IF problem and on June 22, 1949 recommended the standardization of 41.25 mc as the IF for TV receivers.  At this time the only other service operating within the 21 to 26 mc IF band was low power state police transmitters.  The recommendations of the Committee were later embodied into the RMA Standard Intermediate Frequency REC 109.”
 
>>>>>
 
Some more background on the 21.25 to 21.9 MHz case was provided in a 1946 January IRE paper:  “Capacitance-Coupled Intermediate-Frequency Amplifiers”, by Merwin J. Larsen and Lynn L. Merrill, both of Stromberg Carlson.  To quote:
 
“A frequency of 21.6 megacycles is used for the sound carrier of the amplifier to be discussed. This frequency was advocated by Newlon (*) in a report presented to the Radio Manufacturers Association committee on Television Receivers. The sound carrier selected lies midway in the 21.25- to 21.9-megacycle region formulated as a standards proposal by the executive committee of the receiver section of the RMA engineering department. With the sound carrier at 21.6 megacycles, the picture carrier is 26.1 megacycles.”
 
This report was noted as:  A. E. Newlon, "Selection of television sound and picture intermediate frequencies," Report to Radio Manufacturers Association Committee on Television Receivers, August, 1945.
 
From that we may deduce that the RMA had moved very quickly to develop a standard TV receiver IF once the FCC had defined the revised set of channel frequencies in 1945 July.
 
One could say that the US industry had very early recognized the benefits of having a standard TV receiver IF, which was developed and in place by the end of 1945.   And that when the initial choice was found to be less than ideal, it quickly developed a new standard IF, one that turned out to be quite durable.
 
That lesson appears to have been missed at the time by most of the European countries.  With the exception of Italy, none of the participants in the 1952 ITU Stockholm VHF allocations meeting included actual or proposed standard IFs in their system and channel planning submissions.  Outside of Europe, it is known that Australia included a standard IF channel in its initial TV channel allocations planning.  Japan probably did the same, as its initial standard IF does seem to date from the inception of TV broadcasting there.  New Zealand chose the by-then established CCIR standard IF for its system B TV broadcasting, and probably many other later starters did something similar, choosing as part of their planning an established IF appropriate to their system.  South Africa chose a new 38.9 MHz VIF, 32.9 MHz SIF combination for its system I network, and an ad hoc IF was developed for system K1 in Africa.
 
 
Returning to the original RMA standard: as noted, this was specified to have an SIF in the range 21.25 to 21.9 MHz.  I think that it is pure coincidence, but it happened that the 21.4 MHz IF number, found with reasonable frequency in some VHF and UHF communications equipment from the late 1960s (although first used earlier than that), fell into that range.
 
 
Cheers,
 
Steve
 
Posted : 14/09/2024 12:35 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 

A 1991 IEEE paper (1) by Philips Germany about a new TV vision and sound IF processing IC provided some background on the IF situation at the time.
 
It included the following commentary:
 
“For the time being, about ten international TV-standards exist, although each of them make use of the same basic principles.  These standards make use of different IF frequencies between 32.7 MHz and 58.75 MHz for vision IF signal processing and of a FM sound IF signal processing with intercarrier IF frequencies between 4.5 MHz and 6.5 MHz.”
 
The then-current TV standards were systems B, D, G, H, I, K, K1, L, M and N, so that was ten in total.  Philips’ use of the term “about ten” may have reflected the fact that in some cases the differences were minor, so that for example systems B/G/H and D/K/K1 could be respectively grouped together for TV receiver design purposes.
 
At the time, 32.7 MHz for system L was the lowest standard VIF in use, and 58.75 MHz, for system M in Japan only, was the highest.
 
The FM intercarrier frequency range was mentioned by Philips in part because a particular feature of the IC which was the subject of the paper was that it had a PLL-type FM intercarrier demodulator that could accept any incoming frequency between 4 and 8 MHz, thus more than covering those actually in use.
 
The “worked example” in the paper showed the multistandard B/G and L case, with a 38.9 MHz VIF for all three systems, and a 32.4 MHz SIF for system L.
 
Philips Advanced VIF SIF IC Application
 
This application would not have dealt with the Band I L’ channels from a conventional tuner unless they had been subjected to prior inverting conversion.  The intercarrier output would have allowed for separate processing of a second audio channel FM intercarrier, and/or a NICAM intercarrier.
 
Note that the Philips IC at interest was not identified by its marketing designation in the paper.
 
(1) IEEE Paper 1991 August:  “An Advanced 5V VIF-SIF PLL for Signal Detection in TV Sets and VTRs”; Joachim Brilka, Joachim Keibel and Wolfgang Weltersbach (Philips Semiconductors, Hamburg, Germany).
 
Cheers,
 
Steve

 
Posted : 09/11/2024 7:07 am
Synchrodyne
(@synchrodyne)
Posts: 537
Honorable Member Registered
Topic starter
 
Second conversion of the sound IF alone has been covered previously.  It was, for example, done on some early Belgian four-system receivers, some British TV-FM receivers, and later on some Japanese split-sound TV receivers.
 
There were also some much later examples of second conversion of the FM intercarrier.
 
Two examples were described in 1991 IEEE papers, one by Siemens (1) and one by Thomson (2).  Both dealt with new multistandard TV processing ICs, VIF-SIF in the Siemens case and SIF-only in the Thomson case.  Inherent in that was the need to accommodate FM intercarriers of 4.5, 5.5, 6.0 and 6.5 MHz for the primary (mono) sound channels.
 
The Siemens IC was of the quasi-parallel type, with FPLL vision demodulation, the vision FPLL also providing the “local oscillator” for intercarrier generation.  There was a single on-IC intercarrier channel which, by choice of bandpass filter and quadrature tank circuit parameters, could be set to operate at any one of the standard intercarrier frequencies.  Any of the others that needed to be accommodated were first converted to the initially selected frequency using a PLL oscillator and mixer.  The oscillator frequency range spanned 10 to 12 MHz.  This allowed all of the conversions that could be needed, except between 6.0 and 6.5 MHz in either direction, as shown in this table:
 
Siemens Intercarrier Conversion Matrix
 
 
 
In practice I imagine that the output intercarrier frequency chosen would most often have been 4.5 or 5.5 MHz.  In a general sense, these second conversions were independent of TV system parameters.  But in practice, as it was arranged that there was no conversion for one of the systems, the second conversions were made to a frequency that was an intercarrier for that unconverted system, and so determined by a TV system’s parameters rather than being a wholly independent choice.
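The conversion matrix follows directly from the 10 to 12 MHz oscillator range, since mixing an input intercarrier to the output frequency requires a local oscillator at the sum of the two.  A small Python sketch reconstructing it:

# Siemens intercarrier conversion possibilities: LO = f_in + f_out must fall
# within the 10 to 12 MHz PLL oscillator range.
carriers = [4.5, 5.5, 6.0, 6.5]   # MHz, the standard intercarrier frequencies
for f_in in carriers:
    for f_out in carriers:
        if f_in == f_out:
            continue   # no conversion needed
        lo = f_in + f_out
        ok = 10.0 <= lo <= 12.0
        print(f"{f_in} -> {f_out} MHz: LO = {lo} MHz {'(available)' if ok else '(not available)'}")
# Only the 6.0 <-> 6.5 MHz cases fail, as they would need a 12.5 MHz LO.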
 
Here is a block schematic of the Siemens IC:
 
 
Siemens VIF SIF IC
 
 
There were two final intercarrier channels, thus Zweiton systems could be accommodated.  The second final channel would be 242 kHz above the main channel for the system B/G version of Zweiton, and as far as I know, 240 kHz above it for the Korean system M version.  That slight difference would have been accommodated by appropriate choice of the bandpass filter and quadrature tank circuit tuning.
 
 
As said, the Thomson IC was SIF only, and was of the quasi-split type, again with FPLL vision carrier recovery.  Here the intercarrier, 4.5, 5.5, 6.0 or 6.5 MHz according to system, was downconverted to the relatively low number of 500 kHz.  The conversion was supradyne, using a PLL oscillator.  The same oscillator frequency thus converted the second channel, where used, to 242 or 240 kHz.
 
These relatively low final IFs allowed complete on-IC processing, including the bandpass filters and demodulators.
 
Here is the block schematic for the Thomson IC:
 
 
Thomson SIF IC
 
 
Thus, we may add 500 kHz and 242/240 kHz to the TV receiver SIF list.  Although these were converted from intercarrier frequencies, themselves determined by TV system parameters, the final numbers were chosen independently of the latter, except in respect of the relationship between the main and second audio channels.
 
 
Note that neither the Siemens nor the Thomson ICs at interest were identified in the papers by their respective marketing designations.
 
 
 
(1) IEEE Paper 1991 August:  “A World Standard Video and Sound IF IC”; R. Heymann and H. Kriedt (Siemens AG, Germany).
 
(2) IEEE Paper 1991:  “Advanced Multistandard TV-Sound IF Integrated Circuit”; M.A. Rieger and R.K. Koblitz (Thomson Consumer Electronics R&D Laboratories).
 
 
Cheers,
 
Steve
 
 
Posted : 09/11/2024 7:13 am