[Sticky] Radio Receiver Intermediate Frequencies

Nuvistor
(@nuvistor)
Posts: 4594
Famed Member Registered
 

Get a lot of electronics in close proximity and it must be difficult keeping spurious signals out, although the way they are made they have lots of shielding.

Your speculation may well be spot on.

 

Frank

 
Posted : 19/12/2018 11:10 pm
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

Spurs were probably a limiting factor in the application of the multiple conversion approach, particularly in the valve era when the need for adequate cooling limited the amount of screening that could be provided.

At least anecdotally it has been observed that Racal ran into quite a few problems of this kind when designing the RA17, but evidently they were not insoluble. The Eddystone 880 high-stability HF receiver used a Collins-type circuit with two 1 MHz wide variable first IF bands and a second IF of 500 kHz, this approach minimizing the number of oscillator crystals required. That probably required careful internal screening to avoid deleterious spurs, and Eddystone also achieved negligible oscillator radiation outside of the receiver.

But even in the solid-state era, such problems were not unknown. I have seen it said that the optional VHF adaptor for the JRC NRD525 HF receiver produced a lot of self-interference. And when the Heathkit GR78 was discussed upthread, it was speculated that although double conversion could have been used to advantage on the two highest frequency bands, it was in fact used only for the highest frequency band in order to avoid possible self-interference problems in a simple receiver where individually screened compartments were not justified.

Also with the solid-state era came more complex tuning systems that could in and of themselves create spurs.

The Racal RA17 had been mentioned, so some analysis of its IF choices may be appropriate. Logically, the first IF band (1.3 MHz wide) had to be above the signal frequency range if it were to be used at all signal frequencies. And it had to be sufficiently far above the highest signal frequency that the latter was well down the lower slope of the first IF bandpass curve. Evidently 40 MHz nominal, with the lower edge of the bandpass at 39.35 MHz, met that requirement. Supradyne operation was also indicated, both to keep the first VFO frequency out of the signal frequency range and to minimize the first VFO swing ratio.

The second VFO frequency of 37.5 MHz was derived by mixing the output of the first VFO with the appropriate 1 MHz harmonic. Given that for correct error correction operation, the second VFO was obtained by the difference between the first VFO and a 1 MHz harmonic, and that the first VFO operated between 40.5 and 69.5 MHz, then the second VFO frequency was necessarily below that of the first IF, which meant infradyne operation. 37.5 MHz was probably as high as could be used without getting too close to the first IF band. The next available second VFO frequency in the upwards direction, 38.5 MHz, was likely too close to the IF band, and undesirable for other reasons. Inevitably the second IF band was going to be within the HF band. The range actually chosen, 2 to 3 MHz, was a quieter region of the HF band. A 38.5 MHz second VFO would have produced a 1 to 2 MHz band that overlapped the MF broadcast band. Going higher than 3 MHz would have gotten into the busier part of the HF band, and would have made image rejection for the final conversion more difficult.
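For anyone who wants to check the arithmetic, here is a minimal Python sketch of the RA17 frequency plan using only the figures quoted above; the whole-MHz band-setting convention (0 to 29) is my own assumption for illustration.

```python
# Minimal numeric sketch of the RA17 Wadley-loop frequency plan,
# using only the figures quoted above (all values in MHz).

FIRST_IF_CENTRE = 40.0       # first IF, 39.35 to 40.65 MHz band
SECOND_VFO = 37.5            # first VFO minus a 1 MHz harmonic
SECOND_IF_BAND = (2.0, 3.0)  # second IF band

def ra17_plan(mhz_setting):
    """Frequencies for a whole-MHz band setting (assumed 0..29).

    The first VFO sits at a half-MHz point so that the wanted
    1 MHz-wide slice of spectrum lands on the 40 MHz first IF.
    """
    first_vfo = 40.5 + mhz_setting           # 40.5 .. 69.5 MHz
    harmonic = first_vfo - SECOND_VFO        # the 1 MHz harmonic used
    second_if_centre = FIRST_IF_CENTRE - SECOND_VFO
    return first_vfo, harmonic, second_if_centre

# Example: the 10-11 MHz band setting
vfo, harm, sif = ra17_plan(10)
print(vfo, harm, sif)   # 50.5 13.0 2.5
```

Note that the harmonic comes out as a whole number (3 to 32 MHz) for every setting, which is just the Wadley error-cancelling condition restated.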

The final conversion was from the 2 to 3 MHz band to 100 kHz. I imagine that 100 kHz was chosen because at that frequency, all but the narrowest IF filters could be LC rather than crystal, and it also suited the use of the RA17 with external adaptors for ISB reception, etc. At the time, 100 kHz was a commonly used IF for final processing in point-to-point ISB/SSB receivers. Clearly with conversion from a 2 to 3 MHz range to 100 kHz, images were in-band, and the RA17 had a tunable triple-bandpass filter between the second and third mixers.

Clearly, the 1 MHz harmonics (up to 32 MHz) had to be stopped from wreaking havoc elsewhere in the receiver. Having the first VFO moving in 1 MHz steps at half-MHz points (40.5 to 69.5 MHz) and the second VFO at a half-MHz point (37.5 MHz) may have helped in this regard, as the half-MHz points were as far away as possible from the 1 MHz harmonics. Unavoidably then the first IF was centred on a 1 MHz harmonic (40 MHz), but this was probably sufficiently far above the 32 MHz harmonic low-pass filter that it was not a major problem. But that could have been another lower-bound constraint on the first IF choice.

Placing the second IF (2 to 3 MHz) between two 1 MHz harmonics looks as if it were another way of minimizing harmonic interference. Clearly, this IF band had to be sheltered from incoming HF signals, but that was not a new problem, as, for example, receivers that followed the Collins-type approach also had 1 MHz wide IF bands – in fact usually two of them.

That the first and second mixers were wideband created the possibility of spur production. As far as I know Racal used special-quality valves (E180F?) for these mixers, whereas a conventional 6BE6 sufficed for the third mixer.

It is not surprising then that the RA17 had a rather complex mechanical construction with multiple screened compartments.

from WW 195708 p.389

Racal made a change with the RA117 variant in the RA17 series. Here the second IF of 2 to 3 MHz was first converted to a third IF of 1.6 MHz, and then to a fourth IF of 100 kHz. In this case the 2 to 3 MHz went through a fixed bandpass filter to the third mixer, which was fed from the third VFO, operating supradyne in the range 3.6 to 4.6 MHz. There was a relatively narrow 1.6 MHz bandpass filter between the third and fourth mixers. Doing it this way avoided the need for a tuned signal input to the third mixer, as all images were well out of band. But the third mixer was wideband, and used a special quality valve. 1.6 MHz (or thereabouts) was a commonly used HF receiver IF at the time, and fitted well with the objective.
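The "all images were well out of band" remark for the RA117's third conversion is easy to verify numerically; a quick sketch (values in MHz, figures as quoted above):

```python
# Image check for the RA117 third conversion: the 2-3 MHz second IF
# is converted to a 1.6 MHz third IF by a supradyne VFO, so the image
# response lies 2 x 1.6 MHz above the wanted frequency.

THIRD_IF = 1.6

def third_conversion(second_if):
    vfo = second_if + THIRD_IF          # supradyne: 3.6 .. 4.6 MHz
    image = second_if + 2 * THIRD_IF    # the receiver also responds here
    return vfo, image

for f in (2.0, 2.5, 3.0):
    vfo, image = third_conversion(f)
    # images land around 5.2 to 6.2 MHz, well above the 2-3 MHz
    # bandpass filter ahead of the third mixer
    print(f, vfo, image)
```

Which bears out the point that only a fixed bandpass filter, rather than tuned input circuits, was needed ahead of the third mixer.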

The RA217 (and its derivatives) were essentially a solid-state (discrete bipolar) implementation of the same basic architecture, except that 1.6 MHz was used as the final in-receiver IF, with (presumably fairly standard) crystal IF filters used for all IF bandwidths. Evidently the desired results could be obtained without a further conversion to 100 kHz. However, the receiver did include a 1.6 MHz-to-100 kHz (optionally to 455 kHz) frequency changer to feed an IF output to external devices that operated at these frequencies.

A further development was incorporated in the Racal RA919 receiver. Information about this is scarce, and I am not even sure that this model made it to production. This used the premixer approach for the second VFO. The output from the harmonic mixer was mixed with the output of an interpolating VFO, thus providing a second VFO that was variable over a 1.3 MHz range, with the result that the second IF was essentially a single frequency rather than a wide band. In this the first IF was 45 ± 0.65 MHz, the harmonic mixer output was 46.5 MHz (looks to have been rather close to the first IF range), mixed with the 5.55 to 6.45 MHz interpolation VFO output to provide a 40.5 ± 0.65 MHz feed to the second mixer for a 4.5 MHz second IF. The later IF or IFs are unknown, but I assume that there was at least one more conversion.

from RadCom 196801 p.32 Racal RA919

The inclusion of premixing for the second VFO in a Wadley loop was also used by Marconi in its Hydrus ISB receiver (which also had a second Wadley loop) and Eddystone in its EC958 receiver, in both cases apparently affecting the IF choices.

Cheers,

Steve P.

 
Posted : 25/12/2018 10:49 pm
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

As already noted, whereas a significant degree of standardization – with reasons therefor – is found with domestic radio receiver IFs, communications receivers, both amateur/domestic and professional, are by contrast characterized by quite a diversity of IFs, although some commonality can be found. The reasons behind the choices tend to remain obscure, although in some cases they can be derived by deduction, interpolation and extrapolation.

Recently I found a Philco paper of 1957 on SSB that does provide some background on the reasons for professional receiver IF choices. This paper is available at: http://www.navy-radio.com/manuals/ssb-93224.pdf. Note that this link goes direct to a nearly 30 MB .pdf. It is a very well-written paper. Apparently it was produced for the US Navy, an organization that was/is known for its excellent training materials.

Firstly, there is comment on the common choice of the 100 kHz final IF for professional point-to-point SSB/ISB receivers. A relatively low frequency for the carrier insertion oscillator (CIO) was desirable in the interest of making this as stable as possible. In receivers with comparator-type, motor-operated AFC systems, the CIO also served as the reference oscillator, underscoring the need for stability. Stated was that the 100 kHz frequency was well-chosen, as it was a standard frequency used in many types of test equipment and frequency-controlling devices. 100 kHz crystals were easily obtained, and oscillators could combine accuracy with ruggedness.

I doubt that 100 kHz was in and of itself a critical choice. Somewhere in the general vicinity would work, but there was no technical reason to do other than choose the round number that was already well-known. Perhaps one aspect that would limit how low the final IF could go was the previous IF from which it was converted. The Philco paper includes a worked example receiver with a 3.1 MHz first IF, and noted that conversion to 100 kHz in that case required the use of a balanced mixer given the closeness of the local oscillator frequency (3.0 MHz) and the incoming signal frequency (3.1 MHz).

Secondly, after dealing with purpose-built receivers, the discussion moved on to SSB adaptors for use with conventional communications receivers. At least in American practice, these typically took a feed from the final IF, usually around 455 kHz, of the associated communications receiver. They converted this incoming IF to a lower IF; this conversion allowed fine tuning to accommodate a range around 455 kHz, sideband inversion by switching between infradyne and supradyne, and the inclusion of an AFC loop within the adaptor. In respect of the adaptor IF, it was stated that the choice was limited principally by the availability of components, particularly the filters. Mentioned were 85 kHz, 100 kHz and 250 kHz, as well as the very low frequencies of 17 and 25 kHz.

Apparently 85 kHz was commonly used in low-frequency navigation and range receivers, with LC-based IF filters readily available. I am not sure, but I think that navigation receivers might have tuned down to 100 kHz in some cases. So 85 kHz might have been the highest practicable sub-100 kHz IF.

100 kHz was of course the norm for purpose-built professional SSB/ISB receivers, with a wide range of crystal IF filters readily available.

The 250 kHz case was popular because there was available a range of suitable mechanical IF filters. The 250 kHz number was mentioned upthread in connection with a Marconi aviation HF SSB receiver described in Wireless World 1958 October. This included the comment: “Ssb transmitters and receivers in ground stations usually employ filters using quartz crystal resonators to give the sharp selectivity required, and very high performance is obtained with these units. In order to reduce size, however, airborne sets will probably use electro-mechanical filters, which give adequate performance in a smaller volume.”

Given that mechanical filters were available for frequencies below 250 kHz, including 100 kHz, a concomitant question is: was 250 kHz about the lowest frequency at which small volume could be obtained, with lower frequencies requiring volumes that were not significantly smaller than comparable crystal filters? That, coupled with the allegedly poor group delay response of mechanical filters, might have explained the preference for crystal units at 100 kHz. Of course, there may have been other reasons why 250 kHz was chosen for aviation SSB receivers.

An example of a point-to-point SSB/ISB receiver with 250 kHz final IF was the TMC (Technical Materiel Corporation) DDR-5 diversity model. (See: http://www.virhistory.com/tmc/tmc_pages/tmc_manuals/manuals_db/ddr-5/tm_ddr-5l_5_20_65.pdf)

The very low frequencies of 17 and 25 kHz allowed the use of relatively simple L-C filters, although care was needed to avoid conflict with the AF band. Upthread it was noted that Racal used an 18 kHz final IF in its RA98 SSB/ISB adaptor for use with the 100 kHz IF output of its RA17 and RA117 receivers.

In the American case, 17 and 25 kHz would have been converted from 455 kHz. Neither had harmonics that were uncomfortably close to 455 kHz, or to the required conversion oscillator frequencies, whether infradyne or supradyne. That might have been a factor in their selection. 17 kHz might have been getting close to the lowest that was feasible in terms of converting from 455 kHz without running into problems, although proportionally it was no worse than converting from 3.1 MHz to 100 kHz. But it was also getting close to the AF band. If we assume a 6 kHz sideband possibility (as was the norm in point-to-point receivers), then the lower edge of that was at 11 kHz, perhaps about as low as one might want to go against an audio path with a steep 6 kHz low-pass filter.

An example of an SSB adaptor with the 17 kHz final IF was the TMC GSB-1 (see: http://www.virhistory.com/tmc/tmc_pages/tmc_manuals/manuals_db/gsb/ib_gsb-1_57_am.pdf)

With very low final IFs, the choice is clearly related to the preceding IF. In the Racal case, 17 kHz probably would not have worked, as its sixth harmonic is at 102 kHz, within the bandwidth of the incoming IF (94 to 106 kHz). But 18 kHz was fine, with harmonics at 90 and 108 kHz, and no harmonics near the oscillator frequency.
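The 17 vs. 18 kHz harmonic argument is easy to verify numerically; a quick sketch, using the 94 to 106 kHz incoming passband from above (the harmonic order cap is an arbitrary assumption for illustration):

```python
# Do harmonics of a candidate low final IF fall inside the incoming
# 94-106 kHz IF passband of the Racal RA98 adaptor?

def harmonics_in_band(final_if_khz, band=(94.0, 106.0), max_order=10):
    """List the harmonics (2nd upward) that land inside the passband."""
    return [n * final_if_khz for n in range(2, max_order + 1)
            if band[0] <= n * final_if_khz <= band[1]]

print(harmonics_in_band(17))  # [102] -> the 6th harmonic lands in-band
print(harmonics_in_band(18))  # []   -> 90 and 108 kHz both fall outside
```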

Thus in respect of radio receiver IFs, we have added some more numbers to the “what” side of the ledger, and a few more reasons to the “why” side, as well as establishing that 250 kHz can be viewed as a standard (de facto at least) IF number.

This thread has now been running for six years, and there is one thing that I have not found in my sporadic searches for additional information. (That sometimes occurs when I happen upon an idea when searching for something quite different.) The as yet unfound item is that someone, somewhere, preferably a professional with “inside” knowledge has already compiled and published a comprehensive and logical survey and analysis of the subject. Hope springs eternal!

Cheers,

 
Posted : 24/01/2019 1:04 am
Nuvistor
(@nuvistor)
Posts: 4594
Famed Member Registered
 
Posted by: Synchrodyne

This thread has now been running for six years, and there is one thing that I have not found in my sporadic searches for additional information. (That sometimes occurs when I happen upon an idea when searching for something quite different.) The as yet unfound item is that someone, somewhere, preferably a professional with “inside” knowledge has already compiled and published a comprehensive and logical survey and analysis of the subject. Hope springs eternal!

Cheers,

I think you have found that person: it's yourself. Your knowledge of the subject is vast.

Frank

 
Posted : 24/01/2019 1:46 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

Thanks for the vote of confidence, Frank. It certainly has been a voyage of discovery. One’s innate curiosity sometimes leads to strange places! Finding the “what” has generally not been too difficult; it’s the finding of the “why” that is the bigger challenge. On that, there appears to be more information available for the TV case than for the radio case. And once you find a couple or so TV cases well-explained in the literature, then you have the framework for analysis (at least at the back-of-the-envelope level) of the others. Overall it is somewhat like a jigsaw puzzle, in that once you have a cluster of pieces assembled on one sub-topic, you have a better idea of where to look for additional information.

I guess that in the abstract it is a rather arcane topic. On the other hand, superhet circuitry was rather significant in analogue-era radio and TV reception, and intermediate frequency was surely the most important parameter in the superhet principle.

Cheers,

Steve

 
Posted : 26/01/2019 1:15 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

Reverting to the domestic FM receiver IF case, the attached item, from “Electronics” magazine for 1946 January, is the earliest reference to the standard 10.7 MHz number that I have so far found.


from Electronics 194601 p.272 RMA IF Proposals FM & TV

The RMA was reported as having proposed that 10.7 MHz be the standard IF for VHF broadcast receivers. Soon thereafter – although as yet I have not determined exactly when – it was formalized as a standard.

The FCC had announced the reshuffle of VHF frequency assignments on 1945 June 27, at which time the FM band was slated to move from 42-50 MHz to 88-106 MHz. Thus the RMA had moved quickly to establish a suitable FM IF before any stations started using the new band. The cutoff date for inclusion in a 1946 January publication was probably 1945 late November or early December, so the relevant sub-committee was convened, and its work done within say five months.

Cheers.

Steve

 
Posted : 26/01/2019 1:17 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

A couple of posts back, I mentioned that the Philco SSB paper stated that the choice of 100 kHz as the final IF in point-to-point SSB receivers was in part due to the fact that during WWII, 100 kHz had emerged as a standard reference frequency with the result that high-stability quartz crystals were readily available. A simple search produced verification of the 100 kHz significance, a couple of examples attached.

Electronics 194511 p.416
Electronics 194511 p.308

Marconi made the following comment about 100 kHz in its description of its CRF150/20B-SSR2 combination point-to-point diversity SSB receiver: “The third frequency changer changes the 465 kc/s, the second intermediate frequency, to 100 kc/s, the third intermediate frequency. The reason for this change is that the quartz crystal filters required for carrier and sideband selection are more readily available provided at 100 kc/s.” This receiver was more-or-less a combination of a modified standard double-conversion communications receiver with a separate SSB adaptor, but with some degree of integration. The three IFs were 1.2 MHz and 465 kHz in the receiver, and 100 kHz in the SSB adaptor. The 1.2 MHz first IF was evidently chosen to be just below the lower edge of the tuning range, 1.5 to 30 MHz. The second IF of 465 kHz, which was the final IF of the basic receiver, was the then (pre-Copenhagen plan) standard AM receiver IF in the UK.

Thereafter Marconi used the 100 kHz final IF in its point-to-point SSB receivers until into the solid-state era, about which more later.

The GPO used a 100 kHz final IF for its own-design point-to-point SSB receiver of circa 1953, which had a tuning range of 4 to 30 MHz. In respect of its IF choices, it said: “The first i.f. of such a receiver is usually in the range 1-4 Mc/s; a frequency of 3.1 Mc/s has been adopted in this receiver and is identical with the second i.f. in the transmitter drive equipment. For the most effective and economical designs of sideband and carrier filters using quartz-crystal resonators, the range 50 kc/s-200 kc/s is preferred; a second i.f. of 100 kc/s is suitable and is identical with the first i.f. in the transmitter drive equipment.”

Here the GPO had recognized a 50-200 kHz range as being suitable for the final IF, and of course 100 kHz was at the geometric mean of that range.

One may suppose that the 3.1 MHz first IF was chosen as being a suitable number below the 4 MHz lower limit of the tuning range, and one that produced the 100 kHz final IF with a 3 MHz second oscillator. Whilst in radio applications the choice between infradyne and supradyne operation was more of an individual design consideration than in TV practice, where one or the other was usually mandated, the GPO did follow the HF SSB convention in this regard. Below 10 MHz signal frequency, the first conversion was supradyne, resulting in sideband inversion such that the “A” channel (in an ISB system) appeared as the lower sideband. Above 10 MHz, infradyne first conversion was used, so there was no sideband inversion and the “A” channel appeared as the upper sideband. To avoid further inversion, the second conversion was infradyne. One may speculate that the convention was developed at a time when oscillator stability was at a premium, particularly at the higher end of the HF range, so that infradyne operation at this end of the band, say above 10 MHz, was strongly preferred. On the other hand, at the lower end of the HF band and downwards into the MF band, infradyne operation became more difficult. For example, consider a receiver whose lowest tuning range was 2 to 4 MHz and which had a first IF of 1.6 MHz. With infradyne operation, that 2-to-1 tuning range would require a local oscillator swing of 0.4 to 2.4 MHz, a range of 6-to-1, probably very difficult to do in the days of variable capacitance tuning. With supradyne operation, the local oscillator swing would be 3.6 to 5.6 MHz, a range of about 1.6-to-1, quite reasonable.
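The swing-ratio comparison at the end of that paragraph can be put into a few lines of Python (a hypothetical helper for illustration, not from any of the cited equipment):

```python
# Local-oscillator swing needed to tune a band with a given first IF,
# for the infradyne (LO below signal) and supradyne (LO above signal)
# conversion senses. Values in MHz.

def lo_swing(f_low, f_high, f_if, supradyne):
    """Return (lo_low, lo_high, swing_ratio) for the chosen sense."""
    if supradyne:
        lo_low, lo_high = f_low + f_if, f_high + f_if
    else:
        lo_low, lo_high = f_low - f_if, f_high - f_if
    return lo_low, lo_high, lo_high / lo_low

# The 2-4 MHz band with a 1.6 MHz first IF, as in the example above:
print(lo_swing(2.0, 4.0, 1.6, supradyne=False))  # roughly (0.4, 2.4, 6.0)
print(lo_swing(2.0, 4.0, 1.6, supradyne=True))   # roughly (3.6, 5.6, 1.56)
```

A 6-to-1 oscillator swing was well beyond what a normal tuning capacitor could deliver, which is the nub of the argument for supradyne at the bottom of the band.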

Once established as a point-to-point SSB norm, the 100 kHz final IF then found its way into general purpose communications receiver practice. Presumably this reflected the ready availability of narrow-bandwidth crystal filters. Eventually, starting from circa 1970, 100 kHz was gradually displaced by 1.4 MHz, as previously discussed.

Cheers,

Steve

 
Posted : 26/01/2019 5:23 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

The use of very low final IFs, say below 50 kHz, for SSB receivers, appears to have had an economic element, as shown in these items referring to the Crosby Type 76 SSB Adaptor.

Electronics 195402 p.317
Tele Tech 195312 p.154

This was billed as a lower cost, lighter, more compact unit using toroidal coils rather than crystals and L-C circuits. The advertisement mentioned a 25 kc carrier filter, from which one may determine that the final IF was 25 kHz. This was one of two (17 and 25 kHz) very low final IFs noted in the Philco paper.

Crosby had used a 100 kHz final IF, with crystal filters, in its “full-sized” SSB adaptors, such as those fitted to the Model 155 triple diversity receiver, which was described in Communication Engineering (formerly FM-TV) 1953 July-August, p.29ff. (Available at: https://www.americanradiohistory.com/FM-Magazine-Guide.htm).

Electronics 195305 p.324 Crosby 155 Triple Diversity SSB Receiver
Electronics 195305 p.325 Crosby 155 Triple Diversity SSB Receiver

In that case one might say that Crosby had stepped into line with prevailing industry practice. Previously it appears to have used a 200 kHz final IF for its exalted-carrier adaptors, such as the Type ECC units used in the TMC TDRS triple diversity receiver. These required a very narrow carrier extraction filter, evidently quite doable at 200 kHz, but they did not require very sharp cutoff sideband filters, nor a very stable carrier oscillator. One assumes that when it entered the SSB market, Crosby saw 100 kHz as a better final IF choice, probably in part because of existing crystal filter availability.

Cheers,

Steve

 
Posted : 26/01/2019 5:34 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

Some more on the origins of the 455 kHz IF number that became the standard in the USA, and was widely used, and in some cases standardized, elsewhere.

This item from Radio News 1937 December records that the FCC had advised the RMA that it would endeavour to avoid making allocations in the 450 to 460 kHz band in order that 455 kHz could be established as a protected intermediate frequency.

from Radio News 193712 p.323 FCC and 455 kHz IF

Soon after this, it would appear that the RMA declared 455 kHz to be a recommended standard IF.

So far I have found very little mention of 455 kHz exactly before late 1937. The attached receiver IF list from Radio Craft 1932 December shows signs of an upward move to the 450 kHz or thereabouts region, from the earlier lower numbers of which 175 kHz was probably the modal choice. 456 kHz was in use then, and not really materially different from 455 kHz.

Radio Craft 193212 p.344
Radio Craft 193212 p.345

Possibly the RMA had requested 456 kHz or another nearby number, such as 465 kHz, and the FCC, after looking at its existing allocations, saw the channel centred on 455 kHz as being the best choice for a protected IF.

With 10 kHz channel spacing, and the channels as multiples of 10 kHz, having the IF as an odd multiple of 5 kHz reduced the possibility of IF beat interference from stations spaced by the IF. Image interference from stations spaced by twice the IF was still possible. But this could affect only the five lowest and five highest channels at the time, when the band (in terms of carrier frequencies) ran from 550 to 1500 kHz. Moving up to 485 kHz would have avoided this, but a band expansion was then in planning, at that stage thought to be to 520 to 1600 kHz. (In the end it came out as 540 to 1600 kHz.) Avoiding image interference in that case would have required an IF that was within the lower end of the band, clearly impracticable.
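The "five lowest and five highest channels" point can be checked with a few lines of Python; a quick sketch, assuming a supradyne local oscillator so that the image lies 2 x IF above the wanted signal:

```python
# Count the channels exposed to in-band image interference with a
# 455 kHz IF: the image lies 2 x 455 = 910 kHz above the wanted
# carrier, so only carriers whose image still falls inside the
# 550-1500 kHz band (as it then was) are affected.

def image_exposed_channels(if_khz, band=(550, 1500), step=10):
    spacing = 2 * if_khz
    channels = range(band[0], band[1] + 1, step)
    return [f for f in channels if band[0] <= f + spacing <= band[1]]

victims = image_exposed_channels(455)
print(len(victims), victims)  # 5 [550, 560, 570, 580, 590]
```

The images of those five lowest channels (1460 to 1500 kHz) are exactly the five highest channels, which is the symmetry the post describes.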

Radio Craft 193701 p.429 BC Band Planned Expansion

Having the IF as an odd multiple of 2.5 kHz might have helped. With say a 457.5 kHz IF, the image would be 915 kHz away, so midway between channels where it fell in-band. But perhaps an odd multiple of 2.5 kHz would have made it difficult to assign a clear IF channel. Presumably once 455 kHz was chosen, the FCC thenceforth endeavoured to avoid geographic AM station channel assignments that might lead to image interference.

One may posit the basic requirements as being that the chosen IF be adequately below the lowest AM broadcasting channel so that receiver difficulties such as regeneration in the mixer stage were avoided, and also clear of the marine activity around 500 kHz, as well as being in an available clear space. That pretty much meant that it had to be somewhat below 500 kHz.

Cheers,

Steve

 
Posted : 28/01/2019 3:58 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

In an earlier posting, I had noted that Radio News magazine for 1937 December had recorded that the FCC had advised the RMA that it would endeavour to avoid making allocations in the 450 to 460 kHz band in order that 455 kHz could be established as a protected intermediate frequency.
 
I have since found some information on the preceding RMA request that resulted in this FCC pronouncement.  This was recorded in Radio Engineering magazine for 1937 February:
 

from Radio Engineering 193702 p.24 RMA 455 kHz IF

 

The RMA request for the FCC assignment of 455 kHz as a protected intermediate frequency was made in 1937 January, following a study (with which the FCC was involved) in which it was found to be the best possible choice.
Thus we can say with some certainty that 1937 was the year in which 455 kHz became the standard AM radio receiver IF in the USA.  I still haven’t found exactly when RMA issued it as a standard, but late 1937 or early 1938 would seem to have been likely.
 
Turning to FM, this comment is from an article in Communications magazine for 1945 October about crystal-controlled receivers for AM, FM and TV:
 
“With the recent announcement by RMA of the approved f-m i-f of 10.7 mc and the unanimous proposal of 22.25 mc for the i-f sound channel of television receivers, it is now possible to analyze the quantitative design features of the oscillator sections of these receivers.”
 

Communications 194510 p.70

 
That means that the FM standard IF of 10.7 MHz was developed and promulgated by the RMA very soon after the FCC announced that FM would be moving to the 88-108 MHz band from its previous 42-50 MHz assignment.  To be mentioned in a 1945 October magazine, the RMA probably announced no later than 1945 September.  Anyway, we now have 1945, probably the third quarter, as when the 10.7 MHz FM IF was established.
 
(The TV case turned out slightly differently, with the sound IF specified as a range, 21.25 to 21.9 MHz, rather than 22.25 MHz.)

Cheers,
 
Steve

Mod Note: Fixed formatting

 
Posted : 16/08/2019 6:48 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

Mod Note: Fixed formatting

Thanks for that. I knew it was wrong, but I couldn’t figure out either what I had done wrong or how to fix it.


Mod Note: When text is pasted in from another source, the forum's editor will retain the application's prior formatting. To avoid problems, prior to pasting into the editor the user should select the "T" icon, which is located second to last on the far right of the editor's toolbar. This puts the editor into text mode, and any body of text now pasted in will have prior application formatting removed. This will 100% guarantee that a body of text pasted in from another source will be presented correctly.

 
Posted : 19/08/2019 11:56 pm
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

The 455 kHz standard AM IF had had some use as such before it was standardized by the RMA, but proximate numbers, such as 456 and 465 kHz, went back to the early 1930s at least.

There was also 260 kHz, which the RTMA (successor to the RMA) had standardized as an alternative AM IF by 1950, according to this comment by Langford-Smith:

Langford Smith p.1361 Standard Intermediate Frequencies

IF numbers such as 260 and 262 kHz also went back to the early 1930s or so. Apparently these were seen as a compromise between the earlier 175 kHz and like numbers, and the newer 456, 465 kHz, etc., with which good IF selectivity was harder to obtain. Given the information available in respect of the 455 kHz standardization activities, I suspect that 260 kHz was standardized somewhat later, perhaps following lobbying by some of the setmakers. As yet though I have found no information as to the background to its recognition by RTMA, or as to whether it had any recognition by the FCC. Be that as it may, it was certainly unusual in the standard IF world for a lower number to follow a higher one.

Another twist was the addition of 262.5 kHz as a standard car radio IF. This number also originated in the early 1930s as part of the 260, 262 kHz cluster. It was presaged in Tele-Tech magazine for 1955 July, and recorded in the book “ITT Reference Data for Radio Engineers”, 4th edition, 1956.

Tele Tech 195507 p.54 RETMA 262.5 kHz IF Proposal
ITT Reference Data Radio Engineers 4th 1956 p.106 Intermediate Frequencies

 

I haven’t yet figured it out, but presumably the rather precise 262.5 kHz number was chosen to minimize interference effects in car radios – which had to deal with widely varying signal strengths and with receiving weak signals in the presence of strong in-band signals – and was better than say 260 kHz for this purpose.

An unusual application for 260 kHz was in the test receiver used for the Belar system during the late 1970s/early 1980s AM stereo trials in the USA. That was double conversion, with a 10.7 MHz 1st IF and a 260 kHz 2nd IF. (Belar had proposed what was essentially the 1959 RCA AM stereo system, but dropped out of contention quite early on.)

Cheers,

Steve

 
Posted : 20/08/2019 12:02 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

Back-of-the-envelope calculation indicates that with a 262.5 kHz IF, only the 4th harmonic, 1050 kHz, falls upon an American MW carrier frequency. All others are either 2.5 kHz or 5 kHz from a carrier. Also, all local oscillator frequencies that fall in-band are spaced 2.5 kHz from a carrier.

In contrast, with 260 kHz, all in-band harmonics fall upon carrier frequencies, as do all in-band local oscillator frequencies.

Thus if an IF around 260 kHz is to be used for any reason, e.g. easier to obtain good adjacent-channel selectivity, then 262.5 kHz is better than 260 kHz.
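The back-of-the-envelope figures above are easy to reproduce. A minimal sketch (Python, assuming the US MW channel plan of 540–1600 kHz carriers on 10 kHz spacing):

```python
# Harmonic spur check for candidate AM IFs against US MW carriers
# (assumed 540-1600 kHz, 10 kHz channel spacing).

CARRIERS = list(range(540, 1601, 10))  # kHz

def offset_from_nearest_carrier(f_khz):
    """Distance in kHz from f_khz to the nearest MW carrier."""
    return min(abs(f_khz - c) for c in CARRIERS)

def in_band_harmonics(if_khz):
    """(harmonic, offset-from-carrier) pairs for IF harmonics in the MW band."""
    return [(n * if_khz, offset_from_nearest_carrier(n * if_khz))
            for n in range(2, 8)
            if CARRIERS[0] <= n * if_khz <= CARRIERS[-1]]

# 260 kHz: every in-band harmonic lands exactly on a carrier.
print(in_band_harmonics(260))
# 262.5 kHz: only the 4th harmonic (1050 kHz) does; the rest are offset.
print(in_band_harmonics(262.5))
```

Running this confirms the claim: with 262.5 kHz only 1050 kHz coincides with a carrier (787.5, 1312.5 and 1575 kHz are 2.5 or 5 kHz off), whereas with 260 kHz all four in-band harmonics (780, 1040, 1300, 1560 kHz) fall exactly on carriers. The same `offset_from_nearest_carrier` test applied to local-oscillator frequencies (carrier + IF) shows the 2.5 kHz spacing mentioned above.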

Clearly, around 260 kHz is less satisfactory in respect of image rejection than 455 kHz, but car radios I think mostly had RF amplifiers, which provided some amelioration. Apparently one of the original reasons for going to a 455 kHz or thereabouts IF (from around 175 kHz) was to obtain adequate image rejection without the need for a tuned RF amplifier, leaving just one tuned RF filter.

Cheers,

Steve

 
Posted : 20/08/2019 12:24 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

In terms of American AM radio receiver IF standardization, the RMA/RTMA/RETMA sequence appears to have been 455 kHz alone, then 455 and 260 kHz together, and then the threesome of 455, 260 and 262.5 kHz. For FM, 4.3 MHz came first and was then superseded by 10.7 MHz, although there might have been a short overlap period.

It does seem possible, even probable, that the 10.7 MHz FM standard IF was declared by the RMA before it was actually used by any setmaker. Be that as it may, its early development appears to have headed off the significant use of alternatives, at least within the USA. Zenith used 8.3 MHz for a short while, Philco used 9.1 MHz for somewhat longer, Motorola continued to use the previous standard 4.3 MHz for a while, and there may have been one or two others, but these all disappeared quite early on. American TV-FM receivers, a relatively short-lived species, generally would have used the first standard TV sound IF, 21.25 to 21.9 MHz, for FM.

In the UK case, some of the non-standard FM IFs have already been mentioned, such as 9.72 MHz by HMV, 14.1 MHz by Fitton (Ambassador), 19.5 MHz by Bush and 12.5 MHz by Leak. One may add to that list 8.2 MHz, which was specified by Spencer (BBC) for the simple FM receiver constructional project published in WW 1951 November and December. Lowther used 9.2 MHz for its first FM tuner in 1953, whose circuit looks to have been derived from the Spencer design. Possibly the earliest use of 10.7 MHz by a UK manufacturer was in the Mullard GFR520 of 1949 (whose details align closely, perhaps suspiciously closely with those of the Hallicrafters SX-42 of 1946).

from WE 194911 p.374 Mullard GFR520

Cheers,

Steve

 
Posted : 22/08/2019 3:53 am
Terrykc
(@terrykc)
Posts: 4005
Member Rest in Peace
 
Posted by: @synchrodyne

Back-of-the-envelope calculation indicates that with a 262.5 kHz IF, only the 4th harmonic, 1050 kHz, falls upon an American MW carrier frequency. All others are either 2.5 kHz or 5 kHz from a carrier. Also, all local oscillator frequencies that fall in-band are spaced 2.5 kHz from a carrier.

Thus if an IF around 260 kHz is to be used for any reason, e.g. easier to obtain good adjacent-channel selectivity, then 262.5 kHz is better than 260 kHz.

Steve, when the designed-for-the-US-market Far Eastern 'vest pocket' radios flooded the UK in the 1960s with their US 455kHz IF, the second IF harmonic of 910kHz was only 2kHz from the then frequency of 908kHz of the BBC's Brookmans Park transmitter, used to broadcast the Home Service to London and the surrounding area. The 2kHz beat was very unpleasant, although the saving grace was that the Home Service didn't really appeal to the kids who, in the main, were the proud owners of these sets!

I can't see that a 2.5kHz beat with a US station would have been any improvement, so it seems to me that neither 260kHz nor 262.5kHz was a very wise choice for the US.

When all else fails, read the instructions

 
Posted : 22/08/2019 10:37 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

I imagine that receiver designers using 260 or 262.5 kHz IFs would need to take special care to prevent IF harmonics from getting back into the signal chain. Insofar as tuned RF amplifiers were more likely to be used with these IFs to obtain reasonable image rejection, that also could have helped.

It would be interesting to see the RTMA/RETMA engineering support cases for the inclusion of the 260 and 262.5 kHz IFs in its standards. The RMA had previously selected 455 kHz as the best available for AM MW reception with 10 kHz channel spacing. Thus one could infer that 260 and 262.5 kHz were not as good in a general sense, but that there were specific circumstances in which they could be used, perhaps even to advantage; also that there were factors in the automotive receiver case that pointed to 262.5 kHz rather than 260 kHz. Some receiver makers may have had their own reasons for wanting these standardized, but I imagine that they must have been supported by reasonable engineering cases to be accepted.

Anyway, for now, it appears to be something of an enigma as to why these two IFs were standardized, and somewhat after 455 kHz was adopted.

One could also ask, why 260 kHz and not say 250 or 270 kHz? Possibly the objective was to select the highest “round” number whose second harmonic fell below the low end of the MW band, hence 260 kHz. The offset to 262.5 kHz for the automotive case may have had more arcane reasoning.

The 175 kHz IF found in earlier American practice was evidently chosen so that its 3rd harmonic fell below the low end of the MW band.
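The "round number" conjecture above checks out numerically. A sketch, assuming a 540 kHz low band edge and candidate IFs in 10 kHz steps:

```python
# Highest "round" IF whose second harmonic still falls below the MW band
# (low edge assumed at 540 kHz), per the conjecture above.

BAND_LOW_KHZ = 540
candidates = range(100, 400, 10)  # "round" numbers, 10 kHz steps

best = max(f for f in candidates if 2 * f < BAND_LOW_KHZ)
print(best)     # 260: 2 x 260 = 520 kHz, while 2 x 270 = 540 kHz, the band edge
print(3 * 175)  # 525: the older 175 kHz IF kept its 3rd harmonic below 540 kHz
```

So 260 kHz is indeed the highest multiple of 10 kHz whose second harmonic clears the band, just as 175 kHz was chosen with its third harmonic in mind.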

Cheers,

Steve

 
Posted : 26/08/2019 10:50 pm
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

Returning to the FM case, prior to the adoption by the RMA and the US industry of 10.7 MHz as the standard FM receiver IF, the 4.3 MHz number had been standardized by the RMA, as shown in Radio Craft 1941 June:

Radio Craft 194106 p.750 RMA 4.3 MHz FM IF

Exactly when this happened is unknown. The initial US band for regular (i.e. not experimental) FM broadcasting, 42 to 50 MHz, was established in mid-1940, so it seems likely that the RMA had completed its deliberations before the end of 1940.

The choice of 4.3 MHz placed the IF adequately above the top end of the 3.5 to 4.0 MHz amateur band, so that interference from that source should not have been a problem. And it ensured that all images were outside of the FM band.
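The image claim is simple arithmetic: with the oscillator on the high side, the image lies 2 × IF above the wanted signal. A sketch (Python, assuming supradyne operation):

```python
# With a high-side (supradyne) oscillator, the image of a wanted signal
# lies 2 x IF above it; with IF = 4.3 MHz, every image of a 42-50 MHz
# signal lands above 50 MHz, i.e. outside the original FM band.

IF_MHZ = 4.3
BAND_LOW, BAND_HIGH = 42.0, 50.0

def image(signal_mhz):
    """Image frequency for high-side oscillator operation."""
    return signal_mhz + 2 * IF_MHZ

assert all(image(f) > BAND_HIGH for f in (42.0, 46.0, 50.0))
print(image(42.0))  # 50.6 MHz: the worst case, just clear of the band edge
```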

Cheers,

Steve

 
Posted : 02/09/2019 3:57 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

As recorded above, the American standard IF for FM receivers moved from 4.3 MHz for the original 42-50 MHz band to 10.7 MHz for the later 88-108 MHz band.

Nonetheless, towards the end of 1944, a proposal was made that the standard be moved from 4.3 to 8.25 MHz, in anticipation that the FCC would upwardly extend the existing FM band to somewhere beyond 50 MHz. In the event, the FCC’s mid-1945 decision was to move FM to the 88-108 MHz band.

The proposal was made by W.H. Parker of Stromberg-Carlson in a 1944 December IRE Proceedings article, the first page of which is attached.

IRE Proceedings 194412 p.751 Parker FM IF

The 8.25 MHz number was chosen because, at the time, it was the customary value for the American TV receiver sound IF. Although not stated by Parker, this would have allowed for an upward FM band extension to, say, 58 MHz, on the basis that the IF should be somewhat greater than half the width of the band of interest, as had been the case with the 4.3 MHz choice. Of course, following the FCC mid-1945 VHF reallocations, the standard TV IF was moved upwards, to a range of 21.25 to 21.9 MHz.
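The rule invoked here – images stay out of band whenever 2 × IF exceeds the band width – can be sketched as:

```python
# Minimum IF that keeps all images outside a band: the image offset is
# 2 x IF, so it clears the band once 2 x IF > (high - low).

def min_if_mhz(low_mhz, high_mhz):
    """Smallest IF (MHz) keeping every image out of the given band."""
    return (high_mhz - low_mhz) / 2

print(min_if_mhz(42, 50))  # 4.0 -> the 4.3 MHz standard satisfied it
print(min_if_mhz(42, 58))  # 8.0 -> 8.25 MHz would have covered a 58 MHz extension
```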

Although events effectively made the 8.25 MHz proposal redundant, it could be that it had some influence. Zenith used an 8.3 MHz IF on its early post-WWII receivers with dual-band FM coverage, before moving to the standard 10.7 MHz for receivers that covered only the 88-108 MHz FM band. 8.3 MHz might have been 8.25 MHz rounded upwards, on the basis that two digits of precision were sufficient and that alignment with what was by then an obsolete TV IF standard was no longer needed.

One may wonder, too, whether the 8.2 MHz IF choice for the Spencer (BBC) simple FM receiver published in Wireless World (WW) 1951 November and December was derived from the Parker 8.25 MHz proposal, in this case rounded downwards. In the interest of obtaining adequate IF gain, keeping the IF as low as possible was desirable for a receiver that had just one IF gain stage ahead of the limiter and discriminator. Thus an IF lower than the by-then established 10.7 MHz was indicated; 8.2 MHz could have been derived from first principles or from the Parker proposal. The tuning range was 87.5 to 95 MHz, with an aperiodic front end. The Spencer design also appears to have been the basis for the first Lowther FM tuner of 1953. Here the tuning range was extended to 85 to 100 MHz, and the IF was moved up to 9.2 MHz.

Still in the UK, in an editorial in its 1955 February edition, the journal Wireless Engineer (WE) argued for a standard FM IF of the order of 20 MHz in order to avoid interference with TV receivers. It also noted that where a “lowish” IF (i.e. 10.7 MHz) was used, it should be with the oscillator on the low side (infradyne).

WE 195502 p.33
WE 195502 p.34

In the event, BREMA chose 10.7 MHz, with an infradyne recommendation, although the latter was not always followed. WE recorded that just one manufacturer was using 19.5 MHz, clearly referring to the Bush case, as discussed upthread.

WE’s concerns about the 10.7 MHz FM IF and TV interference would not have been unique to the UK, but would have applied to much of the world. Evidently the problem turned out to be less serious in practice than had been expected.

Cheers,

Steve

 
Posted : 06/11/2019 11:48 pm
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 
 

Apparently the use of the then common TV sound IF of 8.25 MHz for FM receivers had been mooted earlier than 1944, being mentioned in a 1941 IRE article by RCA staffers Foster and Rankin.  The article is available here:  https://www.americanradiohistory.com/IRE_Proceedings.htm.
 
The authors reviewed the IF requirements for the 42 – 50 MHz FM band from first principles, noting that an IF above 4 MHz would eliminate in-band images, an IF above 8 MHz would eliminate interference from in-band stations separated by IF, and an IF above 16 MHz would eliminate interference from in-band stations separated by half of the IF.  The last-mentioned was not considered to be a significant issue at the time.  However, it surfaced again in the 1970s and 1980s.  As I recall, it was a parameter that the late Gordon J. King used to measure in his FM tuner reviews of the period.
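All three Foster–Rankin thresholds follow directly from the 8 MHz width of the 42–50 MHz band; a sketch of that arithmetic:

```python
# Foster-Rankin IF thresholds for the 42-50 MHz FM band, each derived
# from the band width: the image offset is 2 x IF, direct-IF interference
# needs two in-band stations IF apart, half-IF interference IF/2 apart.

WIDTH_MHZ = 50 - 42  # 8 MHz

min_if_no_images = WIDTH_MHZ / 2  # 4 MHz: 2 x IF exceeds the band width
min_if_no_direct = WIDTH_MHZ      # 8 MHz: no two in-band stations IF apart
min_if_no_half = 2 * WIDTH_MHZ    # 16 MHz: none IF/2 apart either

print(min_if_no_images, min_if_no_direct, min_if_no_half)  # 4.0 8 16
```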
 
Of the then-standard 4.3 MHz FM IF, it was said:
 
“If image response from other frequency-modulation stations is to be eliminated an intermediate frequency of at least 4 megacycles has been seen to be necessary.  To minimize direct intermediate-frequency interference, a frequency range of some 200 kilocycles free from types of services likely to cause such interference is required.  The lowest such range above 4 megacycles is at 4.3 megacycles to which are allocated government, general communication, and coastal-harbor transmitters.  Another intermediate frequency in that general range which appears relatively free from direct intermediate-frequency interference is at 5.38 megacycles, where are allocated government, fixed, and general communication channels.”
 
The 8.25 MHz possibility was addressed thus:
 
“A frequency of 8.25 megacycles has been widely used as the intermediate frequency for the sound channel of television receivers, which suggests its use for frequency modulation in order to minimize component types.  Examination of frequency allocations shows that 8.26 megacycles is somewhat better than 8.25 megacycles over a 200-kilocycle width.  And that to 8.26 megacycles are allocated ship-telegraph and government services, neither likely to cause interference.  On the score of image, however, with the oscillator higher, both the 5-meter amateur and the second television bands fall within the image range.  With the oscillator lower, we find 10-meter amateur, police, and government frequencies.  It would appear therefore that this frequency, particularly with the oscillator lower, will be satisfactory in receivers having enough selectivity preceding the converter to insure good image ratio.”
 
Interesting though was a proposal for an even higher IF for use in lower-priced receivers with relatively limited front-end selectivity:
 
“In receivers in the lower-price classes, where selectivity ahead of the converter is not high, it would seem desirable to operate with the oscillator lower than the tune frequency to avoid the television channels, and to use an intermediate frequency that would not permit interference from the amateur bands.  This requires an intermediate frequency of over 11 megacycles.  At 11.45 megacycles we find government, fixed, and aviation allocations, and for the image, fixed, government, and broadcast stations, so that this frequency is comparatively free from likelihood of spurious responses. 
 
“It is not likely that frequencies appreciably higher than about 14 or 15 megacycles will be useful, because of decreasing stability and gain limitation due to tube and circuit capacitances.”
 
 
An interesting double-conversion approach was used by Motorola in the late 1940s for receivers for the then-new 88 - 108 MHz FM band.  This allowed retention of the established 4.3 MHz number as the second IF whilst providing an adequately high 1st IF to avoid spurious responses.
 
The local oscillator operated at half the signal frequency less half the second IF.  Thus it ran from 41.85 to 51.85 MHz for a signal range of 88 to 108 MHz.  This produced a variable 1st IF of 46.15 to 56.15 MHz, which may be noted as being 4.3 MHz above the oscillator frequency for all input frequencies.  This 1st IF was tuned at the input to the 2nd mixer, providing adequate pre-mixer selectivity for the 4.3 MHz 2nd IF.  The same oscillator as used for the 1st conversion was also used for the second conversion, giving the 4.3 MHz 2nd IF.
 
I think that this scheme might have been developed for receivers that covered both the old and new FM bands, of which there were some in the immediate post-WWII period, and then carried over to single-band receivers.  For the 42 – 50 MHz band, single conversion with an oscillator swing of either 37.7 to 45.7 or 46.3 to 54.3 MHz would have produced the 4.3 MHz IF.  The 46.3 – 54.3 MHz range in particular largely overlaps with the 41.85 to 51.85 MHz range required for the 88 – 108 MHz band.  Nonetheless, Motorola did migrate to the standard 10.7 MHz IF by c.1950.
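The Motorola arithmetic described above checks out numerically; a sketch:

```python
# Motorola double-conversion scheme for 88-108 MHz FM: one oscillator at
# LO = f/2 - IF2/2 gives a variable first IF of f - LO = LO + IF2, so the
# same oscillator then converts the first IF down to the fixed 4.3 MHz
# second IF.

IF2 = 4.3  # MHz, the retained 4.3 MHz second IF

def lo(f_mhz):
    """Local oscillator frequency for signal frequency f_mhz."""
    return f_mhz / 2 - IF2 / 2

def if1(f_mhz):
    """Variable first IF: signal minus oscillator."""
    return f_mhz - lo(f_mhz)

for f in (88.0, 98.0, 108.0):
    assert abs((if1(f) - lo(f)) - IF2) < 1e-9  # 2nd IF fixed at 4.3 MHz

print(lo(88.0), if1(88.0))    # 41.85 and 46.15 MHz, as quoted above
print(lo(108.0), if1(108.0))  # 51.85 and 56.15 MHz
```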

cheers,
 
Steve

Moderator Note:

The formatting of this post has been fixed. Remember, as we've discussed before: either use the "paste as text" button on the editor toolbar prior to pasting (doing so will remove the formatting retained from the application where you created or copied the body of text), or copy the body of text to Notepad first, then from there paste into the forum editor without using the aforementioned "paste as text" editor button.

 

 
Posted : 13/12/2019 12:06 am
Synchrodyne
(@synchrodyne)
Posts: 519
Honorable Member Registered
Topic starter
 

As previously noted, there was quite a bit of diversity in IF choice for HF receivers. Standard and quasi-standard numbers were used, but there were also ad hoc selections, typically situational and probably according to individual designer preference.

By way of illustration, during the 1950s, Marconi had a three-tier range of professional point-to-point SSB/ISB receivers. At the top, what it called Group 1, was the HR92/HR93, which occupied a full-height rack. Next, in Group 2, was the HR21, another full-height rack unit. Then at the bottom, in Group 3, was the HR22, a simple single-box receiver. All were double-conversion, with tuning ranges and IFs as follows:

HR92/HR93 3 to 27.5 MHz, IFs 1.6 MHz and 100 kHz
HR21 3 to 27.5 MHz, IFs 2.6 MHz and 100 kHz
HR22 2 to 32 MHz, IFs 1.6 MHz and 100 kHz

1.6 MHz and 100 kHz were commonplace IFs for this kind of receiver. Against that, the 2.6 MHz 1st IF of the HR21 looks to be an “odd man out”, perhaps just a quirk, particularly as in general, the HR21 followed the same pattern as the HR92/HR93. But in fact there was a reason for it.

The HR92/HR93 was designed for continuous operation on heavy-traffic service without change of frequency. Consequently, its RF circuits were individually tuned. Retuning was thus a service operation, using an inbuilt oscillator, and not an operator function. This way, the RF circuits could be individually peaked for the chosen frequency. On the other hand, the HR21 was intended for intermittent duties, where a relatively quick change of frequency could be required. Thus it had ganged RF circuits, operator tuned. As Marconi put it, “The first I.F. amplifier is centred at 2.6Mc/s so as to maintain a good image signal protection in spite of any small ganging inaccuracies which might exist.” That was certainly not a rationale that would readily come to mind.

With the HR22, designed to a lesser performance standard, evidently the image rejection with its 1.6 MHz 1st IF was adequate for purpose, and anyway, the fact that it tuned down to 2.0 MHz precluded anything materially higher.

In the mid-1960s, the Marconi MST (self-tuning) receivers, solid-state, superseded the HR92/HR93 and probably took over some HR21 applications, as well. The MST receivers tuned 2.5 to 27.5 MHz, and were double conversion with IFs of 2 MHz and 100 kHz. Given the tuning range, 2 MHz was close to the upper limit for 1st IF. As the self-tuning system implied tracking ganged RF circuits, then given the HR21 case, one might reasonably conclude that Marconi set the 1st IF as high as reasonably possible in order to maximize image rejection. But there was in fact a different reason for that choice. Apparently, the design of the self-tuning system, driven from the synthesizer settings, was made easier if the 1st IF was an integer number of MHz. One may then wonder if the 2.5 MHz lower coverage limit was more of a consequence than an initial condition. Marconi did consider the upconversion approach, but one reason for not using it was that it would have required a special synthesizer, whereas I think that the MST receivers used the same synthesizers as had been developed for the corresponding MST transmitters.

At the lower performance end, in 1968 Marconi introduced the Hydrus solid-state SSB/ISB receiver. This would have covered the HR22 and lower-end HR21 applications. Not depending upon an existing synthesizer, this was of the Wadley Loop upconversion type, triple-conversion with IFs of 40 MHz, 5 MHz, and 100 kHz. This would have predated the arrival of 1.4 MHz as an SSB processing final IF, hence the use of the established 100 kHz number. The 40 MHz 1st IF followed the Racal RA17 precedent. The 5 MHz 2nd IF can be explained by the fact that there was a second Wadley Loop, with pre-mixer, involved in the generation of the feed to the second mixer. With a 40 MHz 1st IF, 37.5 MHz was about the highest practicable feed frequency that could be used for the second mixer, meaning a mean 2nd IF of 2.5 MHz. But if the 37.5 MHz was to be further mixed in the second Wadley Loop, then the output would necessarily be lower than that, in this case nominally 35 MHz. The second loop was quite complex, involving 37.5 and 3.75 MHz feeds from harmonic generators and a VFO.

Cheers,

Steve

 
Posted : 09/06/2021 12:21 am