
[Sticky] Television Receiver Intermediate Frequencies

Page 5 / 6
 
Till Eulenspiegel
(@till)
Famed V-Ratter Registered

Hi Frank,

The Stella ST8617U and Philips 1768U were those sets that had the control panel on the left side of the cabinet, going against the convention of siting the controls to the right of the CRT. The sets had a fine tuner. Coloured markers could be attached to the turret tuner knob for easy station identification.

Actually it was a common practice to off-set the sound IF when a TV set was to be used to receive very weak signals. Of course the trade-off was a loss of picture definition but the useful increase of gain and reduction of noise made the modification worthwhile.

The Stella ST8617U was a very reliable set, the only weakness was the line output transformer which was a plug-in component. Easy fix on house calls.

Till Eulenspiegel.

Posted : 29/11/2018 11:24 am
Nuvistor
(@nuvistor)
Famed V-Ratter Registered

We were lucky, very few locations with poor signal strength.

 

Posted : 29/11/2018 8:53 pm
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

In respect of the development of standard intermediate frequencies for analogue TV receivers, although the information is by no means complete, it is now possible to provide a reasonable overview, as follows.

The starting point is the USA in 1945, when the RMA developed its initial standard TV IF, following the FCC’s revised allocation of the VHF channels. In fact the initial standard IF was a range, with VIF 25.75 to 26.4 MHz and SIF 21.25 to 21.9 MHz. (Why a range was specified is unknown, but presumably the RMA saw that anywhere in the range would be satisfactory, and so allowed the receiver makers some flexibility.)

In practice this original IF, which came to be known as the “low” IF, was not fully satisfactory on the basis of field experience. Further study of the problem by an RMA subcommittee resulted in a new “high” IF of 45.75 MHz VIF and 41.25 MHz SIF being recommended, this becoming standard in 1950. An IF higher than 40 MHz was possible because in 1948 the FCC had deleted channel 1, 44 to 50 MHz.

The “high” 45.75/41.25 MHz IF was developed to minimize possible interferences for the established VHF channels. It was then used by the FCC as part of its basis for geographic allocation of UHF channels. In the UHF case, local oscillator frequencies and images were inevitably in-band, which meant that certain channel combinations needed to be avoided, both for transmitters within the same service area and for those in adjacent service areas. These exclusions were known in the USA as the “UHF Taboos”. This approach was practicable only with an agreed standard IF.
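As an aside, the taboo offsets fall straight out of the arithmetic. A minimal sketch (my own illustration, not from any standards document; US UHF channel 14 starts at 470 MHz, channels are 6 MHz wide, tuning is supradyne):

```python
# Illustrative sketch (mine, not from the thread): with the US "high" IF
# (45.75 MHz vision) and the oscillator above the signal, the local
# oscillator and image of a wanted UHF channel always land a fixed number
# of channels up, which is what generated the "UHF Taboo" exclusions.

CH14_EDGE = 470.0   # MHz, lower edge of US UHF channel 14
WIDTH = 6.0         # MHz, channel width
VIF = 45.75         # MHz, vision IF

def vision_carrier(ch):
    """Vision carrier frequency of US UHF channel ch (14 upward)."""
    return CH14_EDGE + (ch - 14) * WIDTH + 1.25

def channel_of(freq):
    """UHF channel whose 6 MHz slot contains freq."""
    return 14 + int((freq - CH14_EDGE) // WIDTH)

wanted = 20
lo = vision_carrier(wanted) + VIF          # supradyne local oscillator
image = vision_carrier(wanted) + 2 * VIF   # image of a co-channel vision carrier

print(channel_of(lo) - wanted)     # oscillator lands 7 channels up
print(channel_of(image) - wanted)  # image lands 15 channels up
```

Whatever wanted channel is plugged in, the offsets are the same, which is why a single agreed IF made a fixed table of exclusions workable.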

The US experience effectively set a precedent, which was that the standard IF should be as close to the bottom edge of Band I as practicable given the other constraints, and that it should be set to minimize possible interferences in the VHF channels. Then UHF channel allocations would be made, with taboos established on the basis of the standard IF.

The US “high” IF apparently became the norm for all other territories that adopted the TV systems later known as CCIR systems M and N, and using the US channelling system, with the exception of Japan, which had its own channelling system and own standard IF for system M.

European-area VHF channel assignments – including some already in use – were defined at the ITU 1952 Stockholm meeting (ST52). Going into that meeting, Italy was the only country that already had a defined standard IF, with VIF 45.75 MHz and SIF 40.25 MHz. This was associated with its own VHF channelling system, different to that used elsewhere in Europe. The majority view at that meeting was that an IF channel as close as reasonably possible to the bottom edge of Band I was desirable, this following the US precedent.

Thus in the period following ST52, and mostly in 1954-55, there emerged a set of standard IFs that generally aligned with this precept. These took account of the channelling plans that came out of ST52, which were not quite the same as those that were taken into that meeting. These standard IFs were:

Western Europe generally, systems B, C and F: 38.9 MHz VIF, 33.4 MHz SIF
UK, system A: 34.65 MHz VIF, 38.15 MHz SIF
France, system E: 28.05 MHz VIF, 39.2 MHz SIF
USSR and Eastern Europe, system D: 34.25 MHz VIF, 27.75 MHz SIF
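As a cross-check (my own arithmetic, set down in Python for convenience): a supradyne oscillator inverts the spectrum but preserves the carrier spacing, so each VIF/SIF pair above must differ by exactly the system's off-air vision-to-sound spacing.

```python
# Cross-check: |VIF - SIF| must equal the off-air vision-to-sound spacing
# for each system (supradyne conversion inverts the spectrum but preserves
# the spacing). Spacings: B/C/F 5.5, A 3.5, E 11.15, D 6.5 MHz.

standard_ifs = {
    "B/C/F (W. Europe)":  (38.90, 33.40, 5.50),
    "A (UK)":             (34.65, 38.15, 3.50),
    "E (France)":         (28.05, 39.20, 11.15),
    "D (USSR/E. Europe)": (34.25, 27.75, 6.50),
}

for system, (vif, sif, spacing) in standard_ifs.items():
    # round() guards against binary floating-point residue
    assert round(abs(vif - sif), 2) == spacing, system
    print(f"{system}: spacing {round(abs(vif - sif), 2)} MHz")
```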

As best I can determine, the 38.9/33.4 MHz combination was also used elsewhere in the world where system B was chosen, with the only known exception being Australia.

Noted previously is that ad hoc arrangements were usually made for multistandard receivers, often incorporating some elements from the standard values. But none of these seems to have achieved the status of an actual standard, so although part of this thread, they are not part of this particular posting.

In Japan, TV service started in 1953, using system M, and with a unique VHF channelling system that used Bands II and III, but not Band I. For reasons unknown, although possibly to do with receiver cost, a “low” standard IF was chosen, namely 26.75 MHz VIF, 22.25 MHz SIF. This looks as if it were derived from the earlier US “low” IF, adjusted to suit the Japanese channels.

In 1957, Australia adopted system B with its own, somewhat different set of VHF channels that used Band II as well as Bands I and III. At the same time, a compatible standard IF was developed, 36.0 MHz VIF, 30.5 MHz SIF.

European UHF channel assignments were made at the ITU 1961 Stockholm meeting (ST61). Following the US precedent, the allocations took into account the receiver IFs that would be used.

Thus the established system B combination of 38.9/33.4 MHz was also used for systems G and H generally. For Italy, both this IF and the existing standard 45.75/40.25 MHz were used. For system I, the UK proposed a range of 39.0-39.5 MHz VIF, 33.0-33.5 MHz SIF, and Ireland proposed 39.0/33.0 MHz. Following (or perhaps during) ST61, both settled upon 39.5/33.5 MHz. For Russia and the eastern bloc, both the existing standard 34.25/27.75 MHz and a new 38.0/31.5 MHz combination were considered. The numbers for the French system L were 32.7 MHz VIF and 39.2 MHz SIF.

In the Italian and Eastern European cases, the older IFs were phased out in favour of the new over several years. In the Italian case the motivation would have been alignment with European practice. In the Eastern European case, there may have been a preference for an IF that was closer to those of Western Europe and in the same bracket for UHF channel allocation purposes.

The ITU 1963 Geneva African VHF-UHF planning meeting produced a new IF, namely 40.2 MHz VIF, 33.7 MHz SIF, developed for system K1 receivers, along with a matching VHF channelling plan, for the French Outre-Mer territories. This would also have been used in working out the system K1 UHF channel allocations. Presumably 38.9/33.4 MHz was used for systems B, G and H. What was used for system I is unknown, although from an allocation viewpoint, VIFs of both 38.9 and 39.5 MHz would have been in the same bracket. South Africa chose 38.9/32.9 MHz for system I, but whether that was done at the 1963 Geneva meeting, or later on when it introduced its TV service, is unknown.

Circa 1970 there were two changes, one major, one minor.

In Japan, the standard “low” IF had been found to be unsuitable for use with the UHF channels in particular, so a new standard of 58.75 MHz VIF, 54.25 MHz SIF was introduced. This was the highest of any of the standard IFs, and progressively replaced the older standard.

In Australia, a slightly different alternative IF of 36.875 MHz VIF, 31.375 MHz SIF was introduced as a way to combat specific interference problems that had arisen with the 36.0/30.5 MHz IF. This was the only standard IF whose frequencies were defined to three places after the decimal point, although one assumes that the intent was a precision of 2½ places, not 3.

In 1992, the ITU recorded that for system K1, IFs of 39.9 MHz VIF, 33.4 MHz SIF and 32.7 MHz VIF, 39.2 MHz SIF (the French system L standard) were in use as well as the original 40.2/33.7 MHz. Improbable as it may seem, system K1 was possibly the only system with three standard IFs to its name.

A major unknown is the China case. The evidence suggests that in latter years, the Russian 38.0/31.5 MHz IF was used. But it seems unlikely that this was in place back in 1957, when China started its TV service. The 37.0 MHz VIF/30.5 MHz SIF combination does occur in references to China. Perhaps this was the original number, calculated to suit its VHF channelling plan, which was different to the Russian plan. Resolving that is a work-in-progress. But if 37.0/30.5 MHz was a standard, then systems D and K would also have had three standard IFs associated with them.

Cheers,

Steve P.

Topic starter Posted : 19/12/2018 2:27 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posted by: Synchrodyne

South Africa chose 38.9/32.9 MHz for system I, but whether that was done at the 1963 Geneva meeting, or later on when it introduced its TV service, is unknown.

In fact the 38.9 MHz VIF, 32.9 MHz SIF option for system I was considered in the ST61 deliberations.

It was included in the Belgian submission to ST61:

ITU ST61 Doc 7 E Add 02 Sec 1 Belgium

Going into ST61, Belgium was undecided as to whether it would use system H or system I for its UHF TV service. Thus both cases were covered. The desired IFs were thus 38.9/33.4 MHz for system H and 38.9/32.9 MHz for system I.

It is reasonable to assume that the 38.9/32.9 MHz combination was technically acceptable. Thus it would have been one of the options available to South Africa when it came to make its choice. If, as previously suggested, the UK 39.5/33.5 MHz choice was tailored to the specific needs of UK dual-standard receivers, then it probably offered no technical advantage in South Africa.

In the Belgian case, where multistandard receivers were the norm, it may have been thought that using the same VIF for system I as already established for system B resulted in the simplest receivers. Whatever VIF was chosen, the intercarrier frequencies would be different, so that intercarrier sound channels in any event would have to cater for both the 5.5 and 6.0 MHz cases. That being the case, the VIF rather than the SIF presented the opportunity for commonality.

The eventual Belgian decision to adopt system H avoided the receiver complexities that would have arisen with system I. But in the UHF era, Belgian receivers also had to be capable of handling French system L transmissions. Here the solution was to use an SIF of 33.4 MHz, common with that of systems B, C, F and G/H. This put the system L VIF at 39.9 MHz. The AM sound channel was already dual-frequency, 27.75 MHz for system E and 33.4 MHz for systems C and F, so perhaps fitting in a 3rd frequency for system L (32.4 MHz) was seen as more difficult than moving the VIF out to 39.9 MHz, and accepting the need for switchable Nyquist filters in the vision channel.

Before the UHF era, the system E IFs in Belgian multistandard receivers were usually 38.9 MHz VIF and 27.75 MHz SIF. In the UHF era, these were sometimes moved up to 39.9 MHz VIF and 28.75 MHz SIF.

Once the vision IF channel had to cater for Nyquist slopes with -6 dB points over both 38.9 and 39.9 MHz, both were available for use with system E. The Nyquist slope over 39.9 MHz, intended for system L with its 1.25 MHz vestigial sideband, was more favourable for system E (which had a 2.0 MHz vestigial sideband) than that over 38.9 MHz, intended for systems with a 0.75 MHz vestigial sideband.

Cheers,

Steve P.

Topic starter Posted : 25/12/2018 11:57 pm
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Reverting to early US practice, the attached item, from “Electronics” magazine for 1946 January, is the earliest reference to the initial standard TV IF that I have so far found.

Electronics 194601 p.272 RMA IF Proposals FM & TV


The RMA was reported as having proposed that the standard sound IF for TV receivers be in the range of 21.25 to 21.9 MHz, and furthermore that the oscillator frequency be on the high side of the signal frequency (supradyne). Soon thereafter – although as yet I have not determined exactly when – it was formalized as a standard.

The FCC had announced the reshuffle of VHF frequency assignments on 1945 June 27, with a new set of VHF TV channels occupying the bands 44-50, 54-72, 76-88 and 174-216 MHz. Thus the RMA had moved quickly to establish a suitable TV IF before any stations started using the new channels. The cutoff date for inclusion in a 1946 January publication was probably late November or early December 1945, so the relevant sub-committee was convened and its work done within, say, five months.

That the RMA nominated the sound IF and a range rather than a single frequency looks unusual as compared with later practice.

At the time, though, the choice of the sound IF was logical, given that it was before the advent of the intercarrier technique (https://www.radios-tv.co.uk/community/black-white-tvs/intercarrier-sound/#post-66058), so that the sound IF channel was the most sharply tuned part of the receiver, whereas the vision IF channel was relatively broadly tuned. Thus the sound IF was more amenable to precise definition.

Whilst I have not seen an explanation for the range 21.25-21.9 MHz, a reasonable deduction is that anywhere within that range was found to produce equally acceptable results.

Of course, definition of the sound IF combined with the supradyne oscillator requirement automatically defined the vision IF range as being 25.75-26.4 MHz. This was not mentioned in the Electronics magazine item. Rather the upper channel limits were said to be in the range 26.5 to 27.15 MHz.
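For what it is worth, the arithmetic is easily verified (my own sketch; system M places the sound carrier 4.5 MHz above vision, and the 0.75 MHz figure is the vestigial sideband edge):

```python
# Sketch of the arithmetic (mine): system M puts the sound carrier 4.5 MHz
# above vision, and supradyne conversion inverts the spectrum, so
# VIF = SIF + 4.5. The quoted "upper channel limits" then follow as
# VIF + 0.75 MHz, i.e. the vestigial sideband edge of the IF channel.

SOUND_SPACING = 4.5  # MHz, system M vision-to-sound spacing
VSB = 0.75           # MHz, vestigial sideband

sif_range = (21.25, 21.9)
vif_range = tuple(round(s + SOUND_SPACING, 2) for s in sif_range)
upper_limits = tuple(round(v + VSB, 2) for v in vif_range)

print(vif_range)     # (25.75, 26.4)
print(upper_limits)  # (26.5, 27.15)
```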

That first RMA standard TV IF, later referred to as the “low” IF, was significant in several ways:

1. It was the first TV IF developed and promulgated by a standards-issuing organization of country-wide significance.

2. It turned out to be not fully satisfactory, and led to the development of a new standard “high” IF, just below the lowest VHF TV channel, which then set the pattern that was adopted for most cases worldwide.

3. It spawned the early use of the 25.75/20.25 MHz combination in Italy before the 45.75/40.25 MHz combination was standardized.

4. The initial Japanese standard IF of 26.75/22.25 MHz, which lasted until c.1970, was evidently a development of the first US standard.

Cheers.

Steve

Topic starter Posted : 26/01/2019 2:20 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Insight into the choice of intermediate frequencies for four-standard TV receivers (for use in Belgium and border areas) is provided by an article in Philips Technical Review 1955 December.

38.9 MHz vision IF (VIF) was chosen because it was the best for systems B, C and F. By then it was already the standard for system B in Western Europe. So that was a case of “majority” rule.

Use of 38.9 MHz VIF for system E resulted in a sound IF (SIF) of 27.75 MHz, as compared with 33.4 MHz for systems B, C and F.

Having two SIFs was considered to be problematical at the time. Also, neither frequency was ideal when it came to providing stable and economical FM demodulation for system B. The answer was a second conversion to 7.0 MHz, which was satisfactory for FM demodulation, and was deemed to be the most suitable number.

Philips did not provide background detail on the 7.0 MHz choice, but one may see that in general it would not cause any harmonic difficulties, as all harmonics would fall on the channel boundaries for systems B, C and F, and in Band III, where the system E channels of specific interest lay, any harmonics would probably be too weak to worry about.

For conversion from 33.4 to 7.0 MHz, supradyne, with oscillator at 40.4 MHz, was chosen over infradyne, with oscillator at 26.4 MHz. With 40.4 MHz, only one harmonic, the 5th at 202 MHz, fell in-band, and then in a position which did no harm to any of the system B/C/F or system E channels. Also, 40.4 was the lower adjacent sound IF, for which a rejection filter was provided in the vision IF strip.

For conversion from 27.75 to 7.0 MHz, infradyne operation, with the oscillator at 20.75 MHz, was necessary, as in the supradyne case the oscillator would have been at 34.75 MHz, and so within the IF channel. The 20.75 MHz oscillator was only in operation during reception of system E. And these four-standard receivers were confined to the Band III channels for system E, with French channel 8A being of most interest. Philips noted that the 9th harmonic of 20.75 MHz, at 186.75 MHz, was just above the channel F8A vision carrier at 185.25 MHz. This was nominally within the 2 MHz wide vestigial sideband, but the four-standard receiver used a 0.75 MHz vestigial sideband for all four systems. Also, that 9th harmonic, when translated to IF, would come out at 40.4 MHz, at which frequency there was the lower adjacent sound rejector.

4 Std TV Tx Vision Bandpass from PTR 195512 p.164

Not mentioned by Philips, but the 8th and 10th harmonics of 20.75 MHz fell at 166 and 207.5 MHz respectively. The 8th harmonic was within the main vision sideband for channels F5 and F6, but I don’t think that F5 was of much interest for cross-border reception. In the F6 case, the harmonic corresponded to a video frequency of 7.4 MHz, not really a problem when it is considered that the four-standard receivers had a rather truncated video bandwidth for system E. The 10th harmonic might have been troublesome for channels F11 and F12, but I don’t think that these were in use back in the mid-1950s.
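For anyone wanting to repeat the harmonic bookkeeping, a small sketch (my own; the French Band III extent is taken here as roughly 162 to 216 MHz, which is an assumption):

```python
# Harmonic bookkeeping sketch (mine). Which harmonics of each
# second-conversion oscillator land in the French Band III range?
# The band edges used (162 to 216 MHz) are my assumption.

BAND = (162.0, 216.0)  # MHz, approximate French Band III extent

def in_band_harmonics(f_osc, band=BAND):
    """All (n, n * f_osc) harmonics falling inside the band."""
    lo, hi = band
    n_min = int(lo // f_osc) + 1
    n_max = int(hi // f_osc)
    return [(n, round(n * f_osc, 2)) for n in range(n_min, n_max + 1)]

print(in_band_harmonics(40.4))   # supradyne osc., 33.4 -> 7.0 MHz conversion
print(in_band_harmonics(20.75))  # infradyne osc., 27.75 -> 7.0 MHz conversion
```

The output reproduces the harmonics discussed above: only the 5th of 40.4 MHz (202 MHz) for the supradyne oscillator, and the 8th, 9th and 10th of 20.75 MHz (166, 186.75 and 207.5 MHz) for the infradyne one.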

Nonetheless, to minimize the chance of the sound second conversion local oscillator output getting into the vision IF channel, a buffer amplifier stage was provided between the sound take-off point and the second conversion stage. This had to work at both 27.75 and 33.4 MHz.

Thus we now have the rationale for the choice of a 7.0 MHz second SIF used in some four-standard receivers.

As noted in previous postings, this was an initial position for Philips. To recap:

It is known that one or more other setmakers chose 11.8 MHz as the second SIF, although it is not known who they were.

In due course Philips abandoned the second conversion, instead using intercarrier sound (at 5.5 MHz) for system B, and the 33.4 and 27.75 MHz SIFs for systems C, F and E respectively. This required a three-frequency sound IF strip, but evidently that was not too daunting.

With the arrival of UHF transmissions, systems G/H and L had to be catered for. Systems G/H were no problem, requiring exactly the same IF processing as system B. At least for some of its receivers, Philips accommodated system L by using a VIF of 39.9 MHz and an SIF of 33.4 MHz, the latter coincident with that of systems B, C and F. At the same time, the system E VIF was moved to 39.9 from 38.9 MHz, with the SIF thus moving to 28.75 from 27.75 MHz.

Cheers,

Steve

Topic starter Posted : 20/07/2019 6:02 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Returning to early US practice, this item in Radio Craft 1946 October noted that before the arrival of the 1945 standard IF, TV receiver IFs in the USA were 8.25 MHz sound and 12.75 MHz vision. These numbers apparently go back to the pre-NTSC 525/60 days, when the RMA 441/60 proposal was extant.

Radio Craft 194610 p.58 US TV IF

The initial RMA 441/60 proposal (1938) had used a 6 MHz channel, as determined by the FCC, with double-sideband vision with a carrier at 2.5 MHz above the channel lower edge and 2.5 MHz sidebands. The sound carrier was 3.25 MHz above the vision carrier, putting it 0.25 MHz below the upper edge. The FCC had allocated 19 6-MHz-wide VHF TV channels for experimental purposes, not all contiguous, between 44 and 294 MHz, although only the 7 between 44 and 108 MHz were of interest in the late 1930s.

To receive those 7 channels with the technology of the day, it was reckoned that an IF channel in the 7 to 15 MHz range was desirable. Having the vision carrier at the upper end would provide a more favourable frequency-to-bandwidth ratio for the vision IF channel. It was also anticipated that despite the double sideband transmission, receivers would use single-sideband reception, with a sharp cutoff at the upper end of the IF channel, passing through the vision carrier at the -6 dB point. This would help to get the vision carrier at as high a frequency as possible within the acceptable range. Significant constraints were the existence of amateur bands in the 7 and 14 MHz regions, these thought likely to be a source of interference. Thus a vision IF of 13.0 MHz was chosen, with a concomitant sound IF of 9.75 MHz.

The RMA 441/60 proposal was amended when it was realized that vestigial sideband transmission would be possible. For the final proposal (1939), the 6 MHz channel was retained, with the vision carrier 1.25 MHz above the lower edge, accommodating a 0.75 MHz vestigial lower sideband. The upper sideband was 4 MHz wide, and the sound carrier was placed at 4.5 MHz above the vision carrier, so still 0.25 MHz below the channel upper edge.

To suit the modified channel internal dimensions, the suggested vision IF was moved down slightly from 13.0 to 12.75 MHz, this allowing for a gentler slope across the vision carrier to suit vestigial sideband operation. This put the sound carrier at 8.25 MHz, which in turn meant that the lower adjacent channel sound rejection point became 14.25 MHz, allowing good rejection in the 14 MHz region.
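The internal consistency of those numbers is easy to check (my own sketch, not from the magazine item):

```python
# Consistency check (mine): system M sound sits 4.5 MHz above vision,
# channels are 6 MHz wide, and supradyne conversion inverts the spectrum,
# so the lower-adjacent channel's sound carrier comes out ABOVE the VIF.

VIF = 12.75
SIF = VIF - 4.5                       # sound IF after inversion
lower_adj_sound = VIF + (6.0 - 4.5)   # lower-adjacent sound, translated to IF

print(SIF)              # 8.25 MHz
print(lower_adj_sound)  # 14.25 MHz, where the rejection point sat
```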

The FCC TV channel allocations were subject to some changes, although not right away. The lowest, 44 to 50 MHz, was deleted in mid-1940 when the 42 to 50 MHz range was assigned to FM broadcasting. (The 44 to 50 MHz channel was returned in mid-1945, but only for a brief interval). At the same time, there were some adjustments made to other channels, with the net result that there were 18 channels between 50 and 294 MHz, with only the 6 between 50 and 108 MHz being of immediate interest. Evidently these changes did not require reconsideration of the receiver IF.

The definitive NTSC 525/60 TV standard was issued in 1941. It retained the same 6 MHz channel as for the RMA 441/60 system, that being a fixed input parameter for the NTSC deliberations. The FCC channel assignments were unchanged from 1940, so there were still 18 channels between 50 and 294 MHz, of which the 6 between 50 and 108 MHz were of initial interest.

Given that the channel dimensions and the channel assignments, including the subset of interest, had not changed between the 1939 RMA proposal and the 1941 NTSC system, evidently there was no need to reconsider the IF case. Thus the 8.25/12.75 MHz combination was used, I think by the majority of those who produced TV receivers in the short period before other events of worldwide significance supervened.

I cannot trace that the 8.25/12.75 MHz IF was ever standardized by the RMA. Before the NTSC standard was issued, the RMA TV standard was experimental only, and it seems unlikely that the RMA would have standardized an IF under those circumstances. Perhaps some thought was given to the matter in 1941, after the NTSC TV standard was issued. Logically the RMA would have been concerned about the future use of the higher frequency channels, for which the existing IF would have been less suitable. But one can imagine that the matter was put on the back burner around the end of 1941. And by the time it was appropriate to reconsider it (1945), the FCC was planning a full rework of its TV and FM frequency assignments, and the upper VHF channels were by then of interest.

Cheers,

Steve

Topic starter Posted : 02/09/2019 6:26 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posted by: @synchrodyne

In Japan, TV service started in 1953, using system M, and with a unique VHF channelling system that used Bands II and III, but not Band I. For reasons unknown, although possibly to do with receiver cost, a “low” standard IF was chosen, namely 26.75 MHz VIF, 22.25 MHz SIF. This looks as if it were derived from the earlier US “low” IF, adjusted to suit the Japanese channels.

Upon further reflection, I think that it may be seen that the original Japanese TV IF choice, 26.75 MHz VIF, 22.25 MHz SIF, was not simply a slightly modified carryover of earlier American practice, but was determined as the best fit, at the time, for the chosen VHF channelling plan in Japan.

Japan did not use Band I for broadcasting purposes. Its VHF TV channels were accommodated in part of Band II (90 to 108 MHz; the Region 3 full allocation being 87 to 108 MHz) and an upwardly extended Band III (eventually 170 to 222 MHz, the Region 3 basic allocation being 170 to 200 MHz).

The Band II channel allocation probably ruled out using the then newer American numbers of 45.75/41.25 MHz. In part, this would have been because the second harmonic of the VIF, at 91.5 MHz, was only just above the channel J1 vision carrier, at 91.25 MHz.

With the older American numbers of 25.75/21.25 MHz, the 4th harmonic of the VIF fell at 103 MHz, very close to the channel J3 vision carrier at 103.25 MHz, so that would not have worked either. That indicated that a unique solution was required.

At the time, going somewhat higher than the 40 MHz range was probably considered less desirable with the consumer-level IF filter technology then available. A VIF just above 36 MHz would have had a 3rd harmonic just above 108 MHz, the top edge of channel J3. But then the 3rd harmonic of the SIF would have been somewhere close to the sound carrier of channel J1. So that option did not look so good, either.

On the other hand, the problem could be solved by taking the American “low” IF and moving it up by 1 MHz, to 26.75/22.25 MHz. Then the 4th harmonic of the VIF fell at 107 MHz, within channel J3, but in a position, at the upper end of the vision main sideband, and still 0.75 MHz away from the sound carrier, where it could do little harm. Then the 5th harmonic of the SIF fell at 111.25 MHz, well out-of-band, whilst the 4th fell at 89 MHz, just below the bottom edge of channel J1. That looks to have been a fairly tight fit. The “next stop” in the upward direction was probably a VIF 4th harmonic of 108.5 MHz, 0.75 MHz above the channel J3 sound carrier. That meant a VIF of 27.125 MHz, and an SIF of 22.625 MHz. The 4th harmonic of that was at 90.5 MHz, probably too close to the channel J1 vision carrier at 91.25 MHz, bearing in mind that receiver vision channel selectivity was typically much milder than sound channel selectivity.
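The candidate screening described above can be sketched as a few lines of arithmetic (my own illustration; Band II vision carriers per the Japanese plan, J1 91.25, J2 97.25, J3 103.25 MHz):

```python
# Sketch of the screening above (my own arithmetic). For each candidate
# vision IF, find the harmonic that comes closest to a Japanese Band II
# vision carrier.

J_VISION = {"J1": 91.25, "J2": 97.25, "J3": 103.25}  # MHz

def worst_hit(f_if, carriers=J_VISION):
    """(offset, channel, harmonic n) of the closest harmonic approach."""
    hits = []
    for ch, fc in carriers.items():
        n = round(fc / f_if)  # harmonic nearest this carrier
        hits.append((round(abs(n * f_if - fc), 2), ch, n))
    return min(hits)

print(worst_hit(25.75))  # US "low" VIF: 4th harmonic only 0.25 MHz off J3
print(worst_hit(26.75))  # Japanese choice: nothing closer than 3.75 MHz
```

The old US number fails on exactly the J3 clash described above, while the 1 MHz shift opens the worst case out to a comfortable margin.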

Such a low IF was less suitable for the UHF channels. Here, a higher IF was desirable, say in or close to the 40 MHz range, to allow adequate receiver image rejection. The wide nature of the UHF band meant that whatever practicable IF was chosen, there would be in-band interference possibilities. But these were dealt with in the geographical channel allocation process, in that “taboo” situations were simply avoided.

Given that the Japanese VHF channel allocation effectively precluded an IF in the 40 MHz range, then something even higher was needed. And by the late 1960s, when solid-state signal circuitry was available and widely used, such was much less of a challenge in consumer-level products. The chosen VIF of 58.75 MHz had a 2nd harmonic of 117.5 MHz, well above the top edge of Band II. The 3rd harmonic was at 176.25 MHz, 0.5 MHz above the sound carrier of channel J4, and 1 MHz below the vision carrier of channel J5. The 4th harmonic, at 235 MHz, was well above the top edge of Band III. The SIF was at 54.25 MHz, with a 2nd harmonic at 108.5 MHz, 0.75 MHz above the sound carrier of channel J3. The 4th harmonic was at 217 MHz, very close to the 217.25 MHz vision carrier of channel J12.

This last item seems incongruous, but perhaps it was thought that in the solid-state era, adequate screening, sufficient to avoid any problems, was more easily obtained. Also, strong IF harmonics tended to be generated mostly in the demodulation process, particularly from traditional AM envelope demodulators and FM discriminators and ratio detectors, and against that, for TV systems with FM sound, the SIF as such was seldom demodulated. Rather it was simply translated to a lower frequency, typically the 4.5 MHz intercarrier, but also 10.7 MHz in later Japanese split-sound practice. Also, on-chip demodulation tended to reduce the spread of IF harmonics. That 10.7 MHz could be used in a TV receiver environment was indicative of the improvements obtained in screening, etc. Recall that in the 1950s, when a second sound conversion was required, the 2nd IF had to be carefully chosen to avoid potential interferences, so was typically an unusual number.

In summary, it could be said that the Japanese Band II TV channel allocation effectively precluded the use of “30 MHz” and “40 MHz” IFs, and more-or-less forced the use of “20 MHz” or “50 MHz” types. In the early days, “20 MHz” was used. Later, when the use of the UHF channels indicated and technology allowed, “50 MHz” was used. This was possible in Japan because of the lack of Band I TV channels. Also, the initial 26.75/22.25 MHz IF channel was a considered choice, and not simply a carryover, with minor change, of earlier American practice.

Cheers,

Steve

Topic starter Posted : 23/06/2021 9:06 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Continuing with the Japan case, for the Band II channels, local oscillator frequencies for the 26.75/22.25 MHz IF were all above the band upper edge, the band being 18 MHz wide.

In the Band III case, where the band was 52 MHz wide, local oscillator frequencies were in-band for the lower channels, J4 through J8.

The local oscillator frequency for channel J4 (vision carrier 171.25 MHz) was 198 MHz. This was at the division point between channels J8 (192 to 198 MHz) and J9 (198 to 204 MHz), so it should not have been troublesome. And that pattern was repeated, one channel up in each case, for channels J5 through J7. But the channel J8 (193.25 MHz vision carrier) local oscillator was at 220 MHz, 2.75 MHz above the channel J12 vision carrier, where it might have been somewhat troublesome.
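Those oscillator figures follow directly from supradyne tuning, f_LO = vision carrier + VIF. A sketch (my own; the vision carriers for J10 and J11 are inferred from the 6 MHz spacing, so treat those two as assumptions):

```python
# Sketch (mine): supradyne tuning puts the oscillator at
# f_LO = vision carrier + VIF. Carriers follow the Band III plan described
# in the post; J10 and J11 are inferred from 6 MHz spacing (assumptions).

VIF = 26.75                # MHz, original Japanese vision IF
BAND_III = (170.0, 222.0)  # MHz, extended Japanese Band III

vision = {"J4": 171.25, "J5": 177.25, "J6": 183.25, "J7": 189.25,
          "J8": 193.25, "J9": 199.25, "J10": 205.25, "J11": 211.25,
          "J12": 217.25}

for ch, fc in vision.items():
    lo = fc + VIF
    flag = "in-band" if BAND_III[0] <= lo <= BAND_III[1] else "out-of-band"
    print(f"{ch}: LO = {lo:.2f} MHz ({flag})")
```

The run shows J4 through J8 with in-band oscillators (198 through 220 MHz) and J9 upward out-of-band, matching the description above.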

An interesting aspect of the Japanese Band III allocations, though, was that channels J4 through J7 occupied 170 through 194 MHz, whereas channels J8 through J12 occupied 192 through 222 MHz. Thus there was an unusual 2 MHz overlap between channels J7 and J8. That would have introduced some complications in respect of the geographical assignments of those channels.

I have not seen a reason stated for that overlap. As mentioned, the original ITU 1947 Atlantic City Region 3 Band III assignment was 170 to 200 MHz, although China (but not Japan) had an upward extension to 216 MHz from the start. The ITU 1959 Geneva allocations show Region 3 Band III at 170 to 216 MHz, with Japan having an upward extension to 222 MHz. Exactly when these changes were made is unknown, and whether, in the Japan case, the upper limit went from 200 to 222 MHz in one move, or was initially moved to 216 MHz and then to 222 MHz is also unknown. It appears that channels J4 through J7 were positioned successively upwards from the 170 MHz bottom edge, and the higher channels successively downwards from a 216 or 222 MHz upper edge, resulting in the aforementioned 2 MHz overlap. This may have been done to maximize the number of channels within Band III, albeit at the expense of the overlap that reduced the utility of channels J7 and J8. That this allocation helped reduce the effects of in-band local oscillator interference for some of the channels may have been fortuitous, but it is also possible that it was a partial cause and not just an effect.

The introduction of the 58.75/54.25 MHz IF ensured that all Band III channel local oscillator frequencies were out of band.
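That out-of-band claim is easy to verify arithmetically. A minimal sketch, assuming supradyne conversion (LO = vision carrier + VIF) and the 6 MHz channel raster implied by the carriers quoted above:

```python
# Check that the 58.75 MHz vision IF puts every Japanese Band III local
# oscillator above the 222 MHz upper band edge (supradyne conversion).
VIF = 58.75  # MHz, assumed supradyne: LO = vision carrier + VIF

# Vision carriers for channels J4..J12 (MHz), per the allocations above
vision = {"J4": 171.25, "J5": 177.25, "J6": 183.25, "J7": 189.25,
          "J8": 193.25, "J9": 199.25, "J10": 205.25, "J11": 211.25,
          "J12": 217.25}

band_iii = (170.0, 222.0)
for ch, vc in vision.items():
    lo = vc + VIF
    print(f"{ch}: LO = {lo:.2f} MHz, in Band III: {band_iii[0] <= lo <= band_iii[1]}")
# Lowest LO is J4 at 230.00 MHz, already above the 222 MHz upper edge.
```

Every local oscillator frequency comes out above 230 MHz, clear of the band by at least 8 MHz.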

Cheers,

Steve

ReplyQuote
Topic starter Posted : 24/06/2021 8:14 am
Nuvistor liked
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

The Russian/Eastern European system D/K IF change is also an interesting case. The early numbers were 34.25 MHz VIF, 27.75 MHz SIF. But a moderate upward shift, to 38.0/31.5 MHz, was made quite early on. At least it was in play in the lead-up to the ITU 1961 Stockholm European UHF allocations meeting (ST61), although its actual implementation may have been spread over quite a few years. One may wonder why such a small change was made. I have not seen a reason for this, and can only speculate. Something that comes to mind is that for UHF channel assignment planning, it might have been desired to have the IF in approximately the same position as the CCIR standard IF of 38.9/33.4 MHz. That way, the co-siting channel exclusions would be similar to those that obtained for the majority of Western Europe. On the other hand, with the established 34.25/27.75 MHz IF, they would have been somewhat different.

Whether the original IF was in some ways unsatisfactory is unknown. At least one may do some “back-of-the-envelope” calculations. These show that it appears to have been a fit to the Band I and Band II (76 to 100 MHz) channels. The 34.25 MHz VIF had a 2nd harmonic at 68.5 MHz, above channel R2 (originally O2), and a 3rd harmonic at 102.75 MHz, above channel R5. The 5th harmonic, at 171.25 MHz, fell below the bottom edge of Band III, 174 MHz. The 27.75 MHz SIF had a 2nd harmonic at 55.5 MHz, which was a relatively harmless position in channel R2, 0.75 MHz below the sound carrier. The 3rd harmonic, at 83.25 MHz, was also relatively harmless, at 0.5 MHz below the sound carrier of channel R3.

The local oscillator frequency for channel R1 was at 84 MHz, which was the boundary between channels R3 and R4. For channel R2 it was at 93.5 MHz, just 0.25 MHz above the vision carrier of channel R5, so potentially troublesome, and perhaps requiring that channels R2 and R5 not be allocated in the same area or in adjacent areas. For channels R3 through R5, the local oscillator frequencies were out of band.

In Band III, the channel R6 local oscillator frequency of 209.5 MHz was perhaps troublesome for channel R10, being 2.25 MHz above the vision carrier at 207.25 MHz. Similarly channels R7 and R8 might have adversely affected channels R11 and R12. So there were probably some Band III channel pairings that needed to be avoided in co-siting or adjacent-siting situations.

Now looking at the new IF, for the 38.0 MHz VIF, the 2nd harmonic was at 76 MHz, the bottom edge of channel R3, the 3rd out-of-band at 114 MHz, and the 5th at 190 MHz, which was the boundary between channels R7 and R8. So it looked to be clear of harmonic problems.

The 31.5 MHz SIF had a 2nd harmonic at 63.0 MHz, 3.75 MHz above the channel R2 vision carrier (59.25 MHz), so possibly mildly troublesome. The 3rd was at 94.5 MHz, 1.25 MHz above the channel R5 vision carrier (93.25 MHz), so probably more troublesome, although SIF harmonics seem to have been less problematic overall than VIF harmonics. The 5th harmonic, at 157.5 MHz, was well below the bottom edge of Band III.

The channel R1 local oscillator at 87.75 MHz was within channel R4, 2.5 MHz above the vision carrier (85.25 MHz). For channel R2 it was at 97.25 MHz, within channel R5, but at a not very harmful position, 4 MHz above the vision carrier (93.25 MHz).

For Band III channels R6, R7 and R8, their local oscillators fell harmlessly 0.5 MHz below the sound carriers of channels R10, R11 and R12, respectively.

Thus one could say that the move to 38.0/31.5 MHz offered a somewhat improved situation in Band III, but just a different set of potential conflicts in Bands I and II. It is not impossible, but it seems unlikely, that this by itself was an adequate reason to change. That leaves the UHF requirement as the primary driver.

One could also ask why, if UHF harmonization was an objective, the move was not simply to the established CCIR VIF of 38.9 MHz. But this had, for example, a 2nd harmonic at 77.8 MHz, just 0.55 MHz above the channel R3 vision carrier (77.25 MHz). From that one might deduce that 38.0 MHz was probably the closest approach reasonably possible, and close enough to be in the same bracket from a UHF planning viewpoint.
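The harmonic positions worked through above can be tabulated with a short script. A sketch, assuming the OIRT vision carrier frequencies quoted in this post:

```python
# Harmonics of the old (34.25/27.75 MHz) and new (38.0/31.5 MHz) D/K IF
# pairs, with the signed offset to the nearest OIRT vision carrier.
vision_carriers = {  # MHz, channels R1 through R12
    "R1": 49.75, "R2": 59.25, "R3": 77.25, "R4": 85.25, "R5": 93.25,
    "R6": 175.25, "R7": 183.25, "R8": 191.25, "R9": 199.25,
    "R10": 207.25, "R11": 215.25, "R12": 223.25}

def nearest_carrier(f):
    """Nearest vision carrier to frequency f, and the signed offset."""
    ch = min(vision_carriers, key=lambda c: abs(vision_carriers[c] - f))
    return ch, f - vision_carriers[ch]

for label, f in [("old VIF", 34.25), ("old SIF", 27.75),
                 ("new VIF", 38.0), ("new SIF", 31.5)]:
    for n in (2, 3, 5):
        h = n * f
        if h <= 230:  # only Bands I-III are of interest
            ch, off = nearest_carrier(h)
            print(f"{label} x{n} = {h:6.2f} MHz (vs {ch} vision: {off:+.2f} MHz)")
```

The output reproduces the figures in the text, e.g. the new VIF's 2nd harmonic at 76.00 MHz, 1.25 MHz below the R3 vision carrier, and the new SIF's 3rd harmonic 1.25 MHz above the R5 vision carrier.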

Cheers,

Steve

ReplyQuote
Topic starter Posted : 26/06/2021 12:37 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

I recently found an interesting 1952 IEE paper written by Pye staffers that included comments about IF selection for TV receivers.

The paper is:

“The Design of a Superheterodyne Receiver for Television”, by D.H. Fisher, P.A. Seagrave, and A.J. Watts. It was presented in 1952 May, although evidently written in late 1951.

A key point was that although the UK market receiver in question was for the Band I channels, Pye still chose a “high” IF, namely 35.0 MHz VIF, 38.5 MHz SIF. It argued that any IF lower than 34 MHz allowed the possibility of harmonic interference, as shown in this chart:

Pye Band I IF Harmonics

This 35.0/38.5 MHz combination was not far off the later BREMA standard, of 34.65/38.15 MHz. In the latter case, the slight downwards adjustment was made to minimize harmonic interference in the Band III channels, which had not been a consideration for Pye.

Pye was evidently interested in the export market, judging by these comments in the paper.

“Immediately one considers the conditions pertaining to the choice of an intermediate frequency for receivers to be sold outside the United Kingdom, it is seen that assumptions must be made concerning the channel allocations, carrier spacings and other factors of vital importance. In the United States the frequency allocations have been long fixed for the v.h.f. band, and calculations on this basis can be considered relatively permanent. Future expansion appears to be assigned entirely to the u.h.f. band, where different conditions apply. Canada, Mexico, Central America and even South America have also agreed to use the same standards, so that a given choice of i.f. covers a wide market.

“In European countries it seems fairly certain that the same frequency bands will be adopted, although the spacing between vision and sound carriers will be greater than in America by 1 Mc/s. It is unlikely that one country will operate many stations, and at present a clear frequency-allocation plan does not seem apparent. Also, the American channel 1, i.e. 48-54 Mc/s, has been retained, whereas in America it has been dropped.”

In respect of America, it said:

“Very full consideration of all the factors applying to the American standards leads to the possibility of three intermediate-frequency bands. These bands are 21-27, 33-39, and 40-46 Mc/s. The first band has long been standard with American manufacturers. It has the disadvantage of being open to several forms of interference, mostly due to harmonics. The second channel is much more favourable, while the 40-46-Mc/s band is only open to two possible harmonic combinations which can cause beat signals to appear, but which are most unlikely. This channel has, of course, been made possible by the removal of channel 1 from the carrier allocations, and has the disadvantage of being very close to the vision carrier of channel 2, i.e. 55.25 Mc/s. This is one very good reason why the lowest i.f. has prevailed for so long even against logical argument. It has been standard practice among television tuner designers and manufacturers in America to use triode mixers for purely economic reasons. In this case, instability causes a serious problem if the intermediate and carrier frequencies are too close together. Neutralization is, of course, a solution to the problem, but the change-over to the higher intermediate frequencies has been slow.

“The authors were among the first to use the 40-46-Mc/s band for an American receiver. The results, as expected, were excellent. New valves in the American range make it possible to achieve very good performance at this frequency, and so far very little trouble has been found with any form of interference.”

Apparently double triodes, the 6J6 and the 12AT7 for example, were popular as mixer-oscillators in American practice, whereas the use of a pentode mixer, which would facilitate the use of the high IF, would have required a separate oscillator. Hence the economic reason for staying with the double triode. The first American triode pentodes, the 6X8 (RCA) and 6U8 (Tung-Sol), were announced in the second half of 1951, so appeared about the time that the paper was being drafted. They made it possible to use the high IF without economic penalty. The new IF valves referred to by Pye would have included the 6CB6 of 1950, developed especially for 40 MHz IF use.

For the European situation, Pye said:

“For European transmissions, a band of frequencies close to the British i.f. channel may be chosen. Carrier spacing is 5.5 Mc/s with the 625-line standard system resulting in a vision i.f. of 39.5 Mc/s and a sound i.f. of 34.0 Mc/s. Using the same types of valves as in the British receiver, the performance is in every way comparable. The fact that the bandwidth is approximately twice as great necessitates one extra valve in order to obtain comparable sensitivity. This band has, in practice, proved to be clear of undesirable spurious signals. At the same time it has been found that triode mixers need special attention if the lowest signal channel is to be received, and they prove to be somewhat unstable on all low-band channels."

The 34.0/39.5 MHz IF combination was somewhat higher than the eventual 33.4/38.9 MHz standard, but at the time the European channel frequencies had not been cast in stone; that happened at the ITU 1952 Stockholm meeting. The 39.5 MHz vision IF was later recycled, as it were, for the British System I. There may have been an historical connection, but there were also independent reasons for its later use.

By inference, Pye used an ECL80 as the frequency changer for the Band I receiver in question. It also made the comment: “The authors note with regret that there appears to be no efficient triode-pentode available on the British market suitable for frequency-conversion service at the frequencies under consideration.”

They may or may not have known about the American triode pentodes when the paper was written, but likely would have by the time it was presented. The first British triode pentode intended specifically for TV frequency changer work seems to have been the Mazda 20C1, in 1952. The more mainstream ECF80/PCF80 and ECF82/PCF82 appeared in 1953, the latter a 6U8 derivative.

The closing comment was: “It appears to be more than likely that valve and circuit designers will continue energetically to seek the most elegant solution to their combined problems. At present the greatest need is for valves giving improved performance in r.f. stage and mixer-oscillator service at frequencies up to 220 Mc/s.”

Such valves had appeared in the US towards the end of 1951, the previously mentioned triode pentodes and also the double triode cascode RF amplifiers, such as the 6BQ7. The two types were complementary. The pentode mixer was much preferred at Band I, but not beneficial and potentially detrimental at Band III. But the low-noise, high-gain cascode RF amplifier enabled the use of the pentode mixer at Band III without penalty. And the triode-pentode combination provided an economic single-valve envelope frequency changer. The valve makers did give Pye (and the rest of the industry) what it wanted to facilitate the use of “high” IFs in TV receivers.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 30/11/2021 12:43 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Another Pye paper that refers to TV receiver IF selection was:

“The Design of Dual-Standard Television Receivers for the French and C.C.I.R. Television Systems”; C.J. Hall; BIRE 1959 July.

It might seem to have been a strange topic for a Pye technical paper, but at the time, the author was assigned as liaison to Grammont, in France.

Also apparently unusual is that the receiver in question covered just systems E and B, whereas many multistandard receivers covered systems C and F as well. The rationale was that it was for use in the French border areas where system B transmissions from Germany, Switzerland and Italy were available, but which were too far south to receive Belgian and Luxembourg transmissions, as shown in this map.

Pye French Dual Standard TV Map

Pye elected to use a common IF strip, with the narrower bandwidth required for system B fitting inside the wider bandwidth for system E. It also elected to include the system E Band I channels, which, with supradyne conversion, determined that the system E VIF be at the low end of the IF channel, whereas the VIF for system B would be at the high end. This was different to, say, typical Belgian four-standard receivers, where the IF channel was VIF-high for all systems, meaning that system E Band I channels were excluded.

Given that an attenuator was required for the system E SIF, at the upper end of the IF channel, it was logical that this should also correspond with the rejector for the system B adjacent channel sound. This established the basic relationship between the system B and E IFs. Within that constraint, the actual carrier frequencies were chosen to minimize harmonic feedback, as follows:

System E VIF 27.35 MHz
System B SIF 31.5 MHz
System B VIF 37.0 MHz
System E SIF 38.5 MHz
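Those four numbers interlock exactly as described. A minimal sketch checking the constraints, assuming the standard vision-sound spacings (11.15 MHz for system E, 5.5 MHz for system B) and the 7 MHz system B channel raster:

```python
# Consistency check of Pye's dual-standard IF set: the two carrier
# spacings must be preserved through conversion, and the system B
# adjacent-channel sound (1.5 MHz above the vision IF for 7 MHz
# channels) should coincide with the system E sound IF trap.
E_VIF, B_SIF, B_VIF, E_SIF = 27.35, 31.5, 37.0, 38.5  # MHz

assert abs((E_SIF - E_VIF) - 11.15) < 1e-9  # system E vision-sound spacing
assert abs((B_VIF - B_SIF) - 5.5) < 1e-9    # system B vision-sound spacing
# 7 MHz channel, sound 5.5 MHz above vision, so the adjacent-channel
# sound appears at B_VIF + (7 - 5.5) = B_VIF + 1.5 in the IF strip:
assert B_VIF + 1.5 == E_SIF                 # shares the rejector with the E SIF
print("all constraints satisfied")
```

The last assertion is the relationship described in the text: one rejector at 38.5 MHz serves both as the system E sound attenuator and the system B adjacent-channel sound trap.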

 

Pye France Dual Std TV IF Channel

 

I am guessing here, but I imagine that Pye would have looked at the standard system E IF of 28.05/39.2 MHz, which would have put the system B numbers at 32.2/37.7 MHz. But presumably that didn’t work from a harmonic interference viewpoint, and the 0.7 MHz downward shift was required to achieve the optimum in this regard. One imagines that the harmonic interference situation may not have been quite as good as with the individual system standard IFs, but that it was satisfactory.

This arrangement put the -6 dB points on the system E IF curve at 27.35 and 37.0 MHz, the latter being on Nyquist slope for system B. Thus there was a Nyquist slope at each end of the IF curve. That meant that the system E vision bandwidth was quite wide, at 9.65 MHz, which would have been at the upper end of French system E receiver practice of the time.

As said, that approach was quite different to that often used for Belgian and some French four-system TV receivers. There, the standard system B 38.9 MHz VIF was used for all four systems, with the system E SIF then falling at 27.75 MHz. When system L arrived, this was accommodated by using the same SIF as system B, namely 33.4 MHz, with the VIF moved up to 39.9 MHz. Doing this kept the system L SIF the same as that for systems C and F, namely 33.4 MHz, thus obviating the need for a 3rd AM sound IF. With this addition, in some cases, system E was moved up by 1 MHz to 28.75/39.9 MHz. How Pye handled the addition of system L to its system E/B receivers is unknown, though.
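The carrier-spacing arithmetic behind those Belgian/French pairings can be checked mechanically. A sketch, taking the vision-sound spacings as 5.5 MHz for B/C/F, 6.5 MHz for L and 11.15 MHz for E:

```python
# Each (VIF, SIF) pair quoted in the post must differ in magnitude by
# that system's vision-sound spacing.
spacings = {"B": 5.5, "C": 5.5, "F": 5.5, "L": 6.5, "E": 11.15}  # MHz

if_sets = [  # (system, VIF, SIF) pairs from the post
    ("B", 38.9, 33.4),    # standard CCIR pair
    ("L", 39.9, 33.4),    # SIF kept common with C/F, VIF moved up
    ("E", 38.9, 27.75),   # system E carried VIF-high in these receivers
    ("E", 39.9, 28.75),   # the later 1 MHz upward move
]
for sys, vif, sif in if_sets:
    assert abs((vif - sif) - spacings[sys]) < 1e-9, sys
print("all pairings consistent")
```

This makes clear why moving the system L VIF up to 39.9 MHz, rather than reusing 38.9 MHz, was the only way to keep the 33.4 MHz SIF common with systems C and F given L's 6.5 MHz spacing.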

Cheers,

Steve

 

 

ReplyQuote
Topic starter Posted : 30/11/2021 3:21 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posted by: @synchrodyne

I am guessing here, but I imagine that Pye would have looked at the standard system E IF of 28.05/39.2 MHz, which would have put the system B numbers at 32.2/37.7 MHz. But presumably that didn’t work from a harmonic interference viewpoint, and the 0.7 MHz downward shift was required to achieve the optimum in this regard. One imagines that the harmonic interference situation may not have been quite as good as with the individual system standard IFs, but that it was satisfactory.

Well, that guess was wrong. In fact the 32.2/37.7 MHz combination was used by Philips for French TV border area receivers that could also receive system B/G/H transmissions. Thus one may reasonably assume that it was not unduly problematical from the harmonic feedback viewpoint. Pye may have placed different weightings on the various factors when working through the trade-offs.

The 32.2/37.7 MHz combination is shown in Pieter Hooijmans’ superb “Philips TV Tuner History”, at: https://www.maximus-randd.com/. The mention of this IF set occurs in Part 2: https://www.maximus-randd.com/tv-tuner-history-pt2.html, specifically in the chapter: “IF and multi-standard in combination with UHF”, https://www.maximus-randd.com/tv-tuner-history-pt2.html#ifuhf.

 

Table from Philips TV Tuner History

 

That table refers to the first half of the 1960s. But it does seem possible that the 32.2/37.7 MHz combination had been used previously, as French border area receivers might have been required in the later 1950s. I understand that cross-border transmissions were available in Strasbourg quite early on.

The other IF combinations shown in the table have cropped up earlier in this series. Worthy of further comment is the use of 33.4/39.9 MHz for system D/K in Finnish and Austrian dual-standard receivers. It somewhat parallels the Belgian multistandard case in which 33.4/39.9 MHz was used for system L. That choice ensured that systems C, F and L shared a common SIF, and so the same AM sound IF strip. The different VIF for system L would also more easily have allowed the use of the shallower Nyquist slope for this system as compared with B/C/F. But for systems B/G/H and D/K, a common sound IF would not appear to have been so advantageous. Very probably intercarrier sound was used, meaning that there was a dual-frequency sound IF strip, 5.5 and 6.5 MHz. On the other hand, B/G and D/K shared the same Nyquist slope, so would have benefitted from a common VIF. System H had the shallower Nyquist slope, but I suspect that it was mostly treated as if it were the same as G. (In the mid-1970s, by which time quasi-synchronous demodulation had become reasonably widespread, the benefits of the wider vestigial sideband and the reduced Nyquist slope may have been lessened.)

But the 32.4/38.9 MHz combination for system D/K was less satisfactory from a harmonic interference viewpoint than 33.4/39.9 MHz. The second harmonic of 38.9 MHz was very close to the channel R3 vision carrier. As discussed upthread, it is assumed that the system D/K “UHF era” IF of 38.0/31.5 MHz was set as being the closest workable approach to the 38.9 MHz VIF, at least from underneath. I haven’t run the numbers, but I expect that 39.9 MHz was more favourable.
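Running those numbers is straightforward. A sketch, again assuming the OIRT vision carriers quoted earlier in the thread:

```python
# Distance of the low-order VIF and SIF harmonics of the two candidate
# D/K IF pairs from the nearest OIRT vision carrier.
vision_carriers = [49.75, 59.25, 77.25, 85.25, 93.25,    # R1-R5
                   175.25, 183.25, 191.25, 199.25,
                   207.25, 215.25, 223.25]               # R6-R12

def harmonic_offsets(f, orders=(2, 3, 5)):
    """(order, harmonic, offset to nearest vision carrier) per order."""
    result = []
    for n in orders:
        h = n * f
        if h <= 230:  # ignore anything above Band III
            result.append((n, h, min(abs(h - vc) for vc in vision_carriers)))
    return result

for vif, sif in [(38.9, 32.4), (39.9, 33.4)]:
    for f in (vif, sif):
        for n, h, off in harmonic_offsets(f):
            print(f"{vif}/{sif}: {f} x{n} = {h:6.1f} MHz, "
                  f"{off:.2f} MHz from nearest vision carrier")
# The 2nd harmonic of 38.9 MHz clears the R3 carrier by only 0.55 MHz,
# against 2.55 MHz for 39.9 MHz; on the other hand 5 x 39.9 = 199.5 MHz
# falls close to the R9 vision carrier, so the comparison is not wholly
# one-sided once the weaker higher-order harmonics are included.
```

On the dominant 2nd harmonic, at least, 39.9 MHz does indeed come out the more favourable of the two.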

 

Cheers,

Steve

ReplyQuote
Topic starter Posted : 09/12/2021 3:36 am
Nuvistor liked
Sundog
(@sundog)
Trusted V-Ratter Registered

Hi Steve,

Thank you for detailing this history of Intermediate Frequencies. I think few of us have fully considered the relevance of the choices of IFs around the world.

Some of the links you have referred to have also given me a window onto other technologically interesting things, such as the 'enneode' 9-electrode EQ40/80 valve, via the Maximus link.

I have to ask, will you eventually collate your findings into a table? If you have no appetite for such, maybe one of your avid readers can.

Regards,

John

ReplyQuote
Posted : 09/12/2021 7:59 pm
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Thanks for the feedback, John.

Yes, I plan to compile a tabulation. In fact I have had several attempts at it so far, none with a satisfactory outcome. My thinking now is to have three tables. One would be a simple list in ascending VIF order, each entry with a comment to indicate with which system(s) it was used, where it was used if it was “geographical”, approximately over what period it was used, and whether or not it was an actual standard number.

The second list would be ordered by system, using the CCIR letter designations, and would show standard and other IFs used for each system.

The third would be a list of standard IFs and their direct and indirect derivatives.

I’ll probably need to be selective about the inclusion of pre-standardization IFs, with just a representative sampling.

The enneode was certainly an interesting case. In fact there was a whole raft of quadrature-type, valve-based FM demodulators of which the enneode was but one. There is a brief partial summary here: https://www.vintage-radio.net/forum/showpost.php?p=1120946&postcount=15. The major use for valve-type quadrature demodulators appears to have been in intercarrier-type TV receivers, where it was not necessary to derive an AFC or tuning indicator bias, and where the poorer linearity, as compared with the Foster-Seeley discriminator, was not seen as an issue. So there was a connection to TV IFs.

The fact that the EQ40 quickly morphed into the EQ80 reflected Philips’ dilemma after having developed the Rimlock valve series as best addressing post-WWII requirements. That development vector was cut short by the surprise arrival of the noval type, which happened to be a better fit for the kinds of TV valves that were then (c.1949) seen to be required, including the triode pentode, the latter becoming highly desirable as an economic frequency changer with the upward movement of IFs. So one may impute a connection between TV IFs and Philips’ choice of the noval base for its definitive TV valve range, although in that case there were other factors at work.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 10/12/2021 2:40 am
Nuvistor liked
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Also apparent from the previously shown table from Pieter Hooijmans’ opus is that the 32.2/37.7 MHz IF combination was also used for French border area receivers that were equipped to receive Belgian system C and F, and Luxembourg system F, transmissions. This was rational. Given that such receivers needed to cover all French system E channels, it was logical to retain the standard 28.05/39.2 MHz IF for those. And coverage of both the Band I and Band III channels for systems C/F required a VIF-high IF channel. So there was no real possibility of having an SIF common with that of system E, at least without the IF channel, and in particular the VIF, extending well into Band I.

On the other hand, French border area receivers that were required to receive only the Luxembourg system F transmission on channel E7 could be simpler, with a common SIF of 39.2 MHz, giving a system F VIF of 33.7 MHz. With the Band III channels, either infradyne (as in this case) or supradyne conversion could be used.
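The arithmetic for that simpler case is a one-liner. A sketch of it:

```python
# With a single 39.2 MHz sound IF shared between systems E and F, the
# system F vision IF follows directly from F's vision-sound spacing.
common_SIF = 39.2               # MHz, the standard system E SIF, reused for F
F_spacing = 5.5                 # MHz, system F vision-sound spacing
F_VIF = common_SIF - F_spacing  # VIF below SIF: the VIF-low IF channel
print(f"{F_VIF:.1f}")           # prints 33.7
```

With the VIF below the SIF there is no spectrum inversion for system F (sound above vision at RF), which is consistent with the infradyne conversion mentioned above for the Band III channels.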

That the Belgian multistandard case was treated differently reflected the fact that there, the system E channels of interest were all in Band III, so that the IF channel could be VIF-high or VIF-low as required, the former being chosen for commonality with system B/C/F practice. The French receivers were restricted to a VIF-low IF channel in order to allow supradyne conversion of the System E Band I channels.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 10/12/2021 2:58 am
Nuvistor liked
Pieter H
(@pieter-h)
New V-Ratter Registered

Hi Steve, all,

thanks for your comprehensive analysis of TV Intermediate Frequencies over time. It has been very helpful also when writing my Tuner History.

At the same time I would like to stress one general issue related to the choice of IF: in contrast to the RF parameters of TV standards and channel allocation, which were highly regulated, IF was not formally standardized! Yes, there were efforts, as you have shown from multiple documents, to compile overviews that hinted at standardization, but this was not formally mandatory. As you have observed, values could either not be provided by countries, or change arbitrarily over time. In principle the choice of IF is something for the set maker, since IF signals do not come outside the TV, and therefore do not require standardization. Despite that, there was a clear convergence on quasi-standard values, often related to the received TV standard(s). Why was that? Let me try to give some observations based on digging up the tuner history:

  • I think a first driver was convenience. The analysis of cross-modulation is complex, especially with ever increasing numbers of channels. So when TV makers in general, often in co-operation with national telecom authorities, did these analyses and shared their results, a kind of natural convergence occurred on the "optimal" values. The 38,9 VIF for B/G and 34,65 VIF for A (in the UK) are examples, but it was done for all standards. In the UK BREMA defined this 34,65 VIF as the "strongly recommended" standard value, but it took 5 years for TV makers to converge on it. I have described the (slow) convergence of IF in UK and French sets here: https://www.maximus-randd.com/tv-tuner-history-pt1.html#tunerbasics
  • Once these de facto standard IFs were established, it almost certainly became a matter of convenience to stick to them. Why make your life as tuner and TV set maker more difficult than necessary by changing the IF for every TV chassis (assuming that would have made sense)? For standard component selection and especially production testing it was much more convenient to use standard values.
  • Another argument was - increasingly - second sourcing. TV set makers simply demanded that tuner outputs adhered to the standard values, such that they could source tuners, SAW filters and IF demodulators from multiple suppliers. This would have been much more difficult in case each set maker used different IF settings.
  • One development that further reinforced the use of the standard values, at least in Europe, was the Amtsblatt set of standards. These expected (read: prescribed) the use of the standard values and then specified maximum breakthrough performance at N +/- IF/2 and N +/- IF, with standard measurement set-ups used at the approbation centres. Although Amtsblatt theoretically still allowed non-standard IF values, in practice life became difficult if a set maker used them: https://www.maximus-randd.com/tv-tuner-history-pt4.html#emc
  • The next step that further reinforced (at least temporarily) IF standardization was the introduction of SAW filters in the early 1980s (https://www.maximus-randd.com/tv-tuner-history-pt4.html#saw). First-generation SAW filters clearly required fixed IF values, as did the new quasi-synchronous IF demodulation ICs. This situation was to remain stable for the next 30+ years, and enabled the unprecedented component price reductions over those years.
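
To make the Amtsblatt measurement points concrete, a trivial sketch (using the B/G value of 38,9 MHz and the vision carrier of European UHF channel 21 purely as an example; the helper name is mine):

```python
def amtsblatt_points(f_channel, f_if=38.9):
    """Breakthrough test frequencies N +/- IF/2 and N +/- IF around a tuned
    channel frequency, as in the Amtsblatt-style measurements described above."""
    return sorted(f_channel + k * f_if for k in (-1.0, -0.5, 0.5, 1.0))

# European UHF channel 21 vision carrier (471.25 MHz) as an illustration:
print([round(f, 2) for f in amtsblatt_points(471.25)])
# [432.35, 451.8, 490.7, 510.15]
```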

As said, this remained the standard situation for a few decades. This is essentially the great analysis and story you have presented. Nevertheless, there were several trends that went against these "fixed" IF values:

  • The fact that specific intermediate frequencies were not a formal prerequisite is illustrated by the choices made in the so-called front ends: tuner plus IF demodulation in one can. In this architecture the IF signal never left the can; the output was demodulated video and/or intercarrier sound. Philips therefore had complete freedom in choosing the IF. In front ends like the FQ800 and FQ900, all standards B/G/D/K/K'/M/L/L' use the same 38,9 MHz IF, with the 2nd sound IF becoming the standard-dependent variable. Because these front ends allowed only a limited set of SAW filters, there were always primary standards, with full Amtsblatt compliance, and secondary standards where certain compromises were accepted (like M in Europe).
  • In the next step, silicon tuner integration, the concept of a classical IF was dropped completely, and the tuner IC provided down-conversion in one or two steps to a near-zero IF. These choices were purely based on internal architectural optimization; there was no formal requirement whatsoever driving them.
  • A last step in the classical tuners was the HD1800 hybrid analogue-digital tuner, which used a single 8 MHz SAW centred at 36,15 MHz. For B/G/D/K/I/L the VIF moved to 39,9 MHz, for M/N to 38,0 MHz. All VIF and SIF variations were then handled in the digital IF IC. This essentially put an end to the classical concept of standard-related, more-or-less fixed IF values.
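
The HD1800 figures above can be sanity-checked in a few lines. The sound offsets are the standard vision-to-sound spacings (system I is nominally 5,9996 MHz, taken as 6,0 here), and the 8 MHz passband is assumed symmetric about 36,15 MHz purely for illustration:

```python
# Single SAW centred at 36.15 MHz, 8 MHz wide -> passband 32.15..40.15 MHz.
LO_EDGE, HI_EDGE = 36.15 - 4.0, 36.15 + 4.0

# Vision IF per the figures above; vision-to-sound carrier spacing per standard.
cases = {
    "B/G": (39.9, 5.5),
    "I":   (39.9, 6.0),
    "L":   (39.9, 6.5),
    "M":   (38.0, 4.5),
}

for std, (vif, spacing) in cases.items():
    sif = vif - spacing  # intercarrier sound lands below vision in the IF channel
    inside = LO_EDGE <= sif <= vif <= HI_EDGE
    print(f"{std}: VIF {vif} MHz, SIF {sif:.2f} MHz, inside passband: {inside}")
```

Both carriers of every listed standard do indeed fall inside the single 32,15..40,15 MHz passband, which is what made the one-SAW architecture possible.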

Hope this helps to put developments into perspective.

Cheers, Pieter

Posted : 28/12/2021 10:57 pm
Sundog
(@sundog)
Trusted V-Ratter Registered

I was writing a reply but have edited it as I don't have time to complete it. Tomorrow!

Posted : 29/12/2021 10:13 pm
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered

Hi Pieter:

Many thanks for that. It was very timely, as in working on the compilation of the tables, I had realized that it was probably desirable to put in some comment on the nature of IF standards, and who actually promulgated them. As you say, IF standards were “soft” in nature, as compared with the “hard” standards associated with broadcast formats, etc. They generally fell into the advisory or recommended practice categories, but as far as I know were never mandatory.

In some cases they were developed by industry/trade organizations, such as RTMA (and its successors) in the USA, BREMA in the UK, and SCART in France. National standards bodies were also involved, such as Standards NZ. I haven’t found confirmation, but in South Africa, likely SABS was involved. In the USA, ANSI became involved at a later stage. I imagine that in Russia, GOST would have been active in this field. In Italy and Australia, it was done by Government agencies. I haven’t found any cases where TV IFs were developed by technical societies or supra-national bodies (e.g. IEC, ISO) though.

There were also interactions between the various bodies. In the USA, the FCC did not recommend IFs, but it prevailed upon the industry to settle upon a new “high” IF which it could then use as the basis for UHF channel geographical allocations, important given that at UHF, images were mostly in-band. Something similar happened with the ITU in respect of its ST61 European UHF planning meeting. The submissions from the various countries included recommended IFs, which were then used for channel allocation purposes. The EBU was involved with IF surveys at least in the 1950s. This seemed to have been more in the nature of information gathering and dissemination, but its outputs may well have been used by those who did develop IF recommendations.
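
To illustrate the in-band image problem at UHF, here is a small sketch with the US figures (45.75 MHz VIF, 6 MHz channels, UHF starting at channel 14 at 470 MHz). The image of a UHF channel's vision carrier lands about fifteen channels higher, which is why the geographical allocations had to be built around the receiver IF:

```python
def uhf_image(channel, vif=45.75, ch_width=6.0, base_ch=14, base_freq=470.0):
    """For a US UHF channel, find where the vision-carrier image (2*VIF above,
    assuming a supradyne local oscillator) lands."""
    f_vision = base_freq + (channel - base_ch) * ch_width + 1.25  # vision carrier
    f_image = f_vision + 2 * vif
    image_ch = base_ch + int((f_image - base_freq) // ch_width)
    return f_image, image_ch

f_img, ch = uhf_image(14)
print(f"image of channel 14 vision lands at {f_img:.2f} MHz, i.e. in channel {ch}")
# image of channel 14 vision lands at 562.75 MHz, i.e. in channel 29
```

With the old "low" IF of circa 25.75 MHz the image would have fallen only some eight or nine channels away, with correspondingly tighter allocation constraints; hence the FCC's interest in the industry settling on the new "high" IF.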

I did discover a major gap, in that I have not identified any European body (or bodies) which promulgated the 38.9/33.4 MHz IF as a recommended practice for system B, etc., receivers. It is sometimes referred to as the CCIR standard IF, but I suspect that is not because the CCIR developed it, but simply because it was associated with system B, which initially was often referred to as the CCIR TV standard. It seems more likely that it was included in recommended practices from national trade organizations and/or standards-issuing bodies. E.g. was it in a DIN standard? It was, though, included in a New Zealand standard for TV receivers.

Clearly, receiver makers always had the freedom to depart from recommended practices if they so chose. It would seem that the use of “non-standard” IFs became easier as technology developed, particularly once ICs came into widespread use. It was in any event easier to screen solid-state circuits, and keeping potentially harmful signals on-chip also helped. In its 1952 IEE paper mentioned upthread, Pye made the comment: “Although an intermediate frequency capable of producing beat patterns can be used with the necessary screening and filtering, it has been found that these precautions are far more elaborate than normally required for maintaining stability at full gain.” By the late 1970s, with the widespread use of SAW filters and of IF ICs with quasi-synchronous demodulation (thus avoiding the diode demodulator’s prolific harmonic production), that statement, if it applied at all, did so only very weakly.

I think this is illustrated by a couple of second sound conversion cases. Some early Belgian four-system receivers had a split sound system with a second conversion to a lower frequency better suited to FM demodulation. That 2nd sound IF had to be chosen carefully to avoid harmonic feedback interference, 7.0 and 11.8 MHz being the numbers used. Even then, a buffer stage between the sound take-off point and the second conversion frequency changer was usually required for full protection. In contrast, the Japanese were early with stereo TV sound, and quickly appreciating that something better than the intercarrier technique was needed for best quality reception, reverted to split sound for higher quality applications. Here they were able to use the standard 10.7 MHz FM IF (and so standard FM receiver components) as the TV second sound IF, something that would have been problematical in the valve era, but was easy in the IC era. Another example is the late use of the 32.7/43.85 MHz combination for system E in French receivers, to allow a common VIF with system L, and so a common SAWF. The 43.85 MHz SIF fell in Band I, and its use would have been a non-starter back in the valve era.
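
The second sound IF selection problem can be expressed as a simple harmonic-beat check. A sketch only: the 38.9/33.4 MHz carriers and the roughly 5 MHz "visible beat" limit are illustrative assumptions, and in reality harmonic order, levels and layout all matter:

```python
def risky_harmonics(f_sif2, if_carriers=(38.9, 33.4), video_bw=5.0, max_k=6):
    """List harmonics of a candidate 2nd sound IF whose beat against a main IF
    carrier falls inside the video band and so could paint a visible pattern
    unless the offending stages are well screened or buffered.
    Returns (harmonic order, IF carrier, beat in MHz) tuples."""
    hits = []
    for k in range(2, max_k + 1):
        for f_c in if_carriers:
            beat = abs(k * f_sif2 - f_c)
            if beat < video_bw:
                hits.append((k, f_c, round(beat, 2)))
    return hits

for f in (7.0, 10.7, 11.8):
    print(f, "MHz:", risky_harmonics(f))
```

Run over 7.0, 10.7 and 11.8 MHz, it shows that only the fifth and sixth harmonics of 7.0 MHz land near these carriers, whereas the third harmonic of 10.7 MHz falls within about 1.3 MHz of the 33.4 MHz sound IF, consistent with 10.7 MHz having been troublesome in the valve era but acceptable once everything was on-chip.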

Cheers,

Steve

Topic starter Posted : 29/12/2021 11:44 pm