
[Sticky] Television Receiver Intermediate Frequencies

Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421

I have recently read an interesting article in Wireless World 1949 July, namely “Television Station Selection – A Look to the Future”, by W.T. Cocking.

A significant part of this article addressed intermediate frequency selection for superheterodyne receivers, including the question of whether the oscillator should be above or below the signal frequency. Particularly interesting was that the author recommended the use of a “high” IF, that is one just under the lowest channel frequencies, even though he was considering only the case of five-channel Band I coverage.

Amongst the conclusions was that the television receiver of the future would be a “Superheterodyne with 34-Mc/s intermediate frequency and local-oscillator frequency above the signal-frequency.”

The complete article is available here: http://www.americanradiohistory.com/Arc ... 949-07.pdf, p.242ff.

It looks as if it were original work, not for example influenced by concurrent work in the US towards adopting a “high” TV IF. And it well predicted the future of British TV IF practice.

Cheers,

Steve

Topic starter Posted : 03/04/2016 12:46 am

The BBC’s own TV receiving equipment shows a range of IFs, some standard, some non-standard, at least judging by the information available on the web.

Early UHF rebroadcast receivers, such as the RC5M/501 (c.1969) and RC5M/502 (c.1972) had non-standard IFs of 37.5 MHz vision and 31.5 MHz sound. Quite why is not readily deduced.

The later RC5M/503 (c.1985) rebroadcast receiver was also non-standard, but different at 40.75 MHz vision, 34.75 MHz sound.

Those numbers also applied to the RC1/511 grade 2 receiver (c.1982). There is a block schematic of this in Wireless World 1984 July (p.39), with the annotation “40.75 MHz (to suit synth)”.

So, 40.75 MHz was chosen to suit what was probably an early and relatively simple synthesizer design. Working through the numbers, that implies a local oscillator frequency of 512 MHz for channel E21, vision carrier 471.25 MHz. That LO frequency is a multiple of 8, and given the 8 MHz channel spacing, the multiple-of-8 for local oscillator frequency would have applied across the whole of Bands IV and V. Conceivably that was a desideratum for simple synthesizer design. And 40.75 MHz was the number nearest to the standard 39.5 MHz for which it was achieved.
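As a cross-check on that arithmetic, here is a minimal Python sketch. The channel numbering and the 1.25 MHz vision-carrier offset are standard UHF channelling assumptions on my part, not anything stated in the BBC documents:

```python
# Sketch of the LO arithmetic described above, assuming standard European
# UHF channelling: channel E21 lower edge at 470 MHz, 8 MHz spacing,
# vision carrier 1.25 MHz above the lower edge, oscillator on the high side.

VISION_IF = 40.75  # MHz, the RC1/511 value annotated "to suit synth"

def vision_carrier(channel):
    """Vision carrier frequency (MHz) for UHF channels E21..E68."""
    return 470.0 + 8.0 * (channel - 21) + 1.25

def local_oscillator(channel, vision_if=VISION_IF):
    """High-side local oscillator frequency: signal plus IF."""
    return vision_carrier(channel) + vision_if

# Channel E21: LO = 471.25 + 40.75 = 512 MHz, and with 8 MHz channel
# spacing every LO across Bands IV and V is then a multiple of 8:
for ch in range(21, 69):
    assert local_oscillator(ch) % 8 == 0

print(local_oscillator(21))  # 512.0
```

The multiple-of-8 property falls straight out of the 42.0 MHz total offset (1.25 + 40.75) from each channel edge, which supports the simple-synthesizer reading.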

The RC5M/503 was also synthesized, so the same considerations would have applied.

On the other hand, the UN1/604 (c.1969) and UN1/642 (c.1971) UHF cueing and general-purpose receivers had standard IFs, 39.5 MHz vision and 33.5 MHz sound.

The UN1/585 (c.1969) VHF 405-line cueing receiver had standard IFs, 34.65 MHz vision and 38.15 MHz sound. And the same was true of the experimental 405-line NTSC colour receiver described in BBC Research Department Report T060-7 of 1958.

But the TV/REC/3A (c.1959) 405-line rebroadcast receiver had a non-standard vision IF of 13.2 MHz, quite low even for a Band I-only unit. With oscillator low, the sound IF would have been 9.7 MHz.

The reason for that might be lurking in BBC Monograph #42, “Apparatus for Television and Sound Relay Stations”. In a brief description of the off-air TV receiver, it was said:

“This has been specially designed to have good reliability for unattended working and to minimize the amount of distortion which inevitably arises in the detection process.

“The i.f. stages are arranged to work at a lower frequency than that commonly employed so that delay distortion correction may be more easily introduced.”

Whilst the Monograph did not identify the specific receiver being referred to, it seems likely that it was the TV/REC/3A. Also described was a Band I TV translator of the double-conversion type, but its IF was not given.

Cheers,

Steve

Topic starter Posted : 27/04/2016 7:32 am

Something I have pondered is the earlier Russian TV IF of 34.25 MHz vision, 27.75 MHz sound. This falls between the “low” and the “high” groups, the former usually towards the lower end of the 20 to 30 MHz range, and the latter usually nudging the lower end of Band I/Low Band. So the obvious question is why was it done that unusual way?

Russia had TV channels in Band II as well as in Band III, and in the earlier days, it would appear that only the Band I and Band II channels were used. At least that is indicated in a Wireless World 1956 August article about Russian broadcasting. Even so, it seems unlikely that any standard IF established in the 1950s would not have taken account of future Band III activities.

It is not difficult to imagine that the standard IF was chosen so that its 2nd harmonic fell between the highest Band I and lowest Band II channels, and that its 3rd harmonic fell above the highest Band II channel. Thus the 2nd harmonic would need to be between 66 and 76 MHz, and the 3rd harmonic above 100 MHz. The first condition required an IF in the range 33 to 38 MHz, the second condition an IF above 33.33 MHz. As this range allowed the possibility that the 5th harmonic could fall below the lower edge of Band III, namely 174 MHz, that could have been an added condition which would then have limited the IF range to 33.33 to 34.8 MHz. Then perhaps within that range 34.25 MHz was the “best fit”.
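That constraint argument is easy to check numerically. A small sketch, with the band edges as given above:

```python
# Find vision IFs whose 2nd harmonic falls in the Band I/Band II gap
# (66-76 MHz), whose 3rd harmonic clears the top of Band II (100 MHz),
# and whose 5th harmonic stays below Band III (174 MHz).  MHz throughout.

def acceptable(f_if):
    return (66.0 < 2 * f_if < 76.0 and   # 2nd harmonic in the I/II gap
            3 * f_if > 100.0 and         # 3rd harmonic above Band II
            5 * f_if < 174.0)            # 5th harmonic below Band III

# Scan 30-38 MHz in 0.05 MHz steps; the surviving window is about
# 33.35 to 34.75 MHz, which brackets the actual choice of 34.25 MHz.
candidates = [round(f / 100, 2) for f in range(3000, 3800, 5)
              if acceptable(f / 100)]
print(candidates[0], candidates[-1])   # 33.35 34.75
assert acceptable(34.25)
```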

That gave a channel 1 local oscillator frequency of 83.5 MHz, perhaps far enough clear of channel 3 sound (83.75 MHz) not to be a problem. But the channel 2 oscillator frequency was 93.5 MHz, a potential problem for channel 5 vision at 93.25 MHz. Perhaps channels 2 and 5 were not used in the same area. The oscillator frequencies for the three lower Band III channels, 6, 7 and 8, respectively fell into the three highest Band III channels, 10, 11 and 12, but that must have been viewed as being manageable, perhaps as an (n+4) taboo.

The 1st and 2nd conditions mentioned above were also met by the later 38.0 MHz vision IF, which is perhaps why it was chosen rather than say the European 38.9 MHz number. In this case the 5th harmonic fell at 190 MHz, which happened to be the boundary between channels 7 and 8, so it was harmless. The Band III taboo remained at (n+4), but the interference beat moved from mid-video band (2.25 MHz) right to the edge of the video band (6 MHz), where presumably it was less of a nuisance.
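The change in beat placement can be sketched the same way. Here I assume OIRT Band III channelling with the vision carrier 1.25 MHz above the lower channel edge:

```python
# The (n+4) taboo: with a high-side oscillator, the LO for Band III
# channel n lands in channel n+4, and the beat frequency is its offset
# from that channel's vision carrier.  OIRT channel 6 = 174-182 MHz,
# 8 MHz spacing, vision carrier assumed 1.25 MHz above the lower edge.

def vision(ch):
    """Vision carrier (MHz) for OIRT Band III channels 6..12."""
    return 174.0 + 8.0 * (ch - 6) + 1.25

def beat_in_n_plus_4(vision_if):
    lo = vision(6) + vision_if        # high-side LO for channel 6
    return abs(lo - vision(10))       # offset from channel 10 vision carrier

print(beat_in_n_plus_4(34.25))  # 2.25 -> mid video band
print(beat_in_n_plus_4(38.0))   # 6.0  -> edge of the video band
```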

Of course that is all post facto rationalization, and may or may not be correct. But at least it is possible to impute a rational explanation for the 34.25 (and 38.0) MHz numbers.

Cheers,

Steve

Topic starter Posted : 11/06/2016 11:52 am

I also had another look at the Australian original standard TV IF (36.0 MHz vision, 31.5 MHz sound) on a similar basis, given that Australia also used some Band II channels in the earlier years.

The CCIR standard IF of 38.9 MHz would not have created any harmonic problems. But it would have resulted in some local oscillator problems. The channel 1 (56-63 MHz) and 2 (63-70 MHz) oscillator frequencies respectively would have fallen within channels 4 (94-101 MHz) and 5 (101-108 MHz). The 36.0 MHz IF avoided this, with channel 1 oscillator falling in the gap between channels 3 and 4, and channel 2 oscillator at 100.25 MHz, 0.5 MHz below channel 4 sound, where it probably did not do too much harm. The IF 5th harmonic was at 180 MHz, and so at the upper end of the channel 6 (174-181 MHz) vision sideband, with a beat frequency of 4.75 MHz.

One consequence though was that the channel 5 oscillator was at 138.25 MHz, exactly the vision carrier frequency of channel 5A (137-144 MHz). As far as I know, channel 5A, like channel 0 (46-52 MHz), was a later addition, so would not have been included in the initial calculations that arrived at 36.0 MHz. I imagine that channels 5 and 5A would thus not have been assigned in the same area.
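A small sketch of the comparison, using the channel edges quoted above; the 1.25 MHz vision-carrier offset is my assumption:

```python
# Compare the CCIR 38.9 MHz IF with the Australian 36.0 MHz choice,
# using the original Australian VHF channel edges (MHz) quoted above.
# Vision carrier assumed 1.25 MHz above the lower channel edge.

CHANNELS = {1: (56, 63), 2: (63, 70), 3: (85, 92), 4: (94, 101),
            5: (101, 108), '5A': (137, 144)}

def vision(ch):
    return CHANNELS[ch][0] + 1.25

def lo(ch, vision_if):
    return vision(ch) + vision_if     # oscillator on the high side

def falls_in(freq):
    """Name of the channel whose band contains freq, else None."""
    for ch, (lo_edge, hi_edge) in CHANNELS.items():
        if lo_edge <= freq <= hi_edge:
            return ch
    return None

# With 38.9 MHz, the channel 1 and 2 oscillators land inside 4 and 5:
print(falls_in(lo(1, 38.9)), falls_in(lo(2, 38.9)))   # 4 5
# With 36.0 MHz, channel 1's LO lands in the 92-94 MHz gap, and
# channel 5's LO sits right on the channel 5A vision carrier:
print(falls_in(lo(1, 36.0)))                          # None
print(lo(5, 36.0), vision('5A'))                      # 138.25 138.25
```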

As with the Russian case, this is an uncorroborated but plausible explanation for the chosen IF and why it differed from the CCIR standard.

The later move to 36.875 MHz – for reasons not yet uncovered – would not appear to have caused too much upset although the IF 5th harmonic was then less favourably placed, with a 2.125 MHz channel 7 (181-188 MHz) beat. But that happened in the solid-state era when better receiver interstage screening was possible, so presumably was not a major issue.

Cheers,

Steve

Topic starter Posted : 12/06/2016 2:20 am

I recently happened upon another piece of information about the development of American TV IFs.

This was in an article in ‘Electronics’ for 1950 November entitled “Production Experience with a 40-mc I-F Receiver”. The background note about the author, D.W. Pugsley, a GE staffer, stated:

“In 1945 the author, as a member of the RMA Television Receiver Committee, was appointed Chairman of the Subcommittee of Standards of Good Engineering Practice. This Subcommittee adopted an i-f of 21.25 to 21.9 mc for the sound i-f and 26.75 to 27.4 mc for the picture i-f.

“Practical experience with the intermediate frequencies has been relatively good except for the problems engendered by oscillator radiation. In addition there have been several minor difficulties such as double conversion effects, direct i-f interference from amateurs, industrial equipment, international short-wave stations, and f-m station image interference.

“The problem of oscillator radiation became so severe that in April 1948 the Television Receiver Committee met and rescinded its recommended standard i-f.

“The author became chairman of a special task force which restudied the problem. Foremost among those involved in this study were John D. Reid and P. G. Holst. One year later, the Task Force recommended an i-f of 41.25 mc for sound and 45.75 mc for the picture. This was subsequently adopted as a standard by RMA.”

That confirms that the original American “low” IF encompassed a narrow range of numbers, and that these were of RMA origin. Effectively the IF “channel” was defined as lying anywhere within the range bounded by (21.0 to 27.0 MHz) and (21.65 to 27.65 MHz). It also shows that the “high” IF dated from 1949.

The article is available at: http://www.americanradiohistory.com/Arc ... 950-11.pdf, page 98ff.

Cheers,

Steve

Topic starter Posted : 13/06/2016 12:17 am

Looking again at some unusual cases, TV-FM receivers have sometimes had apparently non-standard IFs on the sound side. For the purposes of this thread, we can look at the cases where the TV sound IF differed from standard. Those where the FM IF differed from standard but the TV sound IF did not are covered in the corresponding radio receiver IF thread.

In the UK, Murphy in 1958 adopted double-conversion on the sound side for its TV-FM receivers. Both TV sound and FM, emerging from the tuner at the System A standard TV sound IF of 38.15 MHz, were downconverted to 6.31 MHz using oscillator-low conversion. At this lower frequency it was easier to provide adequate FM IF selectivity than at 38.15 MHz. This approach was described briefly in Wireless World 1958 October, page 477. Apparently it was not possible to use 10.7 MHz as the 2nd IF because of patterning problems. Assuming that amongst others, IF harmonics up to the 5th were problematical, then the 5th harmonic of 10.7 MHz, namely 53.5 MHz, would interfere with channel B2, and the 4th might interfere with B1. On the other hand, with a 6.31 MHz IF, these harmonics fell below Band I.
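The harmonic placements Murphy presumably worried about can be sketched quickly. The Band I limits below are my rough figures, not anything from the Wireless World piece; the point is only where the low harmonics land:

```python
# Which harmonics of a candidate second IF fall inside UK Band I?
# Band I limits are approximate (roughly 41.5-68 MHz, covering System A
# channels B1-B5); only harmonic placement matters for this sketch.

BAND_I = (41.5, 68.0)   # approximate UK Band I limits, MHz

def harmonics_in_band(f_if, band=BAND_I, up_to=5):
    lo, hi = band
    return [round(n * f_if, 2) for n in range(2, up_to + 1)
            if lo <= n * f_if <= hi]

print(harmonics_in_band(10.7))   # [42.8, 53.5] -> 4th and 5th inside Band I
print(harmonics_in_band(6.31))   # [] -> all harmonics clear of Band I
```

The 4th harmonic of 10.7 MHz (42.8 MHz) sits near channel B1 and the 5th (53.5 MHz) inside channel B2, matching the patterning argument above, whereas even the 5th harmonic of 6.31 MHz (31.55 MHz) falls well below Band I.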

Nevertheless, the 10.7 MHz number was sometimes used for TV sound, as for example in the Jason JTV, JTV2 and Monitor/Mercury II FM-TV hi-fi sound tuners. I have read somewhere that when these were used on a common aerial system with a TV receiver, separated only by a resistive splitter, patterning was possible. I suspect that that could well have been the interference mechanism that Murphy avoided with its 6.31 MHz 2nd IF. Whether a hybrid splitter would have reduced the interference I don’t know. The Jason JTV2 and Monitor/Mercury II, with but a simple grounded-grid RF amplifier, I imagine would have had greater IF and IF harmonic leakage back through the aerial connection than the original JTV, which I think had a cascode RF amplifier.

But knowingly or otherwise, Jason was following an American precedent, in that some 1950s FM-TV sound hi-fi tuners used the 10.7 MHz IF for both FM and TV sound. An example was the Knight SX702 described in Radio-Electronics for 1956 May, page 79ff. For TV sound, this used a Standard Coil TV front end modified to provide a 10.7 MHz output.

Much later, circa 1977, the Pioneer TVX-9500 TV sound tuner, for the American market, was of the double-conversion type and employed a 10.7 MHz 2nd IF. The front end included conventional TV VHF and UHF tuners providing a standard sound IF output of 41.25 MHz, which was then downconverted to 10.7 MHz for feeding into conventional, IC-based FM IF and demodulation circuitry. AFC was provided for the 2nd conversion oscillator, which was on the high side, at 51.95 MHz. I think that it is reasonable to assume that with solid-state circuitry, it was much easier to provide interstage screening than it was with valve circuitry, where heat dissipation was a bigger issue. So 10.7 MHz and harmonics thereof at the aerial connection were probably well suppressed.

There were at least two British TV sound tuners dating from circa 1971, both solid state. The Motion Electronics model was available with VHF and/or UHF tuners, covered AM and FM TV sound, and was said to have an IF of around 35 MHz. The Lowther model was UHF and FM only, with unknown IF, but I’d imagine fairly high.

The “Component TV” era arrived circa 1981. Some of the early TV tuners for the American market had split sound, for better quality than could be obtained with the regular intercarrier system, with its inherent faults. Included in this group were models from Luxman and Sony. Both used a 2nd conversion, from 41.25 MHz to 10.7 MHz, for the sound channel. In these units, stopping 10.7 MHz and its harmonics from getting into the vision IF channel, as well as keeping them away from the aerial terminals, would have been required, but it was evidently quite doable. I don’t know for sure, but I suspect that this kind of split sound approach was developed for the Japanese domestic market when FM-FM TV stereo and dual-channel sound arrived in 1978, and it was found that the intercarrier system was seriously wanting. But I think that the Japanese OEMs did later move to the European-style quasi-split sound for their TV tuners. Still, one may say that 10.7 MHz had non-trivial use as either a TV sound IF or 2nd IF.

Looking backwards again, independent double conversion for TV sound predated the 1958 Murphy case. Excluded here are the early converter-type UHF tuners, which made the whole receiver, vision and sound, double conversion. According to an article in WW 1956 November, page 539ff, second conversion was then the preferred approach for Belgian four-standard TV receivers. The sound 1st IF, 33.4 MHz for Systems B, C and F, and 27.75 MHz for System E, was downconverted to either 7.0 or 11.8 MHz. Apparently either of these minimized interference problems, but it was desirable to use a buffer amplifier between the sound take-off point in the vision IF channel and the sound 2nd frequency changer. That buffer stage would tend to inhibit backward transmission of oscillator and IF signals into the vision IF channel.

So of these four sound IF frequencies:

6.31 MHz appears to have been used only by Murphy. I don’t think that any other UK makers followed its sound channel double-conversion approach, and had they done so, it is open to speculation as to whether they would also have used 6.31 MHz, and beyond that, whether BREMA would have standardized it.

7.0 and 11.8 MHz had at least quasi-standard status in Belgium.

10.7 MHz was of course the de facto worldwide standard for Band II FM receivers, and an actual standard in many countries. And it has been widely used for non-FM applications, as noted in the radio receiver IF thread.

Cheers,

Steve

Topic starter Posted : 13/06/2016 9:52 am

Returning to the American case, I have very recently found what is probably the definitive article on the move to the “high” IF. It is “Television Receiver Intermediate Frequencies”, by Paul F.G. Holst, in the journal ‘Electronics’ for 1948 August. (See: http://www.americanradiohistory.com/Arc ... 948-08.pdf, p.90ff.)

It was a seven-page article that delineated the problems that arose with the original RMA “low” IF, detailed the various and sometimes conflicting requirements and then explored the possibilities, with final detailed treatment and comparison of two options, namely 32.8 MHz sound, 37.3 MHz vision and 41.2 MHz sound, 45.7 MHz vision. The latter was considered the best choice given that TV channel 1, 44 to 50 MHz, had just been vacated. I have excerpted and attached the comparison chart.

Evidently the 41.2/45.7 MHz option was selected by the RMA for standardization, and then slightly adjusted to 41.25/45.75 MHz. Perhaps this was to provide integer edges to the notional IF channel, making it 41.00 to 47.00 MHz.

Cheers,

Steve

Topic starter Posted : 17/06/2016 5:21 am

That American study has prompted another look at the Italian case. And in turn that led to another set of post facto rationalizations.

The Italian TV IF “channel”, 40 to 47 MHz, was decreed in 1952 April, which was ahead of the ITU Stockholm VHF planning meeting, during which the Italian TV channels were defined as being a bit different to the European CCIR standard channels.

I’d say that this channel was probably derived from American practice, with the sound carrier moved downwards from 41.25 to 40.25 MHz to accommodate the CCIR 625-line 5.5 MHz sound spacing.

The American influence was already there. In 1949, both American and French TV transmitters were installed at Torino. The American transmitter adhered to American standards, except that it worked at 625/50 rather than 525/60. It retained the American 6 MHz channel and 4.5 MHz intercarrier, and transmitted in channel A6, 82 to 88 MHz. It may well have been the first transmitter to operate according to what later became the System N parameters. (This from WW 1950 February, p.45).

It seems likely that the existence of that Torino transmitter was why Italy was allowed channel C, 81 to 88 MHz, in the 1952 Stockholm plan. A WW 1959 March listing shows RAI Torino as the only major transmitter operating on channel C.

And just maybe the different arrangement of the Italian Band I and Band III channels was done to accommodate its decreed IF channel.

With an IF channel of 40 to 47 MHz, with a vision IF of 45.75 MHz, use of channel E2, 47 to 54 MHz, was probably impracticable. Channel B was aligned with channel E4, 61 to 68 MHz. That left room for just one channel below B, which was channel A at 52.5 to 59.5 MHz, or 1.5 MHz below channel E3. One might have expected alignment with channel E3, since that would have put the vision carrier at 55.25 MHz, the same as for channel A2, which was known from American practice to have been workable with a 45.75 MHz vision IF without creating insoluble receiver mixer regeneration problems. Anyway, the channel A vision carrier was 8.5 MHz below that of channel B.

In Band III, the “edge” channels D and H were aligned with channels E5 (174 to 181 MHz) and E10 (209 to 216 MHz) respectively. But three channels only, E, F and G, were fitted between them, rather than the four of the European plan.

A possible explanation is IF harmonic interference. The vision IF (45.75 MHz) fourth harmonic was 183.0 MHz. This would produce a 750 kHz beat with the channel E6/channel D vision carrier (182.25 MHz), which was probably an undesirable situation. That required that channel D be moved upwards, with 1.5 MHz being the apparent minimum change, and the number that was actually used. That put the interference 750 kHz below the vision carrier, at the lower edge of the vestigial sideband, where receiver response was significantly attenuated.

Whatever actual upward movement was chosen, a consequence was that only three channels could be fitted between E5/D and E10/H. Equispacing of these would have provided an 8.75 MHz spacing of the vision carriers. But in practice 8.5 MHz spacing was chosen for channels D and E, and E and F, with 9 MHz between F and G, and between G and H.

As to the reason for that asymmetry, possibly it was to avoid 5th harmonic sound IF (201.25 MHz) interference. Had channel G been 8.75 MHz below channel H, then its vision carrier would have been at 201.5 MHz, which would have allowed the harmonic to cause a 250 kHz beat. But placing it 9 MHz below put the vision carrier at 201.25 MHz, which reduced the beat to zero, at least nominally. One imagines that there would have been some slope demodulation (by the receiver Nyquist slope) of the interfering FM harmonic, but perhaps that was seen as the lesser of evils in a situation where there was not that much freedom to adjust.
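The beat arithmetic in the last few paragraphs condenses into a short sketch, using the Italian IF pair and the Band III vision carriers quoted above:

```python
# Beats between Italian IF harmonics and Band III vision carriers.
# IFs: 45.75 MHz vision, 40.25 MHz sound.  Beats reported in kHz.

VISION_IF, SOUND_IF = 45.75, 40.25

def beat_khz(harmonic, carrier):
    return round(abs(harmonic - carrier) * 1000)

# 4th vision-IF harmonic vs the unshifted E6/D vision carrier:
print(beat_khz(4 * VISION_IF, 182.25))   # 750 -> mid picture
# ...and vs the channel D carrier after the 1.5 MHz upward move,
# where the same 750 kHz offset sits in the attenuated vestigial sideband:
print(beat_khz(4 * VISION_IF, 183.75))   # 750
# 5th sound-IF harmonic vs a hypothetical 201.5 MHz channel G carrier
# (8.75 MHz spacing) and vs the actual 201.25 MHz carrier (9 MHz spacing):
print(beat_khz(5 * SOUND_IF, 201.5))     # 250
print(beat_khz(5 * SOUND_IF, 201.25))    # 0
```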

With 8.5 and 9 MHz channel spacings, I should think that receiver vision IF bandpass adjacent channel traps were designed primarily for the more severe 8.5 MHz case, which was then deemed to provide adequate rejection for the 9 MHz case. That might be why channel A was separated 8.5 MHz from channel C, rather than being adjacent to it, although perhaps as cause rather than consequence. Once the 9 MHz separation between channels G and H had been seen as necessary, thus destroying the 8.75 MHz symmetry, there was probably some juggling of the other separations, and one of the limiting parameters could have been just how low could channel A be taken without risking receiver mixer regeneration problems. Perhaps that determined the 8.5 MHz number, which was then used for recalculating the Band III separations.

Cheers,

Steve

Topic starter Posted : 18/06/2016 2:02 am

Axiomatic here are the interrelationships between TV receiver intermediate frequencies, the range of TV channel frequencies to be received, and the geographical allocations of those channels.

In general, one might expect that channel frequencies would have been assigned first, based upon the ITU Atlantic City 1947 band allocations, then the geographical disposition of those channels, and finally the optimum IFs. But it has not always turned out that way, as shown in the following commentary.

In the US case, the FCC assigned the VHF TV channels in 1945. This was ahead of Atlantic City 1947, at which the US 1945 VHF broadcasting bands (minus TV channel 1) were then adopted for all of Region II. Very quickly the RMA developed its initial “low” IF recommendation in line with those channel assignments. This “low” IF turned out to be problematical, so the RMA withdrew it in 1948 and then developed a new “high” IF, released in 1949. Even though a second iteration was needed in respect of the IF choice, both iterations followed rather than preceded the channel assignments. The “high” IF did not impose any undue restrictions on VHF channel geographical allocations beyond what was anyway reasonable practice.

But the FCC then used the new “high” IF as the basis for its UHF channel geographical allocations. With the UHF channels, in-band images and other interferences were, short of upconversion, unavoidable, so allocations had to be made with these in mind. The FCC had no power to fix a standard receiver IF, but by basing its UHF channel allocations on the RMA standard, it did provide a strong endorsement. So in this case, the channel allocations followed determination of the IF.

In Europe, definition of the VHF TV channels, both within and in some cases without the Atlantic City 1947 band limits, was undertaken at the ITU Stockholm 1952 meeting. Initial geographical channel assignments were also made at that meeting. But at the time, no standard TV receiver IFs had been developed except in Italy.

As noted in the previous posting, it certainly looks as if the prior Italian choice of standard IF, modelled on American practice, determined its need for non-standard channelling in both Bands I and III, which meant that it had one fewer Band III channel than did the other countries in Europe that used System B.

The CCIR standard System B IF of 38.9 MHz was evidently developed circa 1954 to fit the channels assigned at Stockholm 1952.

In the UK case, it was the imminent use of Band III and the consequent advent of multichannel tuning that led to the development of the BREMA standard IF. This was based upon the System A channel frequencies already defined.

Neither the CCIR nor the BREMA IF imposed any undue restrictions on VHF channel geographical allocations beyond what was anyway reasonable practice.

The French case was somewhat different, though. The tête-bêche channelling for System E was defined at Stockholm 1952. And this included a small number of Band I assignments, although originally it had been planned to use only Band III for the 819-line service.

The development of a standard IF, at least one that was practicable, inevitably created restrictions on the use of those Band I channels. The tête-bêche system required oscillator-high for some channels and oscillator-low for others, depending upon whether the chosen IF channel had the vision carrier at the bottom end or the top end. But oscillator-low – with an IF channel just below Band I – was not workable for the Band I channels, as for the most part, oscillator frequencies would have been within the IF channel. So depending upon which way around the IF channel was oriented, either channels F2 and F4, or channel F3 would have been unusable. As it turned out, the standard IF channel was chosen with the vision carrier at the bottom end, meaning that channel F3 was unusable. Thus it seems at least possible that the IF channel orientation was chosen to minimize the disruption to Band I use. Apparently channel F3 was originally assigned to a planned Tours transmitter.

Whether the deliberations over the CCIR standard IF included the multistandard receiver case is unknown. It seems possible, as multistandard receivers were the norm in Belgium from the start. But whether considered or not, the 38.9 MHz number was evidently satisfactory for the Belgian four-system receivers. This put the System E sound at 27.75 MHz, meaning that the IF channel was reversed as compared with the French case. In turn that meant that Belgian receivers would have been unable to be configured to receive channels F2 and F4. But that did not matter, as the reception requirements were for channel F8A and perhaps some others of the Band III channels, but not the Band I F-channels. As recorded previously, Belgian practice then worked out suitable frequencies for the sound second IF.

How the introduction of the French standard IF affected French multistandard receivers, developed quite early on for the border areas such as Strasbourg, is unknown. One assumes that these would have been required to cover the full French VHF channel set, so would have had a System E IF channel with the vision carrier at the low end. On the other hand, the vision carrier at the high end would have been required for channels E2 through E4, to allow oscillator-high operation. Actually though, a tête-bêche IF channel whose basic bandpass shape was determined by a pair of Nyquist slopes centred respectively on 28.05 and 38.9 MHz might have worked. For the System E case, a 39.2 MHz sound trap would have been switched in, and for the Systems B/C/F case, a 33.4 MHz sound trap and a high pass filter at around 33.9 MHz would have been switched in. That way the standard IFs for each system could have been used.

The European UHF channels were defined and initially allocated at the ITU Stockholm 1961 (ST61) meeting. The uniform use of 8 MHz channels helped. As with the US case, it appears that channel allocations were based upon existing standard IFs from VHF TV receiver practice. At least, that is the impression that one obtains from this excerpt from the ST61 Technical Annex. Thus were developed transmitter co-siting patterns such as n, n+3, n+6, n+10, avoiding the “taboo” combinations.

Unfortunately the IFs used in development of the ITU table were not stated, but one may make reasonable deductions. For the Systems G and H case, almost certainly it would have been the CCIR standard 38.9 MHz. That correlates with the image at channel n+9, and oscillator at n+5. Then the Italian System H case looks to have been calculated at both the CCIR standard and the Italian standard of 45.75 MHz. For the latter, the image was at n+11 and the oscillator at n+6.

The System K (OIRT) case was evidently calculated at both the then-Russian standard IF of 34.25 MHz and also at the CCIR standard IF or something proximate to it. The Russian standard IF put the image at n+8 and oscillator at n+4. Perhaps the Russians were by then pondering their later upward move to 38 MHz, in order to align with the general European pattern.

System I was more likely worked out on the basis of a 38.9 MHz IF, as I don’t think that the BREMA 39.5 MHz number had yet been debated or promulgated.

For the French System L case, the standard IF of 32.7 MHz might have been decided before ST61. For this, the oscillator would have been at n-4, and the image at n-9.
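Those image and oscillator placements can be roughed out in a few lines. This is a sketch under my own assumptions (8 MHz channels, vision carrier 1.25 MHz above the lower channel edge, offsets measured in whole channels from channel n's lower edge), not the actual ST61 method:

```python
import math

# Which channel (relative to channel n) the local oscillator and the
# image of channel n's vision carrier fall into, for 8 MHz UHF channels
# with the vision carrier 1.25 MHz above the lower edge.

def oscillator_channel(vision_if, oscillator_high=True):
    """Channel offset containing channel n's local oscillator."""
    sign = 1 if oscillator_high else -1
    return math.floor((1.25 + sign * vision_if) / 8)

def image_channel(vision_if, oscillator_high=True):
    """Channel offset containing the image of channel n's vision carrier."""
    sign = 1 if oscillator_high else -1
    return math.floor((1.25 + sign * 2 * vision_if) / 8)

print(oscillator_channel(38.9), image_channel(38.9))    # 5 9   (CCIR)
print(image_channel(45.75))                             # 11    (Italian)
print(oscillator_channel(34.25), image_channel(34.25))  # 4 8   (Russian)
print(oscillator_channel(32.7, oscillator_high=False),
      image_channel(32.7, oscillator_high=False))       # -4 -9 (System L)
```

The 45.75 MHz oscillator is a borderline case (its LO lands about 1 MHz below the n+6 channel edge), so only its image placement is computed here; the images for the other IFs come out at n+9, n+8 and n−9 as deduced above.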

Whereas most of the European countries could, for UHF reception, continue using their existing TV receiver IFs established in the VHF era, both the UK and France were faced with new 625-line UHF systems and the need for dual-standard receivers, which would continue to use the established IFs for VHF reception, but needed “new” IFs for UHF.

In the French case, for simplicity at the receiver end, the System L IF channel was chosen to have the same sound IF as for System E, namely 39.2 MHz. A common sound IF was probably advantageous with AM sound on both systems. Since the IF channel should not encroach on Band I, this in turn meant that the vision IF was at the lower end of the channel, at 32.7 MHz. That required oscillator-low, not a problem at UHF, and not for Band III. But it had implications when System L was extended to Band I, where oscillator-low working was infeasible. The solution was to use inverted channels, with the vision carrier at the high end, for the Band I channels, which then allowed oscillator-high working. Hence System L’.

In the UK case, a common sound IF would have been of less value given that most receivers would have been expected to use the intercarrier technique for System I. Also, System I was to be used in VHF Bands I and III in Ireland near-term, and possibly in the UK in the more distant future. Allowing for Band I use meant oscillator-high working, and so an IF channel with the vision carrier at the high end. Effectively then UK dual-standard receivers would have a tête-bêche IF channel. As noted earlier in this thread, this could have been done with respectively 34.65 and 38.9 MHz IFs for 405 and 625 lines, but the 625-line number was moved up to 39.5 MHz to allow wider 625-line vision bandwidth whilst still using a basic double-Nyquist IF bandpass. This in turn created a requirement for much better receiver image rejection performance, given that the previously defined requirement for co-sited n+10 channels remained.

In the Australian case, the standard IF channel, with 36.0 MHz vision IF, was chosen more-or-less at the same time as, and to be compatible with, the (unusual) VHF channel allocations, a fit that the CCIR standard number would not have provided. Elsewhere outside of Europe with System B, the CCIR standard IF appeared to be satisfactory.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 07/07/2016 6:21 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421

As previously noted, Japan stayed with a “low” IF, 26.75 MHz vision, until circa 1970, after which it changed to a “high” IF, 58.75 MHz vision, that was noticeably higher than any others.

On the face of it, that might be explained by inertia; that is, in 1953, when Japan started TV broadcasting, it simply adopted a variation of the earlier American “low” IF, at a time when the “high” IF was establishing itself in American practice, and for the most part, was yet to arrive in Europe.

But there might have been more to it. Japan adopted a channelling plan that was different to that used in the Americas. Channels J1 through J3 were in Band II, 90 through 108 MHz. Channels J4 through J12 were in Band III, 170 through 222 MHz, with a slight overlap between channels J7 (188 - 194 MHz) and J8 (192 - 198 MHz).

That could have made the use of the American 45.75 MHz IF problematical. Its 2nd harmonic was 91.5 MHz, very close to the channel J1 vision carrier at 91.25 MHz, and its 4th harmonic was at 183 MHz, similarly close to the channel J6 vision carrier at 183.25 MHz. Then the channel J4 oscillator frequency was 217 MHz, close to the J12 vision carrier at 217.25 MHz, although whether J12 was part of the original allocation or whether it was a later addition is unknown.

On the other hand, it does look as if 26.75 MHz was chosen to fit with the channelling plan. Its 4th harmonic was at 107 MHz, within channel J3, but in a position where it probably would not do too much harm. Its 5th harmonic, the highest usually considered, was well below Band III. It also put the channel J4 oscillator at 198 MHz, right on the boundary between channels J8 and J9, so out of harm’s way. Similarly the J5, J6 and J7 oscillators fell on channel boundaries, although the J8 oscillator was at 220 MHz, enough inside channel J12 to be a potential problem, I imagine. If though J12 was a later addition, part of the Band III “creep” beyond its original 216 MHz upper limit, then it would not have been part of the original deliberations.

As well as 26.75 MHz, both 32.75 and 38.75 MHz would have resulted in channel boundary-positioned oscillator frequencies for the lower Band III channels, and both would have avoided the J12 intrusion. But 32.75 MHz had a 3rd harmonic at 98.25 MHz, rather close to the J2 vision carrier at 97.25 MHz. And 38.75 MHz had a 5th harmonic at 193.75 MHz, very close to the J8 carrier at 193.25 MHz. So neither would have been good choices.
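The harmonic arithmetic above is easy to mechanise. This is a back-of-the-envelope sketch rather than anything from the period literature; the carrier list and the 2nd-to-5th harmonic range come from the figures quoted above, while the 1 MHz "too close" margin is my own assumption (so the harmless-but-in-band 4th harmonic of 26.75 MHz within channel J3 is deliberately not flagged).

```python
# Candidate vision IFs checked against the Japanese VHF vision carriers (MHz).
VISION = {"J1": 91.25, "J2": 97.25, "J3": 103.25, "J4": 171.25,
          "J5": 177.25, "J6": 183.25, "J7": 189.25, "J8": 193.25,
          "J9": 199.25, "J10": 205.25, "J11": 211.25, "J12": 217.25}

def clashes(vif, max_harmonic=5, margin=1.0):
    """Harmonics (2nd to 5th) of a vision IF that fall within `margin` MHz
    of a vision carrier, as (harmonic, channel, offset in MHz) tuples."""
    hits = []
    for n in range(2, max_harmonic + 1):
        for channel, carrier in VISION.items():
            if abs(n * vif - carrier) <= margin:
                hits.append((n, channel, round(n * vif - carrier, 2)))
    return hits

for vif in (45.75, 26.75, 32.75, 38.75):
    print(vif, clashes(vif))
```

This reproduces the conclusions above: 45.75 MHz clashes on J1 (2nd harmonic) and J6 (4th), 32.75 MHz on J2 (3rd), 38.75 MHz on J8 (5th), while 26.75 MHz has no near-carrier harmonics at all.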

So by that back-of-the-envelope analysis, it looks as if the original Japanese IF choice was made as the “best fit” at the time, and it just happened to be a “low” IF. Going above 45.75 MHz was probably not a wise move at the time, as it would have complicated IF strip design with the domestic type valves that were likely to be available.

Quite why the change to 58.75 MHz was made is not clear, but avoidance of the channel J12 conflict and better allocation of the UHF channels were likely to have been factors. By the time 58.75 MHz arrived, the solid-state era was well established, and the requisite IF gains were easily obtained, particularly with integrated circuits.

With 58.75 MHz, the 2nd harmonic was above Band II. The 3rd harmonic was at 176.25 MHz, just above the boundary between channels J4 and J5, and unlikely to be problematical. Band III oscillator frequencies were all above the band. And the UHF “taboo” channels were well-removed from wanted channels, and so well down the RF selectivity curve.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 09/07/2016 1:30 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421

I recently found information on some French Philips multistandard TV receivers at this website: http://tsf-schoser-2.e-monsite.com/page ... nes-page1/.

The examples shown generally seem to have followed Belgian multistandard practice, covering Systems B, C, E and F, with System E being limited to the Band III channels, F5 and upwards. The earlier VHF-only models had a vision IF of 38.9 MHz for all systems, and sound IFs of 33.4 MHz for Systems B, C and F, and 27.75 MHz for System E. Mostly, but not entirely, the border areas where such receivers would be used were covered by Band III French transmitters, so the lack of the French Band I channels might not have been a major drawback. But on the other hand, they were not suitable for France-wide use.

The VHF-UHF models from 1962 had 39.9 MHz vision IF for French System L, with 33.4 MHz sound IF, this being the same as apparent Belgian practice. But the vision IF for System E was also moved up to 39.9 MHz, with the sound IF moving to 28.75 MHz. Apparent Belgian practice in the VHF-UHF era kept System E at 38.9 and 27.75 MHz.

Given that receiver complexity was about the same either way, one wonders why the change in the French Philips case. Perhaps it was because, assuming correct receiver IF bandpass design, the Nyquist slope over 39.9 MHz was gentler, being intended for a 1.25 MHz vestigial sideband. That made it closer to what was ideally required for System E, with its 2 MHz vestigial sideband. The Nyquist slope over 38.9 MHz would have had to be suitable for the 0.75 MHz vestigial sideband of Systems B, C and F, and I doubt that it would have been switched to something gentler for System E. It is understandable that with French domestic multistandard receivers, more attention to detail would have been observed in respect of System E than was generally the case for Belgian receivers. I imagine that those French multistandard receivers also had a wider vision bandwidth (such as 9 MHz) for System E, whereas Belgian practice was to use the same bandwidth for System E as for System F.

The earlier receivers shown at the above-mentioned site include valve lists, and although the functions are not shown, the EF183/EF184 counts are suggestive of four-stage vision IFs, which in turn suggests that they had wideband IF strips.

I still harbour a suspicion that there might have been some French multistandard receivers that accommodated all of the French VHF channels, Band I and Band III, and so - to accommodate channels F2 and F4 with oscillator-high - had standard (or similar) IFs with vision below sound. These might have had a standard CCIR (or similar) IF within the wider IF channel, as an IF channel with vision below sound would not have enabled use of the Band I E-series channels with oscillator-high. That is not beyond the bounds of possibility; after all, British dual-standard receivers had what was effectively a tête-bêche IF channel.

Some French TV receivers (for example from Grammont and Telemaster, I think) were also equipped to receive UK System A broadcasts, as well as Systems B, C, E and F, and here I think the Band I channels would have been necessary. Thus the System A IF channel would necessarily have been inverted with respect to the System B/C/F IF channel.

Regarding oscillator position, one might say that although oscillator-low was used in some cases, oscillator-high was the modal choice for TV receivers, established by American practice in the immediate post-WWII period. For the high band/Band III (and later for UHF), it didn’t matter, as either oscillator position was workable.

It was the low band/Band I that was the determinant. Oscillator-low could be used, but it placed significant restrictions on IF choice, given that the oscillator should always lie outside (and above) the IF channel. So the highest available vision IF would be half of (the lowest channel vision frequency less the channel width). Oscillator-high obviated that problem, and allowed the IF channel to be placed just under the low band/Band I, which was its desirable position.
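The oscillator-low ceiling just described can be put as a one-line formula. A minimal sketch, assuming the IF channel top sits roughly one channel width above the vision IF; the helper name and the illustrative System B figures (channel E2 vision carrier at 48.25 MHz, 7 MHz channels) are mine, for illustration only.

```python
# Oscillator-low: f_osc = f_vision - VIF must lie above the IF channel top,
# taken here roughly as VIF + channel width.  Hence:
#   f_vision - VIF > VIF + width  =>  VIF < (f_vision - width) / 2
def max_vif_oscillator_low(lowest_vision_mhz, channel_width_mhz):
    return (lowest_vision_mhz - channel_width_mhz) / 2

# Illustration with System B Band I figures (channel E2 vision at 48.25 MHz):
print(max_vif_oscillator_low(48.25, 7.0))  # 20.625 MHz ceiling
```

Which shows why oscillator-low confined receivers to "low" IFs in the twenties of MHz, well short of the desirable just-under-Band I position.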

Thus in general oscillator-high, with resultant inverted IF channels, was adopted worldwide. France though was a different case. The adoption of tête-bêche channelling required the use of both oscillator-low and oscillator-high depending upon channel orientation. That was fine for the Band III channels, but for Band I, where oscillator-low working was difficult, it meant that only the tête or only the bêche channels could be used. The standard IF (28.05 MHz vision, 39.2 MHz sound) was inverted relative to the tête channels, which in Band I were F2 and F4, so these could be and were used. But the sole Band I bêche channel, F3, was effectively lost.

When UHF and System L arrived, the standard IF was chosen as 32.7 MHz vision, 39.2 MHz sound, for ease of dual-standard receiver design. Thus the IF channel was non-inverted with respect to the European UHF channels, but the required oscillator-low working was not a problem at UHF. Nor was it a problem when System L was extended to Band III. But the later extension to Band I was problematical; to allow oscillator-high working, inverted channels were required, hence System L'.

The above-mentioned Philips multistandard receivers, at least those with UHF System L capability, would also have dealt with Band III System L (with appropriate system switching) but not with Band I System L’.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 27/08/2016 8:51 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421

It is great to see the old threads restored – thanks very much, Chris!

In the interregnum, I have found additional information on the IF topics – TV and radio – so I'll be posting some updates before too long.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 23/09/2018 11:23 pm
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421

This posting provides more information about the origins of the standard intermediate frequency for the French 819-line television system, namely 28.05 MHz vision IF (VIF) and 39.2 MHz sound IF (SIF).

Essentially this information is derived from Pieter Hooijmans' excellent web site at:

https://www.maximus-randd.com/tv-tuner-history-pt1.html

and subsequent discussion thereof on the UKVRR forum.

To recap, the French 819-line TV system (known as System E from 1961) was originally intended to occupy a 14 MHz wide channel.  To make best use of the limited available VHF spectrum in Bands I and III, a so-called tête-bêche channelling system was adopted, in which each channel space was occupied by both an even-numbered tête channel, with vision carrier at the high end, and an odd-numbered bêche channel, with the vision carrier at the low end.  Furthermore, the channel spacing was reduced from 14 to 13.15 MHz, this being achieved by abandoning the outer guard bands, 0.75 MHz outside the vestigial sideband, and 0.1 MHz outside the sound carrier, the two adding to 0.85 MHz.

As recorded at the ITU Stockholm 1952 VHF allocation meeting, for France the lower limit of Band III was moved down to 162 MHz from 174 MHz.  Thus the range became 162 to 216 MHz, 54 MHz wide.  The odd-numbered bêche channels were placed hard against the 162 MHz lower limit, so that the vision carrier of channel F5 was at 164.00 MHz, allowing for the 2.0 MHz vestigial sideband.  The even-numbered tête channels were offset upwards by 0.25 MHz, so that the sound carrier of channel F6 was at 162.25 MHz.  Presumably this offset helped to minimize the effects of mutual interference between tête and bêche channel pairs.  The net result was that tête sound carriers were offset to be 1.75 MHz lower than the opposite bêche vision carriers.  And opposite vision carriers were separated by 9.4 MHz.  Both of these numbers were pivotal in IF determination, as will be explained.

Note that the one Band III channel from the original 14 MHz-spacing plan that was already in use (at Paris and Lille) before the advent of the tête-bêche system was retained unchanged and renumbered as 8A.

Assuming an IF channel that was placed just below the lower edge of Band I, namely 41 MHz, then it was clear that for the outer channels in Band III, the local oscillator frequency would lie within that band, at which end depending upon whether oscillator-low (infradyne) or oscillator-high (supradyne) operation was chosen.  That also happened with other TV systems, but part of the IF selection process was ensuring that in-band oscillator frequencies were placed where they would cause minimum harm.  (And later on, UHF channel allocations were made on the basis of avoiding unfavourable combinations.)

In the System E case, the tête-bêche system meant that unlike the case with conventional channelling systems, there were no spaces where in-band local-oscillator frequencies could be placed with minimal harm.  What was an open space for a tête channel was important real estate for a bêche channel and vice-versa.  The answer was to have in-band local oscillator frequencies correspond to sound or vision carrier frequencies from the opposing series.  Given that opposing carrier spacings had been chosen to minimize intrusive effects, this was the optimum solution.  I imagine that this approach placed extra taboos on the Band III channel allocation matrix possibilities.

Clearly this choice of local oscillator frequency placement also limited the available IF choices.  If oscillator frequencies were to fall upon opposite-series sound carrier frequencies, then the VIF would need to be a multiple of 13.15 MHz (the channel spacing) plus 1.75 MHz (the offset between opposing sound carriers).  This applied whether the oscillator was below or above its applicable channel.  For an IF channel just under Band I, the solution was a VIF of (2 x 13.15) + 1.75 = 28.05 MHz.  Given that this was well below 41 MHz, then it was logical that the SIF should be above it, at 39.2 MHz.  In turn this implied supradyne operation for the tête channels, in order to obtain inversion of the vision and sound carrier relative positions, and infradyne operation for the bêche channels, in order to avoid inversion.

Alternatively, if the oscillator frequencies were to fall upon an opposite series vision carrier, then the VIF would need to be a multiple of 13.15 MHz plus 9.4 MHz (the separation between opposite channel vision carries).  Here the just-below-Band I solution was (2 x 13.15) + 9.4 = 35.7 MHz.  In this case there was insufficient room for the SIF above the VIF, so it was placed below, at 24.55 MHz.  This required infradyne operation for the tête channels, to avoid inversion, and supradyne for the bêche channels, to obtain inversion.
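The two derivations above can be condensed into one small calculation. A sketch only: the 13.15 MHz spacing, the 1.75 and 9.4 MHz offsets and the 41 MHz Band I lower edge are as discussed in the text; the function itself is mine.

```python
SPACING = 13.15      # tête-bêche channel spacing, MHz
SOUND_OFFSET = 1.75  # tête sound carrier below opposing bêche vision carrier, MHz
VISION_SEP = 9.4     # separation of opposing vision carriers, MHz

def vif_candidates(offset_mhz, band_i_edge=41.0):
    """Vision IFs of the form k*SPACING + offset that stay below Band I."""
    return [round(k * SPACING + offset_mhz, 2) for k in range(4)
            if k * SPACING + offset_mhz < band_i_edge]

print(vif_candidates(SOUND_OFFSET))  # highest entry: 28.05 MHz (SIF at 39.2, above)
print(vif_candidates(VISION_SEP))    # highest entry: 35.7 MHz (SIF at 24.55, below)
```

The highest member of each family, just under Band I, gives exactly the two workable IF channels discussed: 28.05/39.2 MHz and 35.7/24.55 MHz.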

The consequence is that for Band III at least, there were two workable IF channels.  Each required alternating supradyne and infradyne operation through the channel sequence.  That was not difficult to provide for on turret tuners, although it probably ruled out the incremental inductance type, at least if sequential channel switching were to be provided.

The Band I case was different, though.  That was because only supradyne operation was possible.  Infradyne operation would have required local oscillator frequencies within the IF channel, clearly undesirable.  For the tête channels F2 and F4, supradyne operation could deliver the 28.05/39.2 MHz IF, but not 35.7/24.55 MHz.  On the other hand, for the bêche F3 channel, supradyne operation could deliver 35.7/24.55, but not 28.05/39.2 MHz.  Thus all-channel receivers required the use of both IF sets, in what was effectively a tête-bêche vision IF strip, switchable between the 28.05/39.2 and 35.7/24.55 MHz cases.  With a tête-bêche IF strip, supradyne operation on all VHF channels was possible.  In that case, the IF strip would be switched between its two modes on a channel-by-channel basis, 28.05/39.2 MHz for the even-numbered tête channels and 35.7/24.55 MHz for the odd-numbered bêche channels.

The dual IF strip was used for some multichannel French TV receivers.  It may be inferred that this complexity was unwelcome, and probably by 1955 or thereabouts the decision had been made to eliminate it by the simple expedient of not using channel F3.  The ITU Stockholm 1952 list included but one channel F3 allocation (for Tours).  But that was not taken up.  Thus the deletion of that channel did not come at major cost.  Once receivers no longer needed to cover channel F3, the 28.05/39.2 MHz IF could be used for all other VHF channels, F2, F4 and F5 to F12, supradyne for the tête (even) channels and infradyne for the bêche (odd) channels.  Clearly the alternative of deleting channels F2 and F4 and using the 35.7/24.55 MHz IF was more expensive in terms of lost channels, although one might assume that there was the possibility of adding a bêche counterpart to channel F2.  Perhaps though that was not done in the original tête-bêche plan because there was nowhere it could be used without interfering with any or all of the existing Paris channel F1 441-line transmitter, the planned channel F2 transmitters, and existing or planned adjacent-country channel E2 transmitters.  Conceivably there could have been thoughts of reconfiguring F1 as an 819-line bêche channel once the 441-line transmitter closed down, but if so, that idea was overtaken by events in the IF saga.

Thus one could say that the System E 28.05/39.2 MHz IF effectively chose itself, and its adoption as a standard followed its early use.  That it had long-range consequences has been noted in earlier postings.  The French 625-line System L arrived on UHF in the early 1960s, and dual-standard receivers were required.  To simplify the design of these, System L had positive vision modulation and AM sound (without pre-emphasis), thus aligning it with System E in respect of these parameters.  Likewise to simplify receiver design, the System L standard IF was chosen to have the same SIF as that for System E, namely 39.2 MHz.  Thus the VIF was necessarily lower than the SIF, at 32.7 MHz.

That said, the Belgian four-system TV receivers had used a common VIF of 38.9 MHz, with two SIFs, although that changed when UHF arrived, and System L reception capability was added.  But these receivers were more complex anyway, having to deal with both positive and negative vision modulation and FM and AM sound, the latter both with and without pre-emphasis.  Also, System E channel coverage was usually limited to a few in Band III.  Given that they were intended to work with relatively few cross-border System E transmissions, and not in the middle of France where multiple signals were present and could cause interference even where they were not directly usable, then the fine points of IF choice were less important.

The VIF-low IF channel adopted for System L as a consequence of SIF commonality with System E required infradyne operation for the UHF channels, whereas supradyne operation was the norm for the other European 625-line systems.  That was not a problem, nor was it when System L was extended to Band III, with regular vision-low channels.  But Band I did present a problem, as only supradyne operation was practicable.  The solution there was to use inverted channels, that is vision-high, for which the System L' designation was used.  That allowed continued use of the standard 32.7/39.2 MHz IF, but it did create complications for multistandard receivers whose primary design centred on a VIF-low IF channel.  In some cases that resulted in the reappearance of tête-bêche IF channels.

The 32.7/39.2 MHz IF was also one of those used for System K’ in Africa, presumably in those cases where only Band III channels were involved, given that African Band I System K’ channels were not inverted.  The original System K’ standard IF was 40.2 MHz VIF and 33.70 MHz SIF, derived from first principles in the early 1960s; more about that in a future posting.

Finally, since French VHF channel F3 is pertinent to this posting and is seldom included in listings of historical analogue TV channel frequencies, I have attached a page from Wireless World 1959 March in which it was included.

Cheers,

 

WW 195903 p.109
ReplyQuote
Topic starter Posted : 26/09/2018 4:55 am
Nuvistor
(@nuvistor)
Famed V-Ratter Registered
Posts: 4159

Did the TV sets at that time have the capability of producing the pictures that 819 lines would lead us to expect?

I am thinking of CRT’s maximum size of 23 inch but usually much smaller and the spot size of the CRT.

ReplyQuote
Posted : 26/09/2018 2:41 pm
Pieter H
(@pieter-h)
New V-Ratter Registered
Posts: 16

Hi Steve, good to see you back on the IF discussion!

In the meantime Jac Jansen has lent me the book "TV conversion for ITA" by CE Lotcho, 1957. This book describes the set conversion necessary to receive the new VHF-III transmissions by ITV. Interestingly it contains a long list of all UK-produced TV sets from 1946-1954, including their IF system.

I've compiled an overview that you can also find on my site here, but it merits a bit more discussion than I have room for there. Some background: the book only mentions models and their IF, no timing. So using Radiomuseum.org I've tried to date the sets, sometimes using interpolation based on the set numbering, because only a few are listed in RM.org. So I don't claim 100% correctness, but all year entries are probably correct +/- 1 year. I've always listed a platform in its introduction year; they usually ran for 2 years. It seems the list of sets was compiled early 1955 (although the book was only published in 1957) because the list consistently seems to cover sets till 1954. I assume, but have not checked yet, that in 1955, or 1956 at the latest, most set makers switched to the new BREMA-required 34.65 MHz VIF, 38.15 MHz SIF.

If you look at the overview there are a few observations to be made:

  1. it is remarkable to see that TRF (= non-heterodyne, direct RF detection sets) were still introduced as late as 1951! Hadn't expected that, because all Philips/Mullard sets were heterodyne from 1946.
  2. there quickly was a roughly 50-50 split in IF concepts: low SIF high VIF (the white fields in the table) vs. low VIF - high SIF (coloured fields). Almost until the very end.
  3. From around 1951 three clusters of IF approaches developed: A: high VIF around 12-16 MHz; B: low VIF around 16 MHz; C: low VIF around 34 MHz. Each of them fairly stable over the following years. I think it is obvious who were the most influential lobbyists with BREMA to standardize on solution C: GEC, HMV and Invicta.
  4. There are some interesting individual movements. So far I was not aware of the concept used by Decca and GEC as late as 1951, where within one set 4 different IFs were used for different channels. The Decca 121 and 141 e.g. used 13.0, 13.5, 14 and 15 MHz for the different VHF-I channels.
  5. Pye went to the highest IF of all by 1951 (35 MHz), but then apparently ran into too many technical issues and dropped back to 16 MHz for the following years.
  6. English Electric, which first moved to the "BREMA-ready" 35MHz VIF low, then apparently expected a more continental standard with 39MHz VIF high in 1954, and then finally switched to the BREMA standard.
  7. Philips, the core of my story, all of a sudden happens to be one of the most extreme with its IF choices. Initially, from 1946 to 1950, with every set the IF moved up 100 kHz, which is difficult to explain, since I would not expect it to have had a major influence on performance. And then, most surprisingly, when most competitors were moving to higher IFs, they switched back to 12 MHz, where they were to stay until the switch to the BREMA standard in 1955. So for four years they had the lowest IF of all. This must have been a Mullard thing, because all other Philips TV development groups in Eindhoven and France moved to the 20 MHz range at exactly this time. Strange and interesting.

Thought I'd share this with you.

Cheers, Pieter

Philips 1954 UK IF overview
ReplyQuote
Posted : 26/09/2018 10:02 pm
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421

Thanks Pieter!  It’s good to see you on this forum.

That’s a very nice analysis you have done of the pre-BREMA TV IF situation in the UK.  I have that Lotcho book, but I haven’t really paid much attention to the UK details for that period – I have tended to think in terms of 16.0 MHz VIF, 19.5 MHz SIF as a general proxy.  But there was certainly a very diverse approach, with more variations than I had expected.

An interesting point is that the “high” IF was mooted as long ago as 1949, in a Wireless World article by W.T. Cocking, attached.

WW 194907 p.242
WW 194907 p.243
WW 194907 p.244
WW 194907 p.245
WW 194907 p.246

Here the calculations were done on the basis of the UK Band I channels only, so the case for a “high” IF did not derive solely from the Band III calculations.  The outcome was that a VIF of around 34 MHz with oscillator high was recommended.  This was quite close to the eventual BREMA 34.65 MHz number, which also took into account the additional conflicts that could occur with the Band III channels.

WW 195412 p.582
WW 195412 p.583

It does seem possible that HMV, GEC and Invicta, all of whom made an early move to a "high" IF in the Band I-only era, may have been influenced by Cocking's work.

The use of channel-specific IFs by Decca and others could have been one way of avoiding in-channel oscillator and IF harmonics with "low" IFs, particularly with oscillator-low operation.  It could have been that the original IFs had been chosen for channel B1 operation only, with tweaking of those by a MHz or so being an easy way to accommodate channels B2 through B5 without major redesign.

Cheers,

Steve

ReplyQuote
Topic starter Posted : 28/09/2018 5:14 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421
Posted by: Nuvistor

Did the TV sets at that time have the capability of producing the pictures that 819 lines would lead us to expect?

I am thinking of CRT’s maximum size of 23 inch but usually much smaller and the spot size of the CRT.

I don’t have a definitive answer, but my best estimate is that at least some receivers would have taken close to full advantage of the 819-line system capability, with for example spot size adjusted to avoid line overlap on the one hand but to provide a flat field on the other.

Indirect evidence that some of the setmakers were reasonably concerned about picture quality is provided in the attached Wireless World item.  The key part is:

“Quotation of the vision bandwidth is also very common in France.  9 Mc/s was the usual figure given but 10 Mc/s was also quite often quoted.”

It seems probable that setmakers who took the trouble to obtain 10 Mc/s bandwidth and so full horizontal resolution would also have taken the trouble to obtain full vertical resolution, at least for the larger screen sizes.

System E vision bandwidth was originally stated to be 10.4 MHz, and was reported as such through CCIR Report #83 (Warsaw 1956).  But in CCIR Report #124 (Los Angeles 1959) it had been rounded to 10 MHz.  (Originally (e.g. as recorded in Kerkhof and Werner), the aspect ratio was 4.12:3, but it had moved to the standard 4:3 by the time of CCIR Report #15 (Geneva 1951).  If 10.4 MHz was “right” for 4.12:3, then 10.1 MHz would be the corresponding number for 4:3.)

As an aside, I wonder how many UK 405-line TV receivers in 1959 offered the full 3 MHz bandwidth, or even 90% of it, namely 2.7 MHz.  My guess is not very many.  From an IF strip design viewpoint, would it have been more difficult to do 3 MHz with 405, with the sound carrier just 0.5 MHz away, than 10 MHz with 819, where the sound carrier was 1.15 MHz away?

On screen sizes, I have the impression – but right now no supporting evidence to hand - that the French consumers leaned towards what we would regard as the “normal” screen size, namely 23 inches (and before that 21 inches), with less use of small (19 inches and under) screens.  If so, then 819 lines, properly executed, should have been advantageous in respect of picture quality.

Cheers,

Steve

 

WW 195910 p.456 French TV Receiver Vision Bandwidth
ReplyQuote
Topic starter Posted : 29/09/2018 12:13 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421
Posted by: Synchrodyne

 

The later move to 36.875 MHz – for reasons not yet uncovered – would not appear to have caused too much upset although the IF 5th harmonic was then less favourably placed, with a 2.125 MHz channel 7 (181-188 MHz) beat. But that happened in the solid-state era when better receiver interstage screening was possible, so presumably was not a major issue.

 

That was from an earlier post about the Australian standard TV receiver IF (originally 36.0 MHz VIF, 30.5 MHz SIF).

I have since found an explanation for the later change to 36.875 MHz VIF, 31.375 MHz SIF in the ABCB (Australian Broadcasting Control Board) annual report dated 1970 June 30.  I have attached the pertinent pages:

ABCB Annual Report 19700630 p.92,93 Standard TV IF

Evidently with the 36.0 MHz VIF some receivers generated spurious signals when tuned to channel 2 (64.25 MHz vision, 69.75 MHz sound), but the problem was removed with the 36.875 MHz number.

The mechanism by which the spurious signals in channel 2 were generated is not obvious – to me anyway.  I suspect that it might have involved harmonics of the IF, local oscillator, etc.

With the 36.875 MHz VIF, the channel 1 (57.25 MHz vision) local oscillator was 94.125 MHz, close to channel 4 vision at 95.25 MHz, but far enough down the Nyquist slope and towards the adjacent channel sound trap so as not to be a guaranteed problem in all cases.
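For what it's worth, the oscillator figure above checks out, assuming oscillator-high working; the helper below is just illustrative arithmetic, not anything from the ABCB report.

```python
def local_osc(vision_mhz, vif_mhz):
    # Oscillator-high (supradyne): f_osc = f_vision + VIF.
    return vision_mhz + vif_mhz

print(local_osc(57.25, 36.875))  # 94.125 MHz, just below channel 4 vision at 95.25
print(local_osc(57.25, 36.0))    # 93.25 MHz with the original 36.0 MHz VIF
```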

Cheers,

ReplyQuote
Topic starter Posted : 29/09/2018 1:51 am
Synchrodyne
(@synchrodyne)
Reputable V-Ratter Registered
Posts: 421

Earlier in this thread I had pondered the fact that System K1 (sometimes referred to as K’), used by many Francophone African countries, had three different IFs listed for it in ITU-R Recommendation BT.804 of 1992, namely:

1) 33.70 MHz SIF, 40.20 MHz VIF
2) 33.40 MHz SIF, 39.90 MHz VIF
3) 39.20 MHz SIF, 32.70 MHz VIF

At the time, I said:

“The three sets of IFs listed for System K’ are interesting. The third was the same as for System L, and required oscillator-low conversion, not a problem since as far as I know System K’ at VHF was used in Band III only, not Band I. The second used the System B sound IF of 33.4 MHz, with the vision IF thus one MHz higher than the CCIR number, at 39.9 MHz. The first looks to have been standalone, and one assumes was derived from ad hoc considerations basis conditions in one or other of the territories where System K’ was/is used.”

Contrary to my earlier assertion, it was intended that System K1 would be used in Band I, and that was a significant factor in the determination of the initial standard IF of 40.20 MHz vision, 33.70 MHz sound. The working out of this IF was detailed in a quite lengthy French submission to the ITU 1963 Geneva African VHF-UHF Broadcasting Conference. That submission was Document No. 44-E.

At the time, what became System K1 was referred to as K*. Presumably this was a provisional designation to be used for the purposes of the conference until it was formally included in the CCIR list. System K* was essentially French System L but with negative vision modulation and FM sound. System L had used positive vision modulation and AM sound primarily so that it was aligned with System E in those respects, so easing the design of dual-standard receivers. In the French Outre-Mer territories there was not an existing System E service, so that requirement did not apply, and the by-then-customary negative/FM combination could be used. The arguments used for the System L vision bandwidth parameters, 6.0 MHz main sideband and 1.25 MHz vestigial sideband, applied equally to System K*. Of course, System K* was also tantamount to System D/K with the vestigial sideband extended from 0.75 to 1.25 MHz.

Returning to the IF issue, the problem was framed as one of finding the optimum combination of a suitable IF channel and suitable frequencies for the three Band I channels, together occupying 24 MHz of bandwidth, which could be placed anywhere in the 27 MHz wide 41 to 68 MHz range, not necessarily contiguously but without overlapping. The frequencies for the six contiguous Band III channels were a given.

The authors of Document 44-E considered various possibilities. Noting that the Band III channels were suitable for either infradyne or supradyne operation, whereas only supradyne was feasible for Band I, the options were either normal (vision low) Band I channels with a normal (vision high) IF channel, or inverted (vision high) Band I channels with an inverted (vision low) IF channel. Working through the various constraints produced the recommended result, namely a normal IF channel with 40.2 MHz VIF and normal Band I channels of 42-50, 51-59 and 59-67 MHz.
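As a rough illustration of the supradyne/infradyne arithmetic involved, here is a small sketch of my own (not from Document 44-E). It assumes, per the System K* parameters quoted above, that the vision carrier sits 1.25 MHz above the lower channel edge and the sound carrier 6.5 MHz above the vision carrier:

```python
# Sketch of the supradyne/infradyne IF arithmetic for System K* (K1).
# Assumption: sound carrier is 6.5 MHz above the vision carrier.
SOUND_SPACING = 6.5  # MHz

def if_pair(vision_rf, vif, supradyne=True):
    """Return (local oscillator, sound IF) for a given vision RF carrier and VIF."""
    sound_rf = vision_rf + SOUND_SPACING
    if supradyne:             # oscillator above the signal: spectrum inverts,
        lo = vision_rf + vif  #  so the sound lands below the vision in the IF
        sif = lo - sound_rf
    else:                     # oscillator below the signal: no inversion;
        lo = vision_rf - vif  #  feasible only where the channel frequency
        sif = sound_rf - lo   #  comfortably exceeds the IF, i.e. Band III
    return lo, sif

# Band I channel 42-50 MHz, vision carrier 43.25 MHz, VIF 40.2 MHz, supradyne:
lo, sif = if_pair(43.25, 40.2, supradyne=True)
print(round(lo, 2), round(sif, 2))  # 83.45 33.7 -> the recommended pair
```

Running the infradyne branch with the System L IF set instead reproduces the 39.2 MHz SIF from a 32.7 MHz VIF, which is why that set works only where oscillator-low operation is possible.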

As previously noted, the third of the above-listed System K1 IF sets was the standard for the French System L. This required infradyne operation with normal channels. So it would have been usable only in situations where reception of the System K1 Band I channels was not required.

The second IF set listed above, 33.40 MHz SIF, 39.90 MHz VIF, was one that had been used for System L in Belgian and French border-area multistandard receivers. To recap, in the VHF-only era these typically used 38.9 MHz VIF for Systems B, C, E and F, with 33.4 MHz SIF for Systems B, C and F, and 27.75 MHz SIF for System E. When UHF arrived, it was necessary to add System L capability. In this case it was evidently easier to have it share the 33.4 MHz SIF with Systems B, C and F, and introduce a second VIF into the matrix, namely 39.9 MHz. When this happened, some setmakers also moved the System E VIF up to 39.9 MHz, with the SIF consequently moving to 28.75 MHz.

Nominally at least, the IF channel in the VHF-only era would have had a Nyquist slope over 38.9 MHz suited to the 0.75 MHz vestigial sideband of Systems B, C and F, with System E probably having to make do with this. With the 39.9 MHz VIF introduced for System L, the corresponding Nyquist slope would have suited the 1.25 MHz vestigial sideband of that system. That was a closer fit for the 2.0 MHz vestigial sideband of System E, so that could have been a reason for moving up the VIF for that system.

The often non-standard IFs used for multistandard receivers did not usually find their way into the official and institutional lists. But the 33.4/39.9 MHz combination did, because it was also later a System K1 norm. Presumably it was not quite as good a fit as the original 33.7/40.2 MHz combination, but it was nevertheless acceptable. I suspect that the potential interference problems were smaller in latter-day solid-state receivers with SAWFs, ICs and synchronous demodulation.
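As a quick consistency check on the matrix just described (my own sketch, frequencies in MHz as given above): since all these conversions were oscillator-high, each VIF/SIF pair must reproduce its system's vision-to-sound carrier spacing.

```python
# Consistency check on the border-area multistandard IF matrix:
# with oscillator-high conversion, VIF - SIF must equal each
# system's vision-to-sound carrier spacing.
SPACING = {"B": 5.5, "C": 5.5, "F": 5.5, "E": 11.15, "L": 6.5}  # MHz

matrix = [           # (system, VIF, SIF) in MHz
    ("B", 38.9, 33.4),
    ("C", 38.9, 33.4),
    ("F", 38.9, 33.4),
    ("E", 38.9, 27.75),  # VHF-only era
    ("L", 39.9, 33.4),   # added with UHF, sharing the 33.4 MHz SIF
    ("E", 39.9, 28.75),  # after some setmakers moved the System E VIF up
]

for system, vif, sif in matrix:
    assert round(vif - sif, 2) == SPACING[system], (system, vif, sif)
print("all pairs consistent")
```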

Cheers,

Steve

Topic starter Posted : 21/10/2018 9:54 pm
Pieter H
(@pieter-h)
New V-Ratter Registered
Posts: 16

Hi Steve,

I never dived into the K' standard because it is a bit outside my scope. However, looking at your last post, I can say the following about the proposed IF values as mentioned.

In general, none of the proposals is unconnected to solutions existing at the time. This makes sense: I would expect the French national broadcasting authorities to have aligned with the major French set makers, namely Philips/La Radiotechnique, Thomson and Schneider. So each of the proposed values is connected to an existing solution. The main technical argument to keep in mind is that at the time, until the emergence of SAW filters, the SIF filtering was deemed the most problematic, because it was the most narrow-band, so in most multi-norm sets the SIF was kept fixed and the VIF varied. See the many examples as listed on my site.

A. SIF 33,7, VIF 40,2
The reference here is a French tuner (Philips AT7650/25) which, for Luxembourg (norm F), used SIF=33,7 and VIF=39,2; for standard French norm-E reception it used VIF=28,05 and SIF=39,2. So in this K' proposal they would re-use the existing SIF and shift the VIF up by 1 MHz.

B. SIF=33,4, VIF=39,9
This is the easy one: 33,4 was the universally used SIF for norms B/G/H, and the VIF was again moved up from 38,9 to 39,9. This, by the way, was a standard available tuner, the AT7650/84 for Finland, for receiving Russian norm D next to norm B.

C. SIF=39,2, VIF=32,7
This is essentially a modification of A: instead of keeping the SIF fixed, the VIF of 39,2 (from A's reference) is kept, now serving as the SIF, and the 33,7 is lowered by 1 MHz to 32,7 to become the VIF. This scheme was actually used in the 1967 tuner AT7672/25, which used SIF=39,2 with VIF=32,7 (French norm L at UHF) or VIF=28,05 (French norm E at VHF).

All in all it clearly gives the impression that, in co-operation with the set makers, they tried, when defining the K' IF values, to maximize the re-use of existing solutions, the three proposals being more or less equal in relative effort. Since there were, as far as I know, no K'-only TV sets, they were by definition always multi-norm, and set makers could choose IF solution A, B or C based on the highest synergy with the main norms.
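The three cases can be restated as one line of arithmetic each (my own sketch; the 6,5 MHz figure is the K' vision-to-sound carrier spacing): every proposal keeps one frequency slot from an existing tuner line-up and places the other carrier 6,5 MHz away.

```python
# Each K' proposal keeps one frequency slot from an existing line-up
# and derives the other carrier via the 6.5 MHz K' sound spacing.
K_SPACING = 6.5  # MHz

pairs = {                               # (SIF, VIF) in MHz
    "A": (33.7, 33.7 + K_SPACING),      # keep SIF 33.7 -> VIF 40.2
    "B": (33.4, 33.4 + K_SPACING),      # keep SIF 33.4 -> VIF 39.9
    "C": (39.2, 39.2 - K_SPACING),      # keep 39.2, now as SIF -> VIF 32.7
}

for name, (sif, vif) in pairs.items():
    # every pair must preserve the K' intercarrier spacing
    assert round(abs(vif - sif), 2) == K_SPACING
    print(name, round(sif, 1), round(vif, 1))
```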

Cheers, Pieter

Posted : 23/10/2018 8:43 am