Quite some time back (early 1970s, I think), I set out to make a list of the standard television receiver intermediate frequencies associated with the major (terrestrial, analogue) broadcast transmission systems. There was no particular reason for so doing beyond idle curiosity. At the time I didn’t get all that far, but now with access to more information, I decided to complete the list to the extent that I can, although there are still some gaps and interesting questions. Having completed the summary, and this being the season for oddities and stepping out of the mainstream, it seemed reasonable to post it here.

Of the above, (1) through (3) were all established by late 1954, but all were introduced some time after their respective transmission systems were actually in use, following an earlier period in which lower, and usually non-standard, intermediate frequencies had been used. Clearly, each was the result of careful trade-offs amongst the competing requirements, including avoidance of beats, whether internally or externally caused, and minimization of interference with other receivers and devices. Thus the choices would have taken account of the actual channel frequencies in use as well as the respective system parameters. All three had in common that they were placed just under the lowest Band I (low band) channel frequency, effectively as high as could be without invoking up-conversion, presumably to maximize image rejection inter alia. And all three were based upon “oscillator high” frequency conversion, such that the relative positions of the vision and sound carriers are transposed during frequency changing.
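As a quick illustration of the oscillator-high arithmetic (a sketch of mine, using the System B channel E2 carriers and the CCIR 38.9 MHz vision IF purely as an example):

```python
# Oscillator-high conversion: the local oscillator sits above the channel,
# so IF = LO - RF, and the vision/sound carrier order is transposed.
def oscillator_high(vision_rf, sound_rf, vision_if):
    """Return (LO, sound IF) in MHz for an oscillator-high converter."""
    lo = vision_rf + vision_if      # LO placed vision_if above the vision carrier
    sound_if = lo - sound_rf        # subtraction reverses the carrier order
    return lo, sound_if

# Example: System B channel E2 (48.25 MHz vision, 53.75 MHz sound)
lo, sound_if = oscillator_high(48.25, 53.75, 38.9)
print(round(lo, 2), round(sound_if, 2))   # sound IF lands below the vision IF
```

The sound carrier, 5.5 MHz above vision in the channel, comes out 5.5 MHz below it in the IF strip, which is the transposition referred to above.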

45.75 MHz arrived in US practice in 1950. Previously 25.75 MHz was a commonly used frequency. The higher number moved all VHF images out-of-band, and in the case of the VHF low band (Band I), above the FM band (Band II). In the UHF case, where images are unavoidably in-band, the FCC assumed a receiver IF of 45.75 MHz when it planned the UHF channel assignments in 1950-51, and at some stage thereafter it was adopted as a RETMA recommended standard.
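To put numbers on the image-rejection point (my own sketch; US channel 2's 55.25 MHz vision carrier is used as the example): with the oscillator on the high side, the image lies at RF + 2 × IF.

```python
# Image frequency for an oscillator-high receiver: image = RF + 2 * IF (MHz).
def image_freq(rf, intermediate):
    return rf + 2 * intermediate

ch2_vision = 55.25                    # US channel 2 vision carrier
old = image_freq(ch2_vision, 25.75)   # falls inside the 88-108 MHz FM band
new = image_freq(ch2_vision, 45.75)   # well clear of Band II
print(old, new)
```

With the old 25.75 MHz IF the channel 2 image sat at 106.75 MHz, inside the FM band; the move to 45.75 MHz pushed it to 146.75 MHz.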

The BREMA numbers for the UK 405-line system were as far as I know developed during 1954 in anticipation of the 1955 start of ITV Band III broadcasting. So as with the RETMA case, they were related to expansion of the channel frequencies used by an existing standard. Previously various lower numbers had been used for Band I-only receivers, such as 16.0 MHz vision, 19.5 MHz sound.

The CCIR IF numbers seem to date to somewhere in the 1953-54 period. Philips literature from 1953 associated with the original world series of TV valves gave an example receiver with IFs of 23.75 MHz vision, 18.25 MHz sound, so it seems unlikely to have been earlier than that. Whereas the US and UK standard IFs were promulgated by the respective receiver and allied equipment trade associations, RETMA and BREMA, as far as I can tell, although I am not completely sure, the European number of 38.9 MHz was established under CCIR auspices.

It would appear that the 38.9 MHz vision IF has been used worldwide wherever Systems B, C, F, G and H have been deployed, except that Australia had a different standard number. So one might say that it was a relatively robust choice when measured against the diversity of channelling systems actually used for Systems B/G/H, including the 8 MHz European UHF channelling which I doubt was in view back in 1953-54. In later years 38.9 MHz was also used to some extent for Systems D, I, K, K’ and L.

The BREMA and CCIR IFs were a little lower than those of RETMA, and this would appear to stem from the relative positions of the lowest-frequency channels used for broadcasting. In North America the lower edge of the low band started at 54 MHz, whereas in Europe Band I started at 41 MHz, and in the early 1950s there was the possibility that channel E1, 40 to 47 MHz, might have been used. Thus European IFs needed to be below 40 MHz.

And so on to (4), the UK BREMA 625-line IF. This was chosen in advance of the establishment of the UK UHF TV network, and was developed in conjunction with the channel assignment plan, which included receiver image rejection as a defined parameter. I think it reasonable to assume that some account would also have been taken of the needs of the Irish TV network, which also used System I on Band I and Band III channels, and the below-40 MHz positioning tends to support this. Anyway, here was a case where the optimum IF was chosen before transmissions started, not afterwards.

That the System I vision IF was 39.5 MHz as compared with the 38.9 MHz of Systems B/G/H is, on its face, simply the result of picking the best trade-off given the conditions prevailing. But then when South Africa adopted System I later in the 1970s, broadcasting in Band III and on UHF (European channel frequencies), it adopted standard IFs of 38.9 MHz vision, 32.9 MHz sound, the 38.9 MHz number aligning with that for Systems B/G/H. Assuming that 38.9 MHz was an appropriately considered choice, it certainly prompts a rethink of the basis for the UK choice of 39.5 MHz. Thus I wonder if it (the UK choice) was also influenced by the need for receivers to be dual-standard, which in turn would have made 38.9 MHz a less than ideal choice in terms of ease of receiver IF strip design, at least for those set makers pursuing maximum simplicity.

Consider that Mullard at least on occasion advocated dual-standard IF strips whose basic bandpass characteristic was 6 dB down at 34.65 and 39.5 MHz respectively, and so which included the Nyquist slopes for both the 405- and 625-line cases, additional system-specific shaping being provided by switchable traps, etc. This basic curve thus allowed a 625-line vision bandwidth of 4.85 MHz at -6 dB, which although short of the 5.5 MHz transmitted, was deemed to be adequate for most domestic receiver purposes. On the other hand, with a 38.9 MHz vision IF, the basic 6 dB bandwidth would have been 4.25 MHz, perhaps not enough for those setmakers who wanted 625 line horizontal definition to match that for 405 lines. Then the 405-line Nyquist slope would have had to be switchable. So it seems at least plausible that 38.9 MHz would have been found acceptable for 625-line reception alone, but that it was desirable to find another satisfactory number about half a MHz higher to more easily accommodate dual standard receivers, hence 39.5 MHz.
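The bandwidth arithmetic in that argument is easily checked (figures as quoted above; -6 dB points in MHz):

```python
# With the lower -6 dB point of the common dual-standard bandpass at 34.65 MHz,
# the available 625-line video bandwidth is simply (vision IF - 34.65).
LOW_EDGE = 34.65
for vision_if in (39.5, 38.9):
    print(vision_if, "MHz vision IF ->",
          round(vision_if - LOW_EDGE, 2), "MHz video bandwidth at -6 dB")
```

39.5 MHz gives the 4.85 MHz quoted, whereas 38.9 MHz would have given only 4.25 MHz.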

The relatively high IF used in Japan stems from the fact that the lower edge of the lowest TV channel, J1, was 90 MHz. Whether the 58.75 MHz number was used from the start of the Japanese TV service in 1953 I do not know, but a history wherein the American 45.75 MHz number was used initially, followed by an upward migration, is quite plausible. Japan appears to have been the only exception to the use of 45.75 MHz for System M, which number was also used for System N in Latin America. Also, Japan was the only example of a country not using Band I/low band for TV channels that took advantage of the opportunity that the non-use gave to move to a higher standard IF.

The Eastern European 38.0 MHz number seems unlikely to have been used from the start of 625-line television in the USSR, but was more likely a standard introduced in the first half of the 1950s after earlier use of lower IFs. It would appear that in later years, 38.9 MHz vision, 32.4 MHz sound was also used.

I am not sure about the 37.0 MHz number for China. It is from a single datapoint and it is not immediately apparent why it would be different to the Eastern European number. But whether 37.0 MHz or 38.0 MHz, it was probably used from the start of TV broadcasting in China in 1958.

The French System L IF had the vision carrier below the sound carrier, which would have required having the local oscillator on the low side of the signal, evidently not a problem with the UHF and Band III channels. When System L was extended to Band I, it was as System L’, with the channels configured with the vision carrier high, thus allowing use of the standard IFs with oscillator high.

I have not come across any information on IFs for French 819-line (System E) receivers, but there is evidence that French dual-standard receivers used the same 32.7 MHz vision IF for both systems, which would put the System E sound IF at 43.85 MHz. Whether the 32.7 + 43.85 MHz combination was used before the dual-standard era I do not know, but it seems possible, maybe probable, unless there was a complete rethink at the beginning of the dual-standard era.

Given the tête-bêche channelling system used in French VHF practice, oscillator low would have been required for the odd-numbered Band III channels F5 through F11, and oscillator high for the even-numbered Band I (F2, F4) and Band III (F6 through F12) channels. I am not sure which channels were tête and which were bêche, but as F8A was the first used, it is probably not unreasonable to assume that the even-numbered channels were tête. At one time Band I bêche channel F3 was listed, but not, I think, actually used.

If 32.7 + 43.85 MHz had been used from the inception of French 819-line broadcasts in 1950, then firstly it was an early start with a “high” IF, and secondly the sound IF was actually inside Band I, which seems odd at first glance. But then perhaps the original thinking was to use only Band III for the 819-line service, although 43.85 MHz was also within the Paris 441-line channel F1. As I understand it, the French did for a short while toy with the idea of having regular (405- or 441-line) and very high definition services side-by-side.

The F channel allocations bespeak a rethink, too. The original, used at Paris and Lille, was at the lower end of Band III, 185.25 MHz vision, 174.10 MHz sound. Then there was a change, possibly connected with a decision to go with 819 lines only, that introduced the tête-bêche system, inclusive of Band I channels and, in the Band III case, with allocations starting at 162 MHz, below the lower edge. Thus the original channel became an anomaly, and was numbered as F8A.
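The 43.85 MHz inference follows directly from the System E carrier spacing (a check of mine; 11.15 MHz is the System E vision-sound spacing):

```python
# System E (French 819-line) places the sound carrier 11.15 MHz from vision.
# If dual-standard receivers kept the 32.7 MHz vision IF for 819 lines,
# the sound IF follows by simple addition:
E_SPACING = 11.15
sound_if = 32.7 + E_SPACING
print(round(sound_if, 2))   # just inside Band I, as noted above
```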

Mention of the Lille transmitter, which allegedly was sited to give good coverage into Wallonia, brings in Belgian multistandard TV receiver practice. At least based upon information on some of the Philips models during the valve era, these were designed to receive Systems B, C and F on all E-channels, and System E on channel F8A (Lille) only. The standard CCIR IFs of 38.9 MHz vision, 33.4 MHz sound applied to Systems B, C and F. System E used 38.9 MHz vision, 27.75 MHz sound. Early examples had a second down-conversion for sound on all systems, to 7.0 MHz. Later, intercarrier sound was used for System B with a dual frequency (33.4 and 27.75 MHz) AM sound IF strip. Unknown is whether the 38.9 MHz vision IF number had been determined before the start of Belgian TV in late 1953.

In more recent times, multistandard receivers have used 38.9 + 32.4 MHz for System L, with 33.4 + 39.9 MHz for System L’, although that implies separate IF filters for L and L’.

The 32.7 MHz vision + 39.2 MHz sound IF combination was also nominal for the French Outre-Mer System K’, which used Band III channels only. But 38.9 + 32.4 MHz was likely also used. Notwithstanding the fact that Band I was not used, existing standard numbers were preferred to the somewhat higher numbers that would have been possible.

The same could be said of the South African case, which has already been mentioned in comparison with the UK case.

Australia seems to have been the only System B country to deviate from the 38.9 MHz standard. Possibly its choice of 36.875 MHz was connected with its use of Band II TV channels and the need to avoid interference with non-broadcast services. Australia was also the only System B country to use 7 MHz rather than 8 MHz UHF channelling.

Note that I have not attempted to add in second sound carrier IFs where used. In IF terms these lay where they fell as the result of existing vision carrier-to-oscillator relationships, and did not cause any fundamental changes.

Also, today's Arris cable modems use a 2nd IF of 44 MHz (2 MHz, 6 MHz and 8 MHz widths available); the 1st IF is 1.2 GHz!

Ex-VHS/TV PAL I NICAM IF filters are handy for projects, being narrower than the sound IF filters.

Ex-unknown: 94.6 MHz!

Ex-cable modem: 43 MHz

Ex-equipment filters I have from TV & VHS: 39.63 and 39.5 MHz (are some of these really 39.4 MHz video carrier?); 33.4 MHz (audio); 32.9 MHz (NICAM); various 6 MHz parts for intercarrier IF, video baseband trap, or quadrature detector for FM instead of a coil; and 6.5 MHz, presumably intercarrier IF for NICAM or else compatibility with PAL D/K audio? Can't remember what it's out of.

I have a 4.5 MHz one, but I'm sceptical it's for PAL M. Probably out of something strange.

I bought a couple of these chips for a project

But I forget what!

2.4 & 5.8 GHz video senders of course use 6.0 & 6.5 MHz subcarriers for the audio, so the FM video receiver's baseband output feeds old-style 6.0 & 6.5 MHz FM IFs, usually using a separate SIL FM radio IF/detector for each channel.

The TDA9820 is certainly an interesting IC, with PLL FM demodulation as well. As shown, the main channel covers the four standard intercarrier frequencies, namely 4.5 MHz (M, N); 5.5 MHz (B, G, H); 6.0 MHz (I) and 6.5 MHz (D, K, K’). And the secondary channel could be used for any of the three Zweiton second sound intercarrier frequencies, namely 4.72 MHz for M (Korea); 5.74 MHz for B/G/H (Europe & Australia) and 6.74 MHz for D/K (China). I am not sure if there was a companion dematrixing IC, and if so whether this covered the Korean M variant, which was a bit different in using M and S signals for stereo rather than M and R as in the German original.

Intercarrier sound is another topic in and of itself, worthy of a separate thread when time allows. Whilst much described in the literature in basic terms, its origins are not well recorded. Nevertheless it is clear that intercarrier sound was not on the horizon when NTSC (I) developed the US 525-line TV standard. This has prompted a modicum of searching for its origins. The history of the much more recent quasi-split sound technique should be better documented, but even that is somewhat fuzzy.

Where double conversion has been used, I should expect less standardization of IFs. In the 1990s in the US, Mitsubishi advertised its TV receivers as having double conversion. I never ascertained what those IFs were, but my guess was upconversion to somewhere around 1 GHz followed by downconversion to the standard 45.75 MHz or thereabouts.

Examples of second conversions on the sound side (excluding the intercarrier case) were the 7.0 MHz for AM and FM already mentioned for the Philips Belgium multistandard receivers, 6.52 MHz for AM and FM used by Murphy for some of its TV-FM receivers, and 10.7 MHz used by Sony, Luxman and possibly others when split sound processing (for System M) enjoyed a brief revival in the 1980s. In the last-mentioned case both Sony and Luxman used a VCO for the second conversion, steered by the FM discriminator output.

Synchrodyne wrote:... Australia was also the only System B country to use 7 MHz rather than 8 MHz UHF channelling ...

The UHF 'E' channels were planned with an 8MHz bandwidth so that they could accommodate any of the European 625-line systems whilst using the same vision carrier frequencies for all standards. No doubt this was done in the light of the experience gained in the VHF bands with system specific channels being used in different bordering countries.

The cross border interference problem does not occur in Australia so it made obvious sense to realise the bandwidth savings made possible by adopting system B only channelling.

In the days of coil-based IFTs a maker could really choose any IF (that didn't have strong broadcast signals on it), hence so many AM and FM radio versions even in the 1950s. Quality sets from the 1930s on had IF traps on the RF or aerial stages (my Philips LD480AB, 1956; Pye 39 JH/E, 1949).

But once SAW and ceramic filters became common, the initial charge for a custom frequency was VERY high, so makers chose whatever Murata (the original ceramic IF filter maker and still the biggest) and Toko offered, so the IFs used now are much more standardised, even across different equipment.

The latest development looks like direct conversion, but is actually "zero IF". If you convert (either at RF directly or after a 1st 1.2 GHz IF) with a quadrature L.O., you get two zero-IF signals, I & Q. If these are digitised with an ADC and DSP is then used as the IF, you can separate the mathematical "negative" image frequencies in the zero IF.

When DSP and ADCs were slow, triple conversion to 12.5 kHz was popular, but TVs, mobile phones, WiFi, Bluetooth and even FM & DAB radios now often use direct ADCs running up to 200 MHz, so the system is "dual conversion" with the 2nd IF at zero hertz.
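A minimal numerical sketch of the zero-IF idea (my own illustration, assuming NumPy is available): after complex I/Q mixing, the wanted signal and its would-be image land on opposite sides of 0 Hz, so DSP can tell them apart, where a single real mixer would fold them together.

```python
import numpy as np

fs, n = 1_024_000, 1024                    # sample rate and FFT size: 1 kHz bins
t = np.arange(n) / fs
f_lo = 200_000                             # quadrature local oscillator
wanted = np.cos(2 * np.pi * 205_000 * t)   # +5 kHz offset from the LO
image  = np.cos(2 * np.pi * 195_000 * t)   # -5 kHz offset: the "image"

# Complex (I & Q) downconversion to zero IF
bb = (wanted + image) * np.exp(-2j * np.pi * f_lo * t)
spec = np.abs(np.fft.fft(bb)) / n

# The wanted signal appears at +5 kHz and the image at -5 kHz: distinct
# complex-baseband bins, which a real (single-phase) mixer could not separate.
print(round(spec[5], 3), round(spec[-5], 3))
```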

Superb thread; the staff have discussed this and agreed it is worthy of being made a sticky.

Regards,
Chris

Hi Chris:

Thanks very much and that’s a nice surprise. I was actually a bit hesitant to post it initially, but then I figured that I was unlikely to be the only one who was curious about the what and the why of this topic.

Terrykc wrote:The UHF 'E' channels were planned with an 8MHz bandwidth so that they could accommodate any of the European 625-line systems whilst using the same vision carrier frequencies for all standards. No doubt this was done in the light of the experience gained in the VHF bands with system specific channels being used in different bordering countries.

The cross border interference problem does not occur in Australia so it made obvious sense to realise the bandwidth savings made possible by adopting system B only channelling.

Yes, the logic is inescapable, but the odd thing is that only Australia acted accordingly. Other countries, such as New Zealand, which could have done the same, instead chose to follow the European 8 MHz model.

Michael Watterson wrote:In the days of coil-based IFTs a maker could really choose any IF (that didn't have strong broadcast signals on it), hence so many AM and FM radio versions even in the 1950s. Quality sets from the 1930s on had IF traps on the RF or aerial stages (my Philips LD480AB, 1956; Pye 39 JH/E, 1949).

But once SAW and ceramic filters became common, the initial charge for a custom frequency was VERY high, so makers chose whatever Murata (the original ceramic IF filter maker and still the biggest) and Toko offered, so the IFs used now are much more standardised, even across different equipment.

Indeed I think that it was the free choice that was available back in the days of L-C IF selectivity that drove the need for consensus industry standards for TV receiver IFs in order to minimize interference problems, and allow rational planning of channel assignments, particularly at UHF, where images were in-band.

As you say, the arrival of the SAW filter in domestic receivers in the late 1970s effectively cemented the standard choices, and perhaps caused some minor changes. Even so, the SAW filter makers had to produce quite a diversity of variants. An early Plessey catalogue shows 8 variants, and this was before the advent of QSS (for which Plessey might have been the first to offer a dedicated SAW filter).

Digressing slightly, FM radio receivers seem to have standardized on the 10.7 MHz IF worldwide fairly quickly after the US moved to the 88 to 108 MHz band. Most UK receivers and tuners I think conformed, notable early exceptions being Bush (19.5 MHz, which was also an early UK TV receiver sound IF) and Leak (12.5 MHz, for which it offered a technical justification, although it moved to the standard 10.7 MHz with the Stereofetic, perhaps because of ceramic filter availability.)

Amateurs in various countries have just got 472 to 479 kHz, which is unfortunate as many AM IFs are 470 to 480 kHz. Previously some Administrations had licensed 501 to 504 kHz, which really is better. 490 kHz is a Maritime frequency.

Ideally an IF should not only be good for image rejection etc., but should also be an unused frequency. Hence 32 to 40 MHz in Europe (44 MHz in the USA), with 68 MHz, 420 MHz and 1.2 GHz as choices for higher IFs.

Whilst I have some additional comments about the FM receiver IF case, I wonder whether there should not be a separate thread for this topic.

Returning to TV IFs, and in particular the UK case, does anyone know if BREMA issued any documentation (standards, papers, etc.) in connection with its standard IF numbers for both the 405- and 625-line cases?

The Fewings & Fife paper noted below (1) includes several references that I have not seen:

“Local Oscillator Radiation and Aerial Voltage from Television Receivers”, B.R.E.M.A. Report (1954). (IF choice could well be a part of this.)

W. Holm and W. Werner, “Choice of an intermediate frequency for television receivers to suit the C.C.I.R standard,” Funk und Ton 8, 1954.

“Choice of Intermediate Frequencies for Domestic Television Receivers” (Union Europeenne de Radiodiffusion Tech 3062-E, April 1954).

The last two suggest 1954 was the year in which the European 38.9 MHz number was generally adopted, although the possibility exists that it had been used before then.

Philips introduced the PCC84 and PCF80 valves in 1953, and certainly the rationale for the PCF80 would have been to address the situation where the IF was just below the lowest Band I channel; a pentode mixer was much easier to manage here, and even though it was contra-indicated for Band III, that could be offset by the use of a high gain, low noise RF stage, namely the cascode. So Philips was at least anticipating the upward IF movement and could have been using a higher IF in its early receivers that included these valves.

UK manufacturer Cyldon was offering its TV.12 Teletuner for export in 1953; the earliest trace I have is an advertisement in Wireless World for August, 1953. This used the PCC84 and PCF80 valves, covered 50 to 220 MHz, and had an IF output in the range 40 to 47 MHz. One might infer that Cyldon expected IFs for any system to be around 40 MHz, was perhaps uncertain as to where the actual CCIR number might land, and had chosen the USA example as its best precedent. At the same time, for UK domestic use, it was offering its Band I-only TV.5 Teletuner, based around an EF80 and ECC81 combination, and with IFs in either the 9.5 to 14 or 15.5 to 22 MHz range.

The adoption of 45.75 MHz in the USA has been well recorded by Fink (2).

In terms of infilling IF information gaps, any examples of French practice in respect of 819-line only or 625 and 819-line dual-standard receivers would be useful.

In the dual-standard case, Carnt & Townsend (3) give us a clue that a common vision IF was used, thus: “Cheaper receivers use the same narrow-band IF for both standards, but the better models use a 9 MHz I.F. with a narrow band filter after the U.H.F. tuner”. That supports the notion that in the dual-standard era, the normal vision IF was 32.7 MHz, and that dual-frequency sound IF strips were used. That probably allowed the simplest receiver design, which evidently had been an objective. For example, positive vision modulation and AM sound (without pre-emphasis) were chosen for System L because that aligned it with the 819-line system parameters and so simplified receiver design. (Sometimes the System L choice, the inferior one by the conventional wisdom of the day, although that is a challengeable assertion, has been attributed to protectionism and/or chauvinism, but if so, why then was System K’, negative + FM, chosen for the Outre-Mer territories?)

As to the 32.7 MHz vision IF, it might have been a new number, developed after ab initio consideration of dual-standard receiver interference mechanisms, but then it might also simply have been carried over from 819-line single-standard receiver practice. Whatever the original reasons for the 819-line IF choice were, they still applied, as the VHF network was going to be around for many years to come. Thus carrying it forward into the dual-standard era, and developing UHF channel assignments around it, would have been logical.

Multistandard receivers I should think for the most part involved some compromises. The Philips Belgium examples previously mentioned support this idea. The 38.9 MHz vision IF (with sound IF on the low side) was “right” for Systems B, C and F, but a forced choice for System E. Possibly an interesting study would be the Barco CRM2631 solid-state monitor-receiver of the 1970s. There might have been more than one iteration of it, but as far as I know it predated the arrival of SAW filters for IF selectivity and so would have used LC filters, whether lumped (more likely with an IC-based IF amplifier) or distributed. Also, if it used quasi-synchronous vision demodulation, then that would lean toward a single, or at least a minimum number of, vision IFs. Whereas a single diode demodulator could be used at the end of an IF strip of variable bandpass shape and variable carrier frequency, a quasi-synchronous demodulator needs a tank circuit tuned on-carrier. Thus one wonders whether 38.9 MHz was used for all standards, even including System M. Barco seemed to be the pre-eminent producer of multistandard receivers back in the 1970s; presumably that is in part because four-standard receivers had been the norm in Belgium from the beginning of TV broadcasting there.

Cheers,

Steve

(1): D.J. Fewings and S.L. Fife; “A Survey of Tuner Designs for Multi-Channel Television Reception”; British IRE Journal, August 1955.

Firstly, Hawker & Pannett (1) list some of the commonly used IFs, including those for the British 405, British 625, CCIR 625 and US 525 line systems, the numbers for all of which agree with those previously quoted. For the Australian 625-line system though, the numbers are 36.0 MHz vision, 30.5 MHz sound. It appears as though these numbers moved upwards somewhat at a later date, perhaps when digitally synthesized tuning arrived.

That synthesized tuning sometimes required IF adjustments, at least in its early days, seems to be confirmed in a July, 1984 Wireless World article (2) that includes a block schematic for the BBC RC1/511 TV receiver, which shows IFs of 40.75 MHz vision, 34.75 MHz sound, “to suit synth.”

It includes, in section (2) on the page, a tabulation of the various analogue TV transmission standards, CCIR A through N, inclusive of intermediate frequencies for some but not all.

Most of the IFs quoted therein agree with other sources, including 32.7 MHz vision, 39.2 MHz sound for French System L. The interesting differences are:

System D: 34.25 MHz vision, 27.75 MHz sound.

System E: 28.05 MHz vision, 39.2 MHz sound.

Regarding System D, this points to the possibility that perhaps 34.25 MHz was used in, say, the 1950s and 1960s, with a later upward movement to 38.0 MHz. At least judging from the early Plessey SAWF literature, 38.0 MHz was established at the beginning of the SAWF era, or perhaps by it. And the SAWF era I think predated the general advent of synthesized tuning. So the change to 38.0 MHz would not likely have been driven by the advent of synthesized tuning.

Regarding System E, the 28.05 MHz vision, 39.2 MHz sound combination makes more sense than the 32.7 MHz vision, 43.85 MHz sound combination that I inferred from the Plessey SAWF literature, in that it does not overlap the bottom edge of Band I. It looks, then, as though the 39.2 MHz sound IF was retained for System L, which then determined the vision IF of 32.7 MHz. So one may speculate that perhaps the original idea for a French dual-standard vision IF strip might have been one where the upper end was determined by the 39.2 MHz sound rejection notch, and the lower end had alternative filters that put the -6 dB point on the Nyquist slope at either 28.05 MHz for 819 lines or 32.7 MHz for 625 lines. But that would have tended to force vision IF strip design to full bandwidth on 819 lines. On the other hand, some of the setmakers might have preferred a common vision IF of 32.7 MHz (with the Nyquist slope arranged as required for 625 lines), with a dual-frequency sound IF strip (a concept familiar from Belgian multistandard receiver practice) that could be switched for either 39.2 or 43.85 MHz. That would have allowed the flexibility to set the 819-line IF bandwidth as desired, worst case the same as for 625 lines.

I have in my mind circa 1977 for the general arrival of SAWFs in consumer TV receivers. I seem to recall that it was about a year or so after Mullard introduced its TDA2540/2541 vision IF ICs. That would have been gleaned from the pages of "Television" magazine.

SAW filters were first used, as far as I am aware, in Teletext receivers because of their excellent group delay characteristics.

The first set I know of was a Rank Arena (AC6333???) fitted with a TI-Fax decoder and I would place this no later than 1976.

The original Teletext Spec was published in 1974 but an intermediate version, which included all the new features, such as double height and background colour, followed on fairly quickly, with the full Broadcast Teletext Specification being released in 1976.

Once fully featured decoders based on the Philips chip set became available, the 'experimental' [1] TI-Fax decoder was dead in the water ...

EDIT Re-reading Steve's post I note he said "the general arrival of SAWFs in consumer TV receivers" (my emphasis) so, on that basis, he is probably correct ...!

[1] One of my jobs involved designing a Teletext page/error counter driven from the TI-Fax decoder as part of the tests to see how well our network handled Teletext. However, getting information out of Texas was problematic as they were reluctant to divulge any information whatsoever! Mostly they only confirmed what my investigations with an oscilloscope had revealed but when it came to adding anything external, they were very reluctant. The best I could get was that I could 'probably' get away with one LS TTL load but they wouldn't give any guarantees due to the experimental nature of the chips ...

Regarding Belgian multistandard receivers, it would appear that when UHF arrived in the early 1960s, some were configured to receive French System L UHF transmissions as well as Belgian System H UHF transmissions.

For System L, the same sound IF, 33.4 MHz, was used as for Systems B, C and F, which put the System L vision IF at 39.9 MHz. Thus the vision IF strip had switched traps, etc., to move the Nyquist slope between 38.9 and 39.9 MHz.

Overall then, the vision IF was 38.9 MHz for Systems B, C, E, F and H; and 39.9 MHz for System L. The sound IF was 33.4 MHz for Systems B, C, F, H and L; and 27.75 MHz for System E, along with a 5.5 MHz intercarrier for System B.

Actually, such receivers would also have covered System G transmissions (from Netherlands and Germany). One wonders whether there was switching of the Nyquist slope rate of change for System H, as compared with Systems B, C and F, to take full advantage of its characteristics, but I suspect not.

This arrangement predated System L’, which would have added some more complexity. I wonder if System L has had the largest number of IFs associated with it.

Meanwhile I have been unable to unearth any information on Barco multistandard receiver practice vis-a-vis IFs.

But I do have the schematic (1986 February) for the Sony Profeel VTX-100M multistandard TV tuner, which covered Systems B, D, G, H, I, K, L, and M. It is rather complex, and not all of the IFs are noted on the diagram, so some deductions are needed.

Starting with the AM sound subsystem: this uses a TDA4445F, which is stated to operate at 32.4 and 41.0 MHz, and which is fed from a point ahead of the SAW filter. Presumably 32.4 MHz is the sound IF for standard System L, which puts the vision IF at 38.9 MHz. Then 41.0 MHz is the sound IF for System L’, which in turn puts the vision IF at 34.5 MHz. The IF pathway ahead of the SAW filter includes a switchable 34.4 MHz trap, whose function would appear to be putting the Nyquist slope over 34.5 MHz for System L’. Assuming that the SAW filter puts the Nyquist slope over 38.9 MHz, then System L’ has a -6 dB video bandwidth of 4.4 MHz.
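Those deductions rest on the 6.5 MHz System L/L’ vision-sound spacing; a quick check of the arithmetic:

```python
# Systems L and L' both space sound 6.5 MHz from vision, so each stated
# sound IF implies a vision IF. For L' the channel is inverted, putting
# the sound IF above the vision IF.
SPACING = 6.5
vision_l  = 32.4 + SPACING   # standard System L vision IF
vision_lp = 41.0 - SPACING   # System L' vision IF
print(round(vision_l, 1), round(vision_lp, 1))
```

This reproduces the 38.9 and 34.5 MHz figures deduced above.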

Given the existence of the 34.4 MHz trap, one assumes that the SAW filter response is flat down to below this number. A turnover at or below 33.9 MHz, which corresponds to 5 MHz (Systems B, G and H) bandwidth, would be my best guess.

There is a single SAW filter and a single video IF IC, a TDA4429T. This appears to have diode-switched demodulator and AFC tank circuits, allowing it to operate at two video IFs. These would be 38.9 MHz for all except System L’, and 34.5 MHz for System L’, although they are not indicated on the schematic.

The SAW filter also provides a quasi-split sound output that feeds a TDA2546A IC, this providing the 5.5 MHz, 5.74 MHz (Zweiton) and 6.0 MHz intercarriers, and demodulating both the 5.5 and 6.0 MHz intercarriers through a diode-switched tank circuit. The 5.74 MHz intercarrier is demodulated in a TDA4940 IC that also does the ID, and the 6.5 MHz intercarrier is demodulated in a TBA129 IC. So the SAW filter QSS output is assumed to be the vision carrier at 38.9 MHz plus wideband sound carrier coverage extending from a little below 32.4 to a little above 33.4 MHz. 4.5 MHz intercarrier (for System M) is taken from the video demodulator (i.e. conventional intercarrier process) and dealt with in another TBA129.

So a summary of the apparent IFs (vision then sound) at work here is:

Systems B, G and H: 38.9 and 33.4 MHz (the prevailing standard for these systems)
Systems D and K: 38.9 and 32.4 MHz (not the original standard, but thought to be widely used)
System I: 38.9 and 32.9 MHz (not the UK standard, but the South African standard)
System L: 38.9 and 32.4 MHz (not the original standard, but thought to be reasonably widely used in multistandard receivers)
System L’: 34.5 and 41.0 MHz (possibly Sony’s own choice, perhaps to facilitate the use of just one SAWF, as European multistandard practice is thought to be 33.4 and 39.9 MHz.)
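One way to see why a single SAW filter and vision IF could serve all of these systems is that the intercarrier sound frequency is simply the difference between the two IFs, which must equal each system's vision-to-sound spacing regardless of where the IF channel is placed. A quick sketch using the pairs just listed (the values are from the summary above, not from the schematic itself):

```python
# (vision IF, sound IF) pairs as summarized above, in MHz.
PAIRS = {
    "B/G/H": (38.9, 33.4),
    "D/K":   (38.9, 32.4),
    "I":     (38.9, 32.9),
    "L":     (38.9, 32.4),
    "L'":    (34.5, 41.0),  # vision low for L', hence the reversed order
}

for name, (vif, sif) in PAIRS.items():
    # The intercarrier is the absolute difference, independent of the
    # absolute placement of the IF channel.
    print(name, round(abs(vif - sif), 2))
# B/G/H -> 5.5; D/K -> 6.5; I -> 6.0; L -> 6.5; L' -> 6.5
```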

An interesting facet of this tuner is that there are group delay correctors, apparently switched, in the post-demodulator video channel. So there was evidently some attempt to get the overall group delay characteristics matched to the various systems.

Re the previously mentioned EBU document:

“Choice of Intermediate Frequencies for Domestic Television Receivers” (Union Europeenne de Radiodiffusion Tech 3062-E, April 1954).

This appears not to be available at the EBU website (http://tech.ebu.ch/Jahia/site/tech/cach ... blications); at least I cannot find it if it is there. It occurs to me though that given the pan-European nature of the EBU, the said document might have addressed IF selection for all systems then used in Europe, not just the CCIR System B.

The Nomad appeared to be from the early 1970s or perhaps late 1960s, and covered Systems A, B, D, G, I, K, M and N, using mostly bipolar discrete circuitry. (One assumes that it also covered System H, although that is not mentioned.)

For System A, the standard IFs of 34.65 MHz vision, 38.15 MHz sound were used.

For all other systems, the sound IF was 33.5 MHz, that is, the UK System I standard number.

The circuit seemed to largely follow UK dual-standard practice, which thus determined the IF choices. That is, Systems A and I had standard IFs, and the others lay where they fell as a consequence. The vision IF channel had extra traps, switched to adjust the Nyquist slope and adjacent channel rejectors to the right place for each standard. The main sound IF channel was dual frequency, 38.15 MHz AM for System A and 6.0 MHz FM for System I, the latter with a ratio detector. The 4.5 MHz (Systems M & N), 5.5 MHz (Systems B & G) and 6.5 MHz (Systems D & K) intercarriers were handled in three separate single-frequency sound IF channels, each ending with a TAA350-based slope demodulator. AGC was mean-level for System A, sync tip for the others.

The data I have for the Nomad is dated January, 1973, which I suspect is reasonably late for a System A-capable receiver.

We may add another IF set to this list, namely 45.9 MHz vision, 40.4 MHz sound for System B in Italy, at least in the 1950s. For reasons I have never seen articulated, Italy’s VHF channel plan differed from that used in Western Europe for System B, and the lowest channel was IA, 53.75 MHz vision, 59.25 MHz sound, as compared with 48.25 MHz vision, 53.75 MHz sound for channel E2. And there was also the prospect that channel E1, 41.25 MHz vision, 46.75 MHz sound might come into use, so the standard IF had to be clear of this, as well. So in the Italian case, the IF could be pushed up somewhat.

This information comes from a 1954 Philips document that I actually downloaded some time back, but evidently did not inspect closely enough at the time, probably because it is in Dutch, although one can get the gist of it. It is about Philips’ then-current range of TV tuner units, of the turret type, and using the PCC84 and PCF80 valves. I suspect that it referred to the first iteration of tuners using these valves, which were released mid-1953.

For those variants with French channel capability (probably designed for use in Belgian receivers), the oscillator was high (normal) for channels F5 and F7, but low (reversed) for channels F8 and F8A.

Possibly the Italian IFs, 45.9/40.4 MHz, petered out somewhere along the way and Italian receivers reverted to the European standard 38.9/33.4 MHz. This could have been because setmakers moved to fitting pan-European tuners that had coverage down to channel E2. Or maybe the change came with the advent of UHF in the early 1960s, or with colour in the later 1960s, at which time setmakers may have wished to rationalize on variants. All speculation, though; nodal points are obvious places to look for what might be concomitant changes. Certainly 45.9/40.4 MHz appears not to have lasted through to the SAW filter age. Whether it was an actual or de facto Italian standard, perhaps used for Italian VHF transmitter network and channel allocation planning, is unknown. Perhaps it was a number set that Philips chose.

An earlier Philips turret tuner, described in Philips Book IIIC of 1953, was the AT7501, covering channels E1 through E10, and with VIF 23.75, SIF 18.25 MHz. It used an EF80 RF amplifier with an ECC81 mixer-oscillator, so it was likely a bit noisy on the Band III channels.

Anyway, the above does not contradict the notion that 1954, or perhaps a little before, was when the CCIR standard 38.9/33.4 MHz IF arrived. Given that Belgian TV started around September, 1953, one wonders whether Belgian receivers were able to use the 38.9/33.4 MHz IF and the PCC84 and PCF80 (or similar) valves from the start. It could have been a close-run thing. Those kinds of valves facilitated the use of IFs that were nudging Band I from underneath, whilst such IFs really required those valves for robust implementation.

Further searching has unearthed ITU document ITU-R BT.804, “Characteristics of TV receivers essential for frequency planning with PAL/SECAM/NTSC television systems”. This included a table of intermediate frequencies, attached.

Largely it confirms the numbers already recorded from other sources. But it is not comprehensive, for example omitting Australian and South African practices.

The three sets of IFs listed for System K’ are interesting. The third was the same as for System L, and required oscillator-low conversion, not a problem since as far as I know System K’ at VHF was used in Band III only, not Band I. The second used the System B sound IF of 33.4 MHz, with the vision IF thus one MHz higher than the CCIR number, at 39.9 MHz. The first looks to have been standalone, and one assumes was derived from ad hoc considerations based on conditions in one or other of the territories where System K’ was/is used.

The US case is referred to EIA Standard Recommendation 109-C. This has now become ANSI/CEA 109-D, “Intermediate Frequencies for Entertainment Receivers”. I do not have a copy, as it is not available as a free download; one has to pay. But I have attached an excerpt of the available preview.

The Japanese TV IF band is shown as being protected, and I imagine that it was covered by an EIAJ or JIS standard. It is also noted as being applicable to all-channel (VHF and UHF) receivers, about which more later in this posting.

The French System L IFs were covered by a SCART recommendation. It is noted that a double-transposition was used for Band I. This is puzzling. In the event, System L’ was used in Band I with “inverted” channels, vision carrier high and sound carrier low. This would have allowed single conversion to the standard IFs with oscillator high, whereas the Band III and UHF channels required oscillator low. Had non-inverted channels been used in Band I, then one may see that oscillator low operation would have been difficult, and that double conversion could have been necessary. As the use of VHF channels for System L came after the initial UHF use, perhaps there was some deliberation over the Band I issue before the landing on System L’, and the information in the ITU document was derived at an intermediate stage. Of course, the French were accustomed to mixing both oscillator high and oscillator low conversions from the 819-line days with its tête-bêche channelling.

Searching for more information on French System E (819-line) IFs by various means, including simply using “28.05 MHz”, brought up a couple of interesting patent documents that indirectly provide further information.

This one, http://www.google.com.au/patents/US4141042, from Sanyo in 1977, refers to an IC-based TV AFT system. Whilst it mainly addresses the Japanese TV IFs of 58.75 MHz vision and 54.25 MHz sound, it does mention the need for wide frequency coverage if the needs of other TV standards are to be met, and the French system vision IF of 28.05 MHz is mentioned. Of course, by 1977, System E was on the way out and System L was dominant in France, but I suspect that the Sanyo folks chose the System E number because it represented the lower edge of the range of TV IFs then in use. Nevertheless, its mention serves to support other sources that report the 28.05 MHz number. And its timing supports the idea that to some extent at least, French dual-standard receivers used 28.05/39.2 MHz for System E and 32.7/39.2 MHz for System L, thus with a common sound IF.

The other, http://www.google.com/patents/US3647950, from New Nippon Electric in 1968, refers to a variable TV IF trap device. The key point for this thread though is that it refers to the Japanese standard IFs of 26.75 MHz vision and 22.25 MHz sound. From that one may infer that Japanese receiver practice started in 1953 with a “low” standard IF, probably derived from the then-outgoing American practice, where similar numbers were used, such as 25.75 and 21.25 MHz. I suspect then that the move to a “high” IF could have been driven by the advent of significant use of the UHF channels, and the previously noted comment in the ITU document, which associates the Japanese “high” IF with all-channel receivers, appears to confirm this supposition. One may place the timing of the IF transition as early in the solid state era. ICs and dual-gate mosfets would have made it easier to deal with the 50+ MHz IF, which might have been difficult in valve days, particularly before frame-grid types became economic for consumer equipment. Incidentally, this paper came up on the “28.05 MHz” search simply because that number was used as an example of where the adjacent channel sound IF would move to under certain drift conditions.

Maybe it is time to put the information discovered to date into some kind of tabular form, and I’ll ponder that possibility.

Further searching, mostly finding “snippet” references through Google Books, etc., has confirmed that for Japan, 26.75 MHz (vision) was the norm until around 1970 when the change was made to 58.75 MHz.

Thus Japan appears to have been the last territory to abandon the “low” IF. In this context, “low” means an IF channel that is a long way below the lowest Band I (Low Band) channel, and generally with the whole channel well below 30 MHz. “High” IF means an IF channel that is placed just under the lowest Band I channel. Actually, in the Japanese case, the IF channel of 54 to 60 MHz was not particularly close to channel J1, 90 to 96 MHz. But it was nevertheless the highest of the standard and quasi-standard IFs for single-conversion.

The Japanese low IF (channel 22 to 28 MHz) was probably derived from US practice, where 21 to 27 MHz was common but not the only range used. The slight upward adjustment may have been made after consideration of its interaction with the Japanese channel frequencies.

In the Australian case, 36.00 MHz (vision) was the original standard in 1956. It would seem that this choice, rather than the CCIR standard 38.9 MHz, had something to do with the Australian VHF channel frequencies, which were somewhat different to those used elsewhere. The change to 36.875 MHz, which may have supplemented rather than replaced 36.00 MHz, came in the early 1970s, and might have been related to the advent of colour broadcasts.

The French System E IFs of 28.05 MHz (vision) and 39.2 MHz (sound) apparently date back to the advent of TV receivers with multi-channel tuning. With earlier single-channel receivers, various IFs were used.

Regarding French System L’ and the previously mentioned double transposition referred to in ITU-R BT.804, I have found some clarifying comments in Jackson & Townsend (1). After noting that French receivers use oscillator low for System L (Band III & UHF) and oscillator high for System L’ (Band I), there is the following commentary:

“In a multi-standard receiver, for use around the French borders, normal System B and G vision intermediate frequencies and vision-to-sound orientation can be used for the high-frequency channels, with the local oscillator operating on the high side of the incoming signal. For the Band I case, however, the picture and sound carriers are now reversed. This problem may be resolved in three ways:

- using an additional transposing up-convertor unit which re-creates an if signal with the vision and sound carriers reversed.
- providing a dual-purpose if processing channel suitable for vision carriers placed at the high side or low side of the response curve.
- avoiding the problem by limiting the receiver to use in areas not covered by this format”

The first option looks to be a little complex. Essentially, L’ Band I channels would undergo an oscillator-high conversion to what might be called a phantom L channel, and in turn that would be converted, oscillator low, to the desired IF.

The second option was what was used in the Sony VTX-100M previously described. Also, there were Philips TV tuners (front ends), shown in the 1990 Data Handbook DC03, namely the UV815, etc., which appeared to follow this approach. On all standards except L’, vision IF was 38.9 MHz with sound low. For L’, vision IF was 33.4 MHz and sound IF was 39.9 MHz.

At this stage it is worth reprising European multistandard practice in respect of receiver IFs.

In the Belgian case, VHF-only receivers covered four standards, namely B, C, E and F. For B, C and F, the CCIR standard IFs, 38.9 MHz (vision) and 33.4 MHz (sound) were used, with oscillator high. For E, the CCIR 38.9 MHz vision IF was also used, which then placed the sound at 27.75 MHz. Thus the receivers had to deal with two AM sound IFs. System E channel coverage was usually limited, and required oscillator low for the even-numbered channels and oscillator high for the odd-numbered channels.

When UHF transmission arrived, Systems G and H were accommodated with no change in IFs, but System L was a complication. It was apparently accommodated by using the same sound IF, 33.4 MHz, as for B, C and F, which then resulted in a vision IF of 39.9 MHz, with oscillator high. Whereas with VHF turret tuners, having oscillator low for some channels but high for others could be accommodated, that would have been awkward with the early continuously variable UHF tuners. One assumes that switching the vision IF bandpass to move the Nyquist slope -6 dB point from 38.9 to 39.9 MHz was less complicated than adding a third AM sound IF at 32.4 MHz, very close to 33.4 MHz.

By the time that System L’ Band I transmissions arrived, Systems E and F were gone, and that allowed a rethink. In fact some of the changes might have occurred once System E was no longer catered for in new receivers, and before the advent of L’. Also the disappearance of System C would have eased requirements (I need to check the respective dates here, though.)

Thus using 38.9 MHz (vision) for B, G, H and L made sense, with oscillator high for all. This put the sound at 33.4 MHz for B, G and H, and 32.4 MHz for L. Thus just a single-frequency AM sound channel was required. For as long as System C capability was required, System L vision probably stayed at 39.9 MHz to allow the sound to be at 33.4 MHz and so avoid the complication of two AM sound IFs very close together. But that said, there were some AM sound ICs that did not require tank circuits, and so were to some extent “frequency wild”.

For L’ (Band I), oscillator high was mandatory and this resulted in an IF channel with vision low. In the Philips UV815 tuner mentioned above, the IFs chosen for L’ seem to have been the same 39.9 and 33.4 MHz numbers used originally for L in Belgian receivers, but reversed.

Thus IFs used in Belgian multistandard practice, at least in the Philips case, can be traced back to the CCIR 38.9 + 33.4 MHz standard.

In the French case, as mentioned above, the 28.05 MHz (vision) and 39.20 MHz (sound) combination was established with the advent of multichannel receivers, which one assumes had turret tuners to accommodate the oscillator-high, oscillator-low alternation through the channel sequence, something that would have been difficult with the incremental inductance type. It is an open question as to whether 28.05 + 39.20 MHz for System E came into use before 38.90 + 27.75 MHz for the same system in Belgian receivers.

When System L arrived with UHF in 1964, the 39.20 MHz sound IF was retained, and the vision IF was therefore 32.70 MHz, with oscillator low for all UHF (and Band III) channels. With the System L’ reversed channels, the standard 32.7 + 39.2 MHz combination could be retained simply by using oscillator high for Band I.
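The neatness of that arrangement can be checked numerically. Under single conversion, the IFs are just the absolute differences between the local oscillator and the incoming carriers, so a reversed (L’) channel taken oscillator-high yields exactly the same IF pair as a normal L channel taken oscillator-low. A sketch (the carrier frequencies below are illustrative examples, not a statement of the actual French channel plan):

```python
def ifs(lo, f_vision, f_sound):
    """Single-conversion IFs: absolute differences between the local
    oscillator and the incoming carriers (all in MHz)."""
    return round(abs(lo - f_vision), 2), round(abs(lo - f_sound), 2)

# System L UHF channel (vision low, sound 6.5 MHz above; carriers illustrative),
# oscillator LOW by 32.7 MHz:
fv, fs = 471.25, 477.75
print(ifs(fv - 32.7, fv, fs))  # (32.7, 39.2)

# System L' Band I "inverted" channel (vision high, sound 6.5 MHz below),
# oscillator HIGH by 32.7 MHz:
fv, fs = 55.75, 49.25
print(ifs(fv + 32.7, fv, fs))  # (32.7, 39.2)
```

Either way, the standard 32.7 + 39.2 MHz pair results, which is the point made above.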

Thus one may make partial sense of the three sets of IFs used for the French Outre-Mer System K’. To the best of my knowledge, the originally defined VHF channels were in Band III only. In that case, the System L IF, 32.70 + 39.20 MHz, with oscillator low, was logical. In territories where Band I channels were later used, these were not reversed, so that oscillator high was necessary for all channels. The 39.90 + 33.40 MHz combination seems to have been lifted from Belgian multistandard VHF-UHF receiver practice in respect of System L. So two out of three correlate with established practice. The third, 40.20 + 33.70 MHz, seems to have been standalone, with no obvious correlation.

The previously mentioned Chinese vision IF of 37.0 MHz appears to be validated by the existence of a Philips VHF-UHF tuner, the UV411HKM, shown in the 1990 Data Handbook DC03. This is listed as covering the Chinese channels only. Other System D/K tuners covering both the Russian and Chinese channels had the expected 38.0 + 31.5 MHz IFs. From this one may suppose that the Chinese started, in 1957, with 37.0 + 30.5 MHz as best suiting their VHF channel pattern, but that at a later stage, there was some ad hoc migration to 38.0 + 31.5 MHz. Incidentally, the Chinese sound IF of 30.5 MHz is the same as the original (1956) Australian sound IF.

I now have the Wireless World December 1955 article on the BREMA TV IF recommendations. This does not refer to a specific BREMA document or publication, though. It includes a chart to show the possibilities and prohibitions, with the conclusion that a vision IF in the vicinity of 35.0 MHz was the best choice. The actual number chosen was 34.65 MHz, which put the fifth harmonic of the IF between channels 8 and 9.

Also interesting was an article by G.H. Russell in the April 1954 issue of WW making the case that with Band III TV imminent in the UK, it was appropriate to develop a standardized IF. The author came up with 35.25 MHz (vision), which was in general agreement with the later BREMA figure. The difference may be accounted for by the fact that the author considered that IF harmonics up to the 4th only needed to be taken account of. He did concede though that there was a body of opinion that the 5th harmonic could also be problematical, and indeed as it turned out BREMA was in this group.

So if nothing else, that is another little piece of System A history.

Russell also made the case for pan-European agreement, but then went on to note that some unilateral decisions had already been made, with the American and Italian examples quoted.

In the American case, the RMA number of 45.75 MHz is footnoted as having been recorded in the Electronics journal for November, 1950. So that gives us a definite date. I should infer that the REC 109-C revision of the pertinent RMA document followed this announcement. The previous REC 109-B issue is known to have covered FM and AM radio receiver IFs, but probably not TV IFs.

In Italy, there was a government decree that protected 40 to 47 MHz as the TV IF band. This was footnoted as being recorded in Gazzetta Ufficiale della Repubblica Italiana (Part 1), 1952 April 03. So the Italians were relatively early with a standard “high” IF. That an IF channel mostly within Band I could be used in Italy reflected the fact that the lowest Italian channel, IA, occupied 61 to 68 MHz, corresponding with E4. One wonders whether the choice of the 40 to 47 MHz IF channel was influenced by the American choice of 41 to 47 MHz.

Whether, within the 40 to 47 MHz channel, the actual IFs were defined is not recorded in the Russell article. One might expect 45.75 MHz vision, 40.25 MHz sound as the logical numbers following the usual System B channel pattern, albeit inverted. But the numbers used by Philips for its early Italian-channel turret tuner, namely 45.9 and 40.4 MHz, were still in-band.

That the Italian IF channel was government-decreed stands in contrast to the US and UK cases, where it was left to the appropriate trade associations to develop and make the recommendations. However, in both cases, the numbers so developed were then used by governmental organizations for channel assignment planning purposes. It may have been the same in France. At least the System L IFs were embodied in a SCART standard, and possibly that approach reached back to the System E IFs.

In both the American and Italian cases, the “high” IFs were to some extent pushing the technology. In 1950, a typical American TV tuner, with “low” IF output, would have used a 6AG5 pentode RF amplifier and a 6J6 double triode mixer-oscillator. The pentode RF amplifier was rather noisy at Band III, but was easier to use than quieter triodes, particularly in what was a mass-produced item of relative precision. RCA had seen the situation as being unsatisfactory back in 1948, and its development work culminated in the 6BQ7 cascode double triode RF amplifier announced early in 1951. This would have happened more-or-less independently of the move from “low” to “high” IF, which seems to have been an idea that came after 1948. I do not know the exact mixer history, but once the “high” IF was in sight, it must have become clear that the established 6J6 was not going to be an easy fit (looked at from a mass-production robustness viewpoint) with an IF that sat just below channel A2. Hence the pentode mixer, and RCA had its 6X8 triode-pentode available at about the same time as the 6BQ7 was released. But the pentode mixer could be used at Band III only if it were preceded by a high-gain, low-noise RF amplifier, so its deployment was essentially dependent upon the availability of the 6BQ7 and like valves.

In Europe, the corresponding valves, ECC84/PCC84 and ECF80/PCF80, did not arrive until mid-1953, which was around a year after the Italian TV IF decree. Perhaps Italian TV makers had access to American valves? Or possibly, they started with tuners using the typical EF80 + ECC81 combination; the greater separation between their standard IF and lowest channel as compared with the American case would have made that somewhat easier.

Now to circle back to the previously mentioned Cyldon Teletuner TV.12; as advertised in WW 1953 August, it was stated to have an IF output in the range 40 to 47 MHz. Given that it was, at the time, aimed at the export market, it could have been that Cyldon picked an IF range that covered what may have been the only two national standards that were known to be established at the time, namely those in the USA (41 to 47 MHz) and Italy (40 to 47 MHz). I do not know exactly when the TV.12 was released, but the TV.5 Band I-only model was announced and advertised in WW 1953 June, so it was likely after that. The TV.5 came late in the Band I-only era, and one suspects that it did not find too many takers.

The Russell article made no mention of any other national TV IF standards or recommendations. Whilst one cannot be sure that this was because none such existed when it was written (probably some months before it was published), it does seem likely that had any been established, they would have been mentioned. Still, pertinent information may not have been readily available in those days. Even today, in the “information age”, obtaining the information required for this thread has not been easy.

In the French case, the need for multichannel receivers probably arrived with the third transmitter, Strasbourg, channel F5, which started in October 1953 (according to WW 1953 December). Previously, Paris and Lille had both used what became channel F8A. The same WW item noted that the Lyons and Marseilles transmitters were scheduled to come on line in the second half of 1954. Conceivably the System E standard IFs (28.05 + 39.2 MHz) had been established by late 1953, in time for the Strasbourg transmitter start-up.

Talking of national standards, I have since found my copy of NZS 6551:1973, “Specification for Television Reception”. This stated that the CCIR IFs, 38.9 + 33.4 MHz, should be used in New Zealand as being the best compromise for the channels allocated in NZ. NZS 6551 was a revision (in anticipation of the arrival of colour) of the earlier NZS 1712:1963 (which I don’t have), which I should guess contained the same statement in respect of IFs.

Synchrodyne wrote:In Italy, there was a government decree that protected the 40 to 47 MHz as the TV IF band. This was footnoted as being recorded in Gazetta Ufficiale della Republica Italiana (Part 1), 1952 April 03. So the Italians were relatively early with a standard “high” IF. That an IF channel mostly within Band I could be used in Italy reflected the fact that the lowest Italian channel, IA, occupied 61 to 68 MHz, corresponding with E4. One wonders whether the choice of the 40 to 47 MHz IF channel was influenced by the American choice of 41 to 47 MHz.

Well I need to correct that error. In fact it was Italian channel IB that covered 61 to 68 MHz. Channel IA ran 52.5 to 59.5 MHz, with the vision carrier at 53.75 MHz. So the Italian IF channel was certainly nudging its lowest Band I channel from the underside.

I was planning to attach the WW 195412 article on the BREMA IF choice, but I cannot make the file small enough. The main chart is a double-page spread, and that might be the reason. But I could reduce the individual page scans for the WW 195404 Russell article, so that is attached.

(Irfanview would not get it down to the desired size, so I used Microsoft Office Picture Manager to get it down to around 250 kB, small enough for the old Microsoft Photo Manager to open, then used the latter to reduce it to a tad under 100 kB.)

Synchrodyne wrote:Maybe it is time to put the information discovered to date into some kind of tabular form, and I’ll ponder that possibility.

Attached is the first draft of the tabulation, with the IFs presented in ascending vision frequency order. When I have as much information as I think I am likely to find, a point which I suspect I am approaching, I’ll also include a second list by CCIR System.

Probably it is not very readable as attached; the primary document is a spreadsheet, from which I “printed” a .pdf, converted that into a .jpg, then reduced it.

Some very recently found information on late-era SAWFs has thrown up some more numbers, particularly in respect of multistandard receivers, but I’ll discuss those separately.