One-to-many communication seems to be a business blessed by the blindfolded goddess. Newspapers are one of the first and, until recently, the most successful of such businesses. Coveted, pampered, feared, lured, controlled or suppressed by public authorities, they are incredibly powerful tools to shape public opinion – if their content entices the public to purchase them in great numbers. Television, another one-to-many communication business, came to the fore much later, but is said to have taken over much of the role that used to be played by newspapers (and radio). It looks as if somebody buying a newspaper, listening to a radio program or watching television were to say: I am opening my mind to you, would you like to fill it with your ideas?
Adding pictures to newspapers has been a great tool to get people’s attention, but moving pictures are much more powerful. To understand the reasons for their success across so many layers of the population, it suffices to watch an infant intent on a television screen. What else is needed to prove success beyond the billion television sets in current use worldwide? This large number is all the more remarkable if one considers that terrestrial television broadcasting is a very complex system whose deployment requires huge investments. In addition to what it takes to generate the pictures, it requires the installation of transmitting towers, which must be placed in line of sight because they use comparatively high frequencies: VHF of about 100 MHz and UHF of a few hundred MHz. This is a technical nightmare in countries like Italy and Japan because of the mountainous nature of their territories.
The political environment has always dispensed loving care to television because of the obvious cultural, educational, entertainment, informational and political value of the medium. For a long time, many countries had a single television broadcasting agency, or a formally private company closely supervised, if not directly run, by the state. Many countries still require paying a viewing licence to receive “public” television programs. It is not that many years since some of the countries that used to have only state-run television agencies allowed the establishment of “commercial” television companies. In order to be allowed to broadcast, these other companies must obtain a licence from the state because they use a portion of a limited-availability asset – VHF/UHF bandwidth – but they do not get money from a licence fee. So these companies have to be creative to be profitable, typically by resorting to advertisements in their programs or, more recently, by offering pay TV.
The analogue television industry has been remarkably stable. The only significant innovation since the early years has been the introduction of colour in the 1950s and 1960s. Because TV is a full-fledged, or at least a kind of, “public service”, owners of monochrome TV sets could not be disenfranchised by the introduction of an incompatible television service replacing the old monochrome service. Colour had to be introduced in a compatible fashion, i.e. in such a way that an existing television receiver, capable only of receiving monochrome television, would still be able to receive a colour signal and display it as a monochrome signal. In the delivery domain, one can mention the use of CATV and of satellite broadcasting as examples of innovation, not to mention the many innovations that recent years have brought to consumers.
I said “the only significant innovation”, but I may have gone too far. In the late 1960s, Nippon Hoso Kyokai (NHK), the Japanese public broadcaster, started the development of a new generation of television system, called High Definition Television (HDTV). The system had approximately double the vertical and horizontal definition of Standard Definition Television (SDTV) and an aspect ratio (the ratio of the horizontal to the vertical dimension of the screen) of 16:9 instead of the standard 4:3 of SDTV. Therefore the bandwidth required by HDTV was roughly 5 times that of SDTV. NHK selected a frame frequency of exactly 30.00 Hz, interlaced, and 1125 scanning lines, 1035 of which are active (information-carrying) lines. No spectrum had yet been allocated for the transmission of such a broadband signal, but the system immediately caught the attention of broadcasters around the world.
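As a rough back-of-envelope check (my own illustration, not a figure from NHK’s design documents), the factor of five follows from doubling each spatial dimension and widening the picture from 4:3 to 16:9:

$$\frac{B_{\mathrm{HDTV}}}{B_{\mathrm{SDTV}}} \approx \underbrace{2}_{\text{vertical}} \times \underbrace{2}_{\text{horizontal}} \times \underbrace{\frac{16/9}{4/3}}_{\text{wider aspect ratio}} = \frac{16}{3} \approx 5.3$$

i.e. a bit more than five times the bandwidth of SDTV, before any compression.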
In the meantime, NHK engineers were working hard to develop a compression system called Multiple sub-Nyquist Sampling Encoding (MUSE), a system that used digital processing to compress the analogue HDTV signal, intended for analogue transmission, so that it would fit in the satellite bandwidth of an SDTV program. This piece of work was truly admired by the scientific world. Actually, only by a part of that world, because the digital purists – the majority – disliked the idea of doing the processing with digital techniques and the transmission with analogue techniques (as if all analogue delivery systems carrying bits did not do the same). In the early 1980s, the word of mouth in the business was that Japan and the USA would team up to conquer the entertainment world with HDTV, the former with their control of the technology and their manufacturing prowess, and the latter with their overwhelming capability to produce content suitable for this renewed television experience.
In those same years, the EBU had been working on an alternative project called Multiplexed Analogue Components (MAC). The project was inspired by the fact that, while the visual experience of NTSC and PAL/SECAM in the studio is similar, the spatial resolution of the latter two is significantly superior (by about 20%). If the resolution loss caused by interlace (the so-called Kell factor) could be compensated by, say, doubling the frame frequency, one would obtain a system that was virtually flicker-free, thereby almost reaching the effective vertical resolution of interlaced HDTV. The advantage of this approach was that the television format used in the display (625 lines @ 25 Hz) would be preserved, but the visual experience would be free from the typical PAL (and NTSC) artifacts, i.e. the mixing of colour and luminance information caused by the insertion of colour information in the holes of the luminance spectrum.
Like HDTV, which required the start of a new broadcasting service via satellite whose programs could not be received by existing television receivers, MAC required a special Set Top Box (STB) capable of decoding the signal. However, unlike HDTV, which required new and very expensive monitors, the output of a MAC STB could feed a conventional television monitor while preserving the purity of the signal, provided the set had a connector conforming to the Syndicat des Constructeurs d’Appareils Radio et Télévision (SCART) specification that supported component signals.
For years the ITU considered proposals for an international HDTV standard. The commendable idea was that – this time – the broadcasting industry should do away with national roads to television: there should be a single standard, replacing the plethora of national television standards, that would finally unify the broadcasting world. The watershed – not exactly in line with expectations – happened at the ITU General Assembly of 1986. On that occasion a coalition of European PTT administrations, supported by a European Commission flexing its muscles for the first time in the CCIR arena, stopped the agreement by proposing a “new version” of HDTV. The main features of the system were: 25 Hz frame frequency, 1250 scanning lines (twice the number of PAL/SECAM) and a transmission system called High Definition MAC (HD-MAC) that exploited the MAC multiplexing functionalities and was obviously incompatible with MUSE, even though it shared a number of technical principles with it. The major departure from MUSE, and a much-publicised feature impacting the user, was that a D2-MAC receiver (one of the many variants of the MAC family and the one the European broadcasting industry had eventually converged on) would be capable of decoding a D2-MAC signal from an HD-MAC signal. This solution would have allowed the continued deployment of D2-MAC, whose plans at that time were already quite advanced, at least in some countries, while the content and manufacturing industries, under the protective wings of the CEC, would gear up to provide the next step in television without jeopardising the already deployed population of receivers.
The work of developing the entire chain of equipment for the eventual introduction of the service was funded by Project 95 (EU 95) of the Eureka R&D program. About 1 billion USD went into that project, which was technically very successful because it developed a range of products going from HD studio cameras and recording equipment to transmission and receiving equipment. These were deployed and tested successfully in great numbers during the Winter Olympics of 1992, the first to receive full HD-MAC coverage.
Even more intense work was taking place in Japan. The advanced state of development of the technology allowed Japan to start a trial broadcasting service of 8 hours a day in 1989. During the decade that the trial service lasted, about half a million MUSE HDTV decoders were deployed.
The failure of the ITU to adopt the HDTV Recommendation in 1986 had also dealt a blow to the HDTV plans in the USA. Dubbed Advanced Television (ATV), the new project for the American path to HDTV was kicked off in the 1986-87 time frame, when the National Association of Broadcasters (NAB) asked the Federal Communications Commission (FCC) not to re-allocate the spectrum already assigned to broadcasters to cellular telephony. The FCC complied and created the Advisory Committee on Advanced Television Services (ACATS). At that time, the prevailing view was that 12 MHz of spectrum, i.e. two television channels, would be needed to deliver HDTV. One 6 MHz channel would be used for the NTSC signal as the “base layer” and another for an HDTV “augmentation” signal. In this way the magic “compatible” extension from TV to HDTV would be achieved. Proposals were requested to show that this was feasible, with the intention of selecting a system for the USA market. In a curious twist of history, the original American HDTV plan had started from the discontinuity advocated by the Japanese, only to end up with the evolutionary European approach of progression from TV to HDTV. But this was not going to be the end of the story.
For obvious reasons I had kept myself informed of what was brewing in CCIR for HDTV, and the failure of the ITU General Assembly to approve the HDTV Recommendation did not come as a surprise. In my years in CEPT, ETSI and CCITT committees I had plenty of opportunities to see what explosive combinations technology and politics could produce. But politics in CEPT and CCITT was incomparably less sophisticated than in CCIR because at that time telcos felt protected in their own local markets.
Out of reaction to the demonstrated lack of results caused by the inextricable mixing of politics and technology, I had begun to develop my own philosophy, prompted by my experience of too many smart people who had gone nowhere just because they had tried to manage both sides – political and technical – of the equation. So the recipe I developed was: be aware of and conversant with the political issues, but concentrate on the technical side. If the latter was successful, politics – with the innumerable nooses that its players would create for themselves – would eventually have to bow and accept the results of technology. But, to be successful, the technical environment had to involve individuals from all parts of the world – lest we have a repetition of the IVICO experience – and, even more importantly, from all technical communities working for the different industries.
The 1st HDTV Workshop, held at L’Aquila on 12-13 November 1986, just a few weeks after the ITU General Assembly, saw the participation of the major technical players in the HDTV space. A few months later a Steering Committee (SC) was set up that was, in the words of the “Guidelines for Steering Committee Activity” that would guide the organisation for 14 years,
made of people representing the three global regions and technical communities on an equal footing.
That was my first experience of trying to put together people from different countries and technical communities to achieve a common goal with an eventual business objective that should be “on the horizon”, but should never be allowed to get in the way of the different business objectives of the different people representing different industries – not to say companies. That experience taught me many lessons that I would use in the following years, apart from giving me the opportunity to get to know a number of new people whose friendship I have preserved over the years. The only big regret I have is the loss of Yuichi Ninomiya, the inventor of MUSE at NHK and an SC member since the early days, who suffered an untimely death.
The HDTV Workshop continued for 14 years, organising yearly conferences. Eventually, however, the launch of the ATV service in the USA, the trial HDTV service in Japan and the virtual neglect of anything “HD” in Europe after 1992 made the HDTV Workshop redundant. The last event was held in Geneva in 1999, but I had already left the workshop in 1994, at the peak of my MPEG-2 and DAVIC efforts, to which I now have to turn.